Metal organic frameworks; piezoelectricity; training robots like puppets.
Better absorbing materials
University of Illinois bioengineers have taken a new look at an old tool to help characterize a class of materials called metal organic frameworks (MOFs), used to detect, purify and store gases. The team believes these could help solve some of the world’s most pressing energy, environmental and pharmaceutical challenges – and even pull water molecules straight from the air to provide relief from drought.
The research team, led by bioengineering professor Rohit Bhargava, reported it is using infrared chemical imaging to examine and optimize the structure of MOFs. They noted that although IR imaging has been around for more than a decade, it is greatly underutilized in materials analysis, and they found that with a few modifications to improve the speed of analysis, it is the perfect tool for this application.
MOFs are microscopic-scale porous crystals engineered from metal ions bound together by organic molecules called ligands. Although they are tiny, they have an immense absorptive ability. “The pores allow the MOFs to work like tiny sponges that can soak up chemicals such as pharmaceuticals and gases,” said Sanghamitra Deb, a postdoctoral researcher at the Beckman Institute for Advanced Science and Technology at the U. of I.
The precise structure and chemistry of MOFs greatly influence their functionality, so detailed characterization is essential in determining their best use. Traditional materials-analysis methods, such as high-powered electron microscopy and spectroscopy, do not pair chemical information with spatially resolved imaging, the researchers said, so they can provide only averaged chemical measurements. MOFs form by crystallizing out of a solution, and there is no way of fully controlling their structure or chemistry. This lack of control leaves a lot of room for defects to form, and the traditional characterization methods can show that a defect exists but cannot pinpoint its location. IR imaging allows the chemistry and the structure to be seen in one shot.
The team said this unique use of an older technique, paired with new instrumentation, allows it to quickly determine the quality and best application of specific MOFs in a nondestructive way – something no other group has been able to do.
Identifying blood biomarkers to improve early-stage detection, treatment of numerous diseases
Purdue University researchers have found a method of identifying biological markers in small amounts of blood that they believe could be used to detect a myriad of diseases, infections and different medical conditions at early stages.
Jeffrey Rhoads, a professor in Purdue’s School of Mechanical Engineering; George Chiu, a professor in Purdue’s School of Mechanical Engineering, School of Electrical and Computer Engineering, and Department of Psychological Sciences; and Eric Nauman, a professor in the School of Mechanical Engineering, Department of Basic Medical Sciences and the Weldon School of Biomedical Engineering, are part of a team of researchers that has created microelectromechanical resonators, or small vibrating sensors, that can detect these biomarkers using just a drop or two of blood. The plate-style resonant sensors allow sensitive, inexpensive detection of biomarkers that can signify disease, illness or trauma.
The goal is to find the disease so early that it can be treated without invasive surgery, the researchers said. The test looks for a particular protein related to a disease, so this could be used for the detection of many different diseases.
The sensors use a piezoelectrically actuated resonant microsystem that, when driven electrically, can sense a change in mass. The sensitivity of the resonator increases with its resonant frequency, making high-frequency resonators excellent candidates for biomarker detection.
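As an illustration of that mass-to-frequency relationship, here is a minimal sketch (not the Purdue team's actual model) using the standard small-perturbation result for a harmonic resonator, Δf/f0 ≈ −Δm/(2m); the resonator parameters and frequency shift below are hypothetical:

```python
def added_mass_from_shift(f0_hz, m_eff_kg, delta_f_hz):
    """Estimate added mass from a resonant-frequency shift.

    Uses the small-perturbation relation for a harmonic resonator:
        delta_f / f0 ≈ -delta_m / (2 * m_eff)
    so  delta_m ≈ -2 * m_eff * delta_f / f0.
    """
    return -2.0 * m_eff_kg * delta_f_hz / f0_hz

# Hypothetical resonator: 10 MHz resonance, 1 ng effective mass.
f0 = 10e6          # resonant frequency, Hz
m_eff = 1e-12      # effective mass, kg (1 ng)
shift = -1.0       # Hz downward shift after protein binding

dm = added_mass_from_shift(f0, m_eff, shift)
print(f"added mass ≈ {dm:.2e} kg")  # ≈ 2.00e-19 kg
```

Note how the relation captures the sensitivity claim in the text: for a fixed effective mass, the same bound protein mass produces a frequency shift proportional to f0, so higher-frequency resonators resolve smaller masses.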
This is possible because the team discovered a way to conduct the test that identifies a minute amount of protein in a very small amount of blood.
Detecting biomarkers is like trying to find a handful of needles in a large haystack, so the team devised a method that divides the large haystack into smaller haystacks: instead of a single sensor, it makes more sense to use an array of sensors and perform statistics-based detection.
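The "smaller haystacks" idea can be sketched as a simple voting rule over an array of sensors; the threshold, hit count, and readings below are hypothetical stand-ins, not the team's actual statistical method:

```python
def array_detect(readings, threshold, min_hits):
    """Declare a detection only if at least `min_hits` sensors in the
    array read above `threshold` - a voting scheme that suppresses
    single-sensor noise spikes while still catching a weak signal
    spread across many small sensors."""
    hits = sum(1 for r in readings if r > threshold)
    return hits >= min_hits

# Hypothetical array of 8 resonators reporting frequency shifts (Hz).
readings = [0.2, 1.4, 1.1, 0.1, 1.3, 1.2, 0.3, 1.5]
print(array_detect(readings, threshold=1.0, min_hits=4))  # True
```

A single noisy sensor crossing the threshold would not trigger a detection here, which is the advantage of splitting one large sensing area into an array.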
Training robots like puppets
Robots today must be programmed by writing computer code, but imagine donning a VR headset and virtually guiding a robot through a task, like you would move the arms of a puppet, and then letting the robot take it from there. That’s the vision of Pieter Abbeel, a professor of electrical engineering and computer science at the University of California, Berkeley, and his students, Peter Chen, Rocky Duan and Tianhao Zhang, who have launched a startup, Embodied Intelligence Inc., to use the latest techniques of deep reinforcement learning and artificial intelligence to make industrial robots easily teachable.
“Right now, if you want to set up a robot, you program that robot to do what you want it to do, which takes a lot of time and a lot of expertise,” said Abbeel, who is currently on leave to turn his vision into reality. “With our advances in machine learning, we can write a piece of software once — machine learning code that enables the robot to learn — and then when the robot needs to be equipped with a new skill, we simply provide new data.”
The “data” is training, much like you’d train a human worker, though with the added dimension of virtual reality. Using a VR headset without ever touching the robot, people can train a robot in a day, in contrast to the weeks to months typically required to write new computer code to reprogram a robot. The technique can work with robots currently in manufacturing plants and warehouses around the world.
“Commodity VR devices provide an easy way to control physical robots. Since the robot simply mimics the hand motion that’s tracked by VR, a person without any special training can make the robot do the right thing right from the beginning,” Chen said. “The robot will keep learning and after a while the robot says, ‘I got this, I can do this task on my own now.’ ”
In a paper posted online last month, Abbeel and his colleagues demonstrated the power of this type of imitation learning: Using a $1,000 VR headset and hand-tracking software, they trained a robot to coordinate its arms with its vision to learn new skills as complex as inserting a peg into a hole.
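Imitation learning of this kind, in its simplest form (behavioral cloning), fits a policy to the (observation, action) pairs a human demonstrates. The sketch below uses a linear least-squares policy on synthetic data purely for illustration; the Berkeley system uses deep neural networks trained on real VR demonstrations, and all names and dimensions here are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical demo data: 200 (observation, action) pairs recorded
# while a person teleoperated the robot through a VR headset.
obs = rng.normal(size=(200, 4))      # e.g. gripper and peg positions
true_W = rng.normal(size=(4, 2))     # unknown demonstrator mapping
actions = obs @ true_W               # demonstrated arm commands

# "Training" = regressing demonstrated actions on observations.
W, *_ = np.linalg.lstsq(obs, actions, rcond=None)

# The learned policy now produces actions for observations it has
# never seen, without anyone writing task-specific code.
new_obs = rng.normal(size=(1, 4))
predicted_action = new_obs @ W
print(np.allclose(W, true_W, atol=1e-6))  # True: recovers the mapping
```

The point of the sketch is the workflow Abbeel describes: the learning code is written once, and a new skill requires only new demonstration data, not new programming.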
“It completely changes the turnaround time because the amount of data you need is relatively small,” Abbeel said. “You might only need a day of demonstrations from humans to have enough data for a robot to acquire the skill.”
“When we perform a task, we do not solve complex differential equations in our head. Instead, through interactions with the physical world, we acquire rich intuitions about how to move our body, which would be otherwise impossible to represent using computer code,” Duan said. “This is much like AlphaGo, which does not use any of the hard-coded strategies common in traditional approaches, but acquires its own intuitions and strategies through machine learning.”
AlphaGo is a computer program developed by DeepMind, a subsidiary of Alphabet Inc., to play the ancient Chinese board game Go, which is considered more complicated for a computer than either checkers or chess. Using machine learning, AlphaGo earlier this year beat the world’s top-ranked Go player.
Abbeel, who is president and chief scientist of the startup, cofounded the company in September with three of his graduate students: Chen, now CEO; Duan, now CTO; and Zhang, now on the technical staff. Based in Emeryville, just south of Berkeley, it has already raised $7 million in seed funding.
Abbeel, Chen, Duan and Zhang have worked together for many years in the Berkeley AI Research lab. Abbeel, Chen and Duan also worked together at OpenAI, a non-profit company cofounded by Elon Musk, of Tesla and SpaceX fame, and dedicated to building safe AI.
The idea behind the company came from the team’s observation that rapid advances in deep reinforcement learning and deep imitation learning over the past five years are not reflected in the industrial robots in use today to assemble cars and appliances or move stuff around warehouses.
“This is an amazing capability that we just developed here at UC Berkeley, and we decided we should put this into the world and empower companies still using techniques that are many years behind what is currently possible,” Abbeel said. “This will democratize access to robotic automation.”