System Bits: April 25

Cheaper wafers; cystic fibrosis sensor; autonomous aerial vehicles.


Graphene used as copy machine for cheaper semiconductor wafers
MIT researchers noted that in 2016, annual global semiconductor sales reached their highest-ever point at $339 billion, while in that same year the industry spent about $7.2 billion worldwide on wafers. Now, a technique developed by MIT engineers may vastly reduce the overall cost of that wafer technology and enable devices made from more exotic, higher-performing semiconductor materials than conventional silicon, the team said.

The new method uses graphene — single-atom-thin sheets of graphite — as a sort of copy machine to transfer intricate crystalline patterns from an underlying semiconductor wafer to a top layer of identical material.

A nickel film peel-off from a silicon wafer demonstrates the concept of using a 2-D-material-based transfer process for wafers. (Source: MIT)

The researchers said they worked out carefully controlled procedures to place single sheets of graphene onto an expensive wafer. They then grew semiconducting material over the graphene layer. They found that graphene is thin enough to appear electrically invisible, allowing the top layer to see through the graphene to the underlying crystalline wafer, imprinting its patterns without being influenced by the graphene.

Also, graphene is rather slippery and does not tend to stick to other materials easily, which allowed them to simply peel the top semiconducting layer from the wafer after its structures had been imprinted.

Jeehwan Kim, the Class of 1947 Career Development Assistant Professor in the departments of Mechanical Engineering and Materials Science and Engineering, said that in conventional semiconductor manufacturing, the wafer, once its crystalline pattern is transferred, is so strongly bonded to the semiconductor that it is almost impossible to separate without damaging both layers. “You end up having to sacrifice the wafer — it becomes part of the device.”

Using the new technique, manufacturers could use graphene as an intermediate layer, allowing them to copy and paste the wafer, separate a copied film from the wafer, and reuse the wafer many times over. In addition to saving on the cost of wafers, this could open opportunities for exploring more exotic semiconductor materials.

“The industry has been stuck on silicon, and even though we’ve known about better performing semiconductors, we haven’t been able to use them, because of their cost. This gives the industry freedom in choosing semiconductor materials by performance and not cost,” Kim said.

Going forward, the researchers said they will design a reusable “mother wafer” with regions made from different exotic materials. Using graphene as an intermediary, they hope to create multifunctional, high-performance devices. They are also investigating mixing and matching various semiconductors and stacking them up as a multimaterial structure.
 

Wearable sweat sensor diagnoses cystic fibrosis
A wristband-type wearable sweat sensor, developed by researchers at the Stanford University School of Medicine in collaboration with the University of California, Berkeley, could transform diagnostics and drug evaluation for cystic fibrosis, diabetes and other diseases.

The sensor collects sweat, measures its molecular constituents and then electronically transmits the results for analysis and diagnosis. Unlike old-fashioned sweat collectors, the device does not require patients to sit still for a long time while sweat accumulates.

A wearable sensor that extracts sweat and analyzes its constituents could be a useful device for diagnosing and monitoring diseases. (Source: Stanford School of Medicine)

The two-part system of flexible sensors and microprocessors sticks to the skin, stimulates the sweat glands and then detects the presence of different molecules and ions based on their electrical signals. The more chloride in the sweat, for example, the more electrical voltage is generated at the sensor’s surface. The team used the wearable sweat sensor in separate studies to detect chloride ion levels — high levels are an indicator of cystic fibrosis — and to compare glucose levels in sweat to those in blood. High blood glucose levels can indicate diabetes.
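
The article does not give the sensor’s actual transfer function, but potentiometric ion-selective electrodes of this kind typically show a roughly Nernstian response of about 59 mV per tenfold change in ion concentration at room temperature. The Python sketch below, with invented calibration constants, only illustrates how a raw electrode voltage might be mapped to a chloride estimate and screened against the commonly cited diagnostic cutoff of 60 mmol/L; it is not the Stanford device’s algorithm.

# Illustrative only: the constants below are assumptions, not the Stanford
# device's calibration. A chloride-selective electrode is assumed to follow
# a roughly Nernstian response, with voltage rising as chloride rises
# (matching the relationship described in the article).
SLOPE_MV_PER_DECADE = 59.2   # ~RT/F at 25 C, in millivolts per decade
E0_MV = 220.0                # hypothetical calibration offset, in millivolts
CF_CUTOFF_MMOL_L = 60.0      # commonly cited sweat-chloride cutoff for CF

def chloride_from_voltage(measured_mv: float) -> float:
    """Estimate chloride concentration (mmol/L) from electrode potential (mV)."""
    decades = (measured_mv - E0_MV) / SLOPE_MV_PER_DECADE
    return 10 ** decades  # decades above a 1 mmol/L reference

def screen(measured_mv: float) -> str:
    conc = chloride_from_voltage(measured_mv)
    flag = "elevated (CF range)" if conc >= CF_CUTOFF_MMOL_L else "within normal range"
    return f"~{conc:.0f} mmol/L chloride, {flag}"

print(screen(331.0))  # example reading, purely illustrative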

Conventional methods for diagnosing cystic fibrosis — a genetic disease that causes mucus to build up in the lungs, pancreas and other organs — require that patients visit a specialized center and sit still while electrodes stimulate sweat glands in their skin to provide sweat for the test. The electrodes can be annoying, especially for kids, in whom CF is most often diagnosed. Then, children have to sit still for 30 minutes while an instrument attached to their skin collects sweat. Even then, the test isn’t over: families wait while a lab measures the chloride ions in the sweat to determine whether the child has cystic fibrosis. This cumbersome method hasn’t changed in 70 years.

By comparison, the wearable sweat sensor stimulates the skin to produce minute amounts of sweat, quickly evaluates the contents and beams the data by way of a cellphone to a server that can analyze the results. The test happens all at once and in real time, making it much easier for families to have kids evaluated.

Additionally, people living in underserved communities or in out-of-the-way villages in developing countries, where conventional testing is unavailable, could benefit from a portable, self-contained sweat sensor.

The sensor is not only for diagnosis and monitoring; it could also be used to help with drug development and drug personalization. CF is caused by any of hundreds of different mutations in the CF gene, and because CF drugs work on only a fraction of patients, the sensor could be used to determine which drugs work best for which mutations.

The team is now working on large-scale clinical studies to look for correlations between sweat-sensor readings and health.

Dogfighting with autonomous aerial vehicles
Aerial dogfighting began more than a century ago in the skies over Europe, with propeller-driven fighter aircraft carried aloft on wings of fabric and wood. Based on an event held recently in southern California, academic researchers believe we could be entering a new chapter in this form of aerial combat.

In what they believe might be the first aerial encounter of its kind, researchers from the Georgia Tech Research Institute and Naval Postgraduate School recently pitted two swarms of autonomous aircraft against one another over a military test facility.

Georgia Tech Research Institute researchers Evan Hammac (left) and Rick Presley prepare Zephyr aircraft for flight during a live demonstration involving teams from the Georgia Tech Research Institute and the Naval Postgraduate School. (Source: U.S. Navy photo by Javier Chagoya)

And while the friendly encounter may not have qualified as an old-fashioned dogfight, it provided the first example of a live engagement between two swarms of unmanned air vehicles (UAVs), and allowed the two teams to demonstrate different combat tactics in flight.

Don Davis, division chief of the Robotics and Autonomous Systems Branch of the Georgia Tech Research Institute, said, “The ability to engage a swarm of threat UAVs with another autonomous swarm is an area of critical research for defense applications. This experiment demonstrated the advances made in collaborative autonomy and the ability of a team of unmanned vehicles to execute complex missions. This encounter will serve to advance and inform future efforts in developing autonomous vehicle capabilities.”

The researchers reported that each team launched ten small propeller-driven Zephyr aircraft, though two of the aircraft experienced technical issues at launch and were unable to compete, resulting in a 10 versus 8 competition. Although the UAVs were physically identical, their computers used different autonomy logic, collaboration approaches, and communications software developed by the two institutions. GPS tracking allowed each aircraft to know the location of the others for this demonstration. In the future, this information will be provided by on-board cameras, radars, and other sensors and payloads.
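
The article does not describe either team’s collaboration logic, but a shared position picture of this kind is what makes simple swarm-level decisions possible. As a minimal, purely hypothetical Python sketch, a greedy assignment step might pair each friendly aircraft with the nearest unclaimed adversary based on the shared GPS positions:

import math
from typing import Dict, Tuple

# Positions keyed by aircraft ID: (latitude, longitude) in decimal degrees.
# In the demonstration this shared picture came from GPS; the names and the
# assignment rule below are illustrative only and are not the teams' logic.

def _flat_distance_m(a: Tuple[float, float], b: Tuple[float, float]) -> float:
    """Approximate distance in meters via an equirectangular projection,
    good enough over a small test range."""
    lat1, lon1 = map(math.radians, a)
    lat2, lon2 = map(math.radians, b)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return 6_371_000 * math.hypot(x, y)

def assign_targets(own_team: Dict[str, Tuple[float, float]],
                   opposing_team: Dict[str, Tuple[float, float]]) -> Dict[str, str]:
    """Greedy one-pass assignment: each friendly aircraft pursues the nearest
    unclaimed adversary; any leftovers double up on the closest one."""
    assignments = {}
    unclaimed = dict(opposing_team)
    for own_id, own_pos in own_team.items():
        pool = unclaimed if unclaimed else opposing_team
        target = min(pool, key=lambda t: _flat_distance_m(own_pos, pool[t]))
        assignments[own_id] = target
        unclaimed.pop(target, None)
    return assignments

blue = {"blue1": (35.79, -120.74), "blue2": (35.80, -120.75)}
red = {"red1": (35.795, -120.745), "red2": (35.81, -120.76), "red3": (35.785, -120.73)}
print(assign_targets(blue, red))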
 
Each aircraft used a single-board mission computer, and for this demonstration, an open-source autopilot maintained flight control. The aircraft also had Wi-Fi systems that allowed them to communicate with other aircraft and with a ground station.
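
The article does not specify the message format or network protocol carried over that Wi-Fi link. A minimal sketch of the idea, with a made-up JSON message and port number, might look like the following periodic position broadcast from one aircraft to its teammates and the ground station:

import json
import socket
import time

# Hypothetical message format and port; the teams' actual protocol is not
# described in the article.
BROADCAST_ADDR = ("255.255.255.255", 14550)

def broadcast_state(aircraft_id: str, lat: float, lon: float, alt_m: float) -> None:
    """Broadcast this aircraft's position over the local Wi-Fi network."""
    msg = json.dumps({
        "id": aircraft_id,
        "lat": lat,
        "lon": lon,
        "alt_m": alt_m,
        "t": time.time(),
    }).encode()
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(msg, BROADCAST_ADDR)

broadcast_state("blue1", 35.79, -120.74, 320.0)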

Both teams were trying to solve the same problem of flying a large swarm in a meaningful mission, and the researchers came up with solutions that were similar in some ways and different in others, according to Charles Pippin, a senior research scientist at the Georgia Tech Research Institute. “By comparing how well each approach worked in the air, we were able to compare strategies and tactics on platforms capable of the same flight dynamics,” Pippin said.

The event took place February 9, 2017, at Camp Roberts, a California National Guard facility in Monterey County, Calif.

As part of the development, the teams ran hardware-in-the-loop simulations in which the actual algorithms run on the hardware that will fly. The full software stack, including the autonomy logic, communications systems, collaboration algorithms and other software, is then inserted directly into the actual aircraft. In the final step, the tactics are flown on the aircraft on test ranges.

Further, the Georgia Tech researchers are using machine learning to help their autonomy system optimize performance and recognize under which circumstances a particular tactic may be advantageous. 
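
The article does not say which learning method Georgia Tech uses. As a minimal sketch of the general idea, a simple classifier (here a scikit-learn decision tree, with invented features, tactic labels and toy data) could be trained on past engagements to recommend a tactic for a new situation:

from sklearn.tree import DecisionTreeClassifier

# Illustrative only, not the GTRI system: learn, from past engagements,
# which tactic tended to work under which circumstances.
# Features per engagement: [own_aircraft, opposing_aircraft, closing_speed_mps]
X = [
    [10, 8, 40], [9, 10, 55], [6, 10, 30], [10, 4, 60],
    [8, 8, 45], [5, 9, 35], [10, 6, 50], [7, 10, 25],
]
# Tactic that performed best in each (hypothetical) engagement
y = ["envelop", "defend", "disperse", "pursue",
     "envelop", "disperse", "pursue", "defend"]

model = DecisionTreeClassifier(max_depth=3, random_state=0)
model.fit(X, y)

# Recommend a tactic for a new situation: 9 own aircraft vs. 10 adversaries,
# closing at 45 m/s.
print(model.predict([[9, 10, 45]])[0])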

“Right now, we’re more interested in the research questions about autonomous coordination among the vehicles and the tactical behavior of the groups of vehicles,” Pippin explained. “We are focusing our efforts on how these vehicles cooperate and want to understand what it means for them to operate as a team.”

Dogfighting tactics have advanced dramatically since World War I, but the advent of UAV swarms may bring a brand new set of challenges. Unmanned vehicles have freedom to dive, bank, and climb at rates human pilots cannot tolerate. But the real advantage may be in computing power that could track dozens of adversaries – far more than any human pilot could do – and develop new ways to address challenges.

“Autonomous techniques using machine learning may identify new tactics that a human would never think of,” added Davis. “Humans tend to base their techniques on tactics that manned fighters have used in the past. These autonomous aircraft may invoke new strategies.”

