System Bits: July 10

Light waves; the “cyberoctopus”; better capacitors.


Light waves run on silicon-based chips
Researchers at the University of Sydney’s Nano Institute and the Singapore University of Technology and Design have collaborated on manipulating light waves on silicon-based microchips so that data remains coherent as it travels thousands of miles over fiber-optic cables.

Such waves—whether a tsunami or a photonic packet of information—are known as solitons. The Sydney-Singapore team has for the first time observed soliton dynamics on an ultra-silicon-rich nitride (USRN) device fabricated in Singapore, using state-of-the-art optical characterization tools at Sydney Nano.

This foundational work, published in Laser & Photonics Reviews, is important because most communications infrastructure still relies on silicon-based devices for propagation and reception of information. Manipulating solitons on-chip could potentially allow for the speed-up of photonic communications devices and infrastructure.

Ezgi Sahin, a Ph.D. student at SUTD, conducted the experiments with Dr. Andrea Blanco Redondo at the University of Sydney.

“The observation of complex soliton dynamics paves the way to a wide range of applications, beyond pulse compression, for on-chip optical signal processing,” Sahin said. “I’m happy to be a part of this great partnership between the two institutions with deep collaboration across theory, device fabrication and measurement.”


Photo credit: Singapore University of Technology and Design

Professor Ben Eggleton, co-author of the study and director of Sydney Nano, said, “This represents a major breakthrough for the field of soliton physics and is of fundamental technological importance.

“Solitons of this nature—so-called Bragg solitons—were first observed about 20 years ago in optical fibers but have not been reported on a chip because the standard silicon material upon which chips are based constrains the propagation. This demonstration, which is based on a slightly modified version of silicon that avoids these constraints, opens the field for an entirely new paradigm for manipulating light on a chip.”

Professor Dawn Tan, a co-author of the paper at SUTD, said, “We were able to convincingly demonstrate Bragg soliton formation and fission because of the unique Bragg grating design and the ultra-silicon-rich nitride material platform we used. This platform prevents loss of information which has compromised previous demonstrations.”

Solitons are pulses that propagate without changing shape and can survive collisions and interactions. They were first observed in a Scottish canal in 1834 and are familiar in the context of tsunami waves, which propagate thousands of kilometers without changing shape.

Optical soliton waves have been studied since the 1980s in optical fibers and offer enormous promise for optical communication systems because they allow data to be sent over long distances without distortion. Bragg solitons, which derive their properties from Bragg gratings (periodic structures etched into the silicon substrate), can be studied at the scale of chip technology, where they can be harnessed for advanced signal processing.
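
For readers who want the underlying mathematics, here is the textbook fiber-optics picture (a minimal sketch, not the team’s USRN device model). An optical soliton balances dispersion against the Kerr nonlinearity in the nonlinear Schrödinger equation,

    i \frac{\partial A}{\partial z} = \frac{\beta_2}{2} \frac{\partial^2 A}{\partial t^2} - \gamma |A|^2 A,

whose fundamental solution A(z,t) = \sqrt{P_0}\,\mathrm{sech}(t/T_0)\,e^{i \gamma P_0 z / 2} keeps its shape for all z when the peak power satisfies \gamma P_0 = |\beta_2| / T_0^2 (anomalous dispersion, \beta_2 < 0). A grating with period \Lambda and effective index n_{\mathrm{eff}} reflects strongly near the Bragg wavelength \lambda_B = 2 n_{\mathrm{eff}} \Lambda; Bragg solitons form near this band edge, where the grating itself supplies the strong dispersion.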

They are called Bragg solitons after the Australian-born Lawrence Bragg and his father William Henry Bragg, who first discussed the concept of Bragg reflection in 1913 and went on to share the 1915 Nobel Prize in Physics. They remain the only father-and-son pair to have shared a Nobel Prize.

Researchers take the octopus as an AI, robotics model
Assistant Professor Girish Chowdhary at the University of Illinois’ Grainger College of Engineering is leading a project funded by the Office of Naval Research to advance the fields of artificial intelligence, control, and robotics by learning from the brains and bodies of octopuses and other cephalopods.

Also on the team are ECE ILLINOIS Associate Professor Prashant Mehta, along with Mechanical Science and Engineering’s Mattia Gazzola and Rhanor Gillette (Molecular and Cellular Biology) from Illinois, William Gilly (Biology) and Ivan Soltesz (Neurosurgery) from Stanford University, and John Rogers (Biomedical Engineering), a former Illinois professor who is now at Northwestern University. The $7.5 million Multidisciplinary University Research Initiative (MURI) award is for building a cyberoctopus, a software equivalent of the marine animal that will help the team understand and leverage its ability to conduct distributed inference and decision-making, its embodied control and intelligence, and its ability to learn new behavior quickly.

“We believe that bringing autonomy to the next level will require us to learn from animals,” said Chowdhary, assistant professor of agricultural and biological engineering and the lead Principal Investigator of the project. “The octopus is a species with just the right mix of complexity. We believe we can learn a lot from seeing how the octopus learns, evolves, and adapts.”

First, the group will study the neurodynamics of an octopus. In humans and many other animals, a centralized brain makes the majority of the decisions. In an octopus, most of the “brain” is distributed along the eight arms rather than centralized, meaning each appendage can act independently but also in coordination with the others. Controlling a distributed system like this is a great challenge for modern AI.

“If you look at the neural networks in the brain of an octopus, each neuron is a dynamic system and not a static nonlinearity like some of the artificial neural networks,” said Mehta, also a professor in MechSE. “It will be exciting to understand the role played by neurodynamics in inference, control and learning functions of the octopus brain.”
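
To make that distinction concrete, here is a minimal sketch in Python (an illustration, not the team’s model): a standard artificial neuron applies a static nonlinearity to its input, while a dynamic neuron carries internal state that evolves over time. A leaky integrator is the simplest example of the latter.

    import numpy as np

    def static_neuron(x, w):
        # Standard ANN unit: output is an instantaneous function of the input.
        return np.tanh(w * x)

    class DynamicNeuron:
        # Leaky-integrator unit: a tiny dynamical system rather than a static map.
        # dv/dt = (-v + w*x) / tau, so the state v carries a memory of past inputs.
        def __init__(self, w, tau=10.0):
            self.w, self.tau, self.v = w, tau, 0.0

        def step(self, x, dt=1.0):
            self.v += dt * (-self.v + self.w * x) / self.tau
            return np.tanh(self.v)

    # The same brief input pulse produces different responses: the static unit
    # forgets it instantly, while the dynamic unit relaxes back over ~tau steps.
    pulse = [1.0] * 5 + [0.0] * 20
    neuron = DynamicNeuron(w=2.0)
    for t, x in enumerate(pulse):
        print(t, round(float(static_neuron(x, 2.0)), 3), round(float(neuron.step(x)), 3))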

Second, the team will apply principles of embodied control and intelligence to the cyberoctopus, leaning on the biology expertise of world-renowned octopus experts William Kier (Biology, University of North Carolina at Chapel Hill), Gillette, and Gilly.

Gillette has a long track record of studying octopuses and other cephalopods and has observed them exhibiting a high level of intelligence. Octopuses can open jars from the inside, solve puzzles, and use other animals’ shells as their own. This level of intellect could be beneficial to machines.

“Learning how the octopus intelligence emerges from the seamless interplay between its neural and mechanical, distributed infrastructure is one of the keys to this research,” said Gazzola, also a member of the National Center for Supercomputing Applications.

Gazzola will be working closely with Soltesz to build realistic simulations of the cyberoctopus. The simulations will integrate realistic models of soft mechanics with detailed models of the neural architecture uncovered by the biologists in the team.

How octopuses and squids learn new behavior is the third focus area. Gilly’s research has shown that during the first several weeks of their lives, juvenile squids must learn how to hunt small, planktonic crustaceans to survive. If this skill isn’t developed in this time period, they cannot survive on their own.

“This acquisition of behavior that is not preprogrammed, but has to be learned, is fundamental to how animals learn to be intelligent,” said Chowdhary, who also holds appointments in the departments of Computer Science and Aerospace Engineering. “That’s good because that hints at far more efficient ways of doing reinforcement learning and adaptive control.”

To try to understand the inner workings of an octopus, Rogers is helping build sensor patches that can be embedded inside a real octopus to record neural activity. Rogers has previously implanted these patches in mice and other animals, but this is the first time anyone has tried to put such patches inside an octopus.

“We did this because we think there needs to be a new stream of thought in AI. We seem to have gotten a lot out of the current deep learning thinking; there’s a lot of merit to that, but how do we go beyond its limitations?” Chowdhary explained. “Research gets done when people get out of the box.”

Chowdhary and Mehta are affiliated with the Coordinated Science Laboratory (CSL). Chowdhary is also affiliated with the Beckman Institute.

Capacitors benefit from machine learning tech
Capacitors, given their high energy output and recharging speed, could play a major role in powering the machines of the future, from electric cars to cell phones. However, the biggest hurdle for capacitors as energy storage devices is that they store much less energy than a similar-sized battery.

Researchers at the Georgia Institute of Technology are tackling that problem in a novel way by using supercomputers and machine learning techniques to ultimately find ways to build more capable capacitors. The method was described in npj Computational Materials, a Nature Partner Journal, in February 2019. The study involved teaching a computer to analyze, at the atomic level, two materials used to make some capacitors: aluminum and polyethylene.

The researchers focused on finding a way to more quickly analyze the electronic structure of the capacitor materials, looking for features that could affect performance. “The electronics industry wants to know the electronic properties and structure of all of the materials they use to produce devices, including capacitors,” said Rampi Ramprasad, a professor in Georgia Tech’s School of Materials Science and Engineering.

For example, polyethylene is a very good insulator with a large band gap, the energy range forbidden to electrical charge carriers. But if it has a defect, unwanted charge carriers are allowed into the band gap, reducing efficiency, he said.

“In order to understand where the defects are and what role they play, we need to compute the entire atomic structure, something that so far has been extremely difficult,” said Ramprasad. “The current method of analyzing those materials using quantum mechanics is so slow that it limits how much analysis can be performed at any given time.”

Ramprasad and his colleagues have used machine learning to help develop new materials. Here, they used a sample of data created from a quantum mechanical analysis of aluminum and polyethylene as input to teach a powerful computer how to simulate that analysis.
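
In spirit, the approach looks like the following minimal sketch (with synthetic stand-in data and a generic kernel ridge regressor; the authors’ actual fingerprints, targets, and model are not reproduced here): fit a cheap surrogate to a modest number of expensive reference calculations, then evaluate new structures with the surrogate at negligible cost.

    import numpy as np
    from sklearn.kernel_ridge import KernelRidge

    rng = np.random.default_rng(0)

    def expensive_calculation(fingerprint):
        # Stand-in for a costly quantum mechanical calculation.
        return np.sin(fingerprint @ (np.arange(1, fingerprint.size + 1) * 0.1))

    # Training set: a modest number of costly reference calculations.
    X_train = rng.uniform(-1, 1, size=(200, 8))  # 8-D "fingerprint" vectors
    y_train = np.array([expensive_calculation(x) for x in X_train])

    # Kernel ridge regression learns the fingerprint -> property mapping.
    model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=1.0)
    model.fit(X_train, y_train)

    # New structures are then screened by the surrogate almost for free.
    X_new = rng.uniform(-1, 1, size=(5, 8))
    print(model.predict(X_new))                       # fast surrogate
    print([expensive_calculation(x) for x in X_new])  # slow reference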

Analyzing the electronic structure of a material with quantum mechanics involves solving the Kohn-Sham equation of density functional theory, which generates data on wave functions and energy levels. That data is then used to compute the total potential energy of the system and atomic forces.
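
In standard notation, the Kohn-Sham equations take the form

    \left[ -\frac{\hbar^2}{2m} \nabla^2 + v_{\mathrm{eff}}(\mathbf{r}) \right] \psi_i(\mathbf{r}) = \varepsilon_i \, \psi_i(\mathbf{r}),

where the effective potential collects the external, Hartree, and exchange-correlation terms; the \psi_i are the wave functions and the \varepsilon_i the energy levels mentioned above. Solving these equations self-consistently for every electron in a large structure is what makes the conventional analysis so slow.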

The researchers used the Comet supercomputer at the San Diego Supercomputer Center, an Organized Research Unit of the University of California San Diego, for the early calculations, and the Stampede2 supercomputer at the Texas Advanced Computing Center at the University of Texas at Austin for the later stages of the research. Both systems are funded by the National Science Foundation under multi-year awards.

“In the work leading up to the study, we used Comet extensively for high-throughput polymer electronic property calculations, such as the effect of polymer morphology on the band gap of polymers,” said study co-author Deepak Kamal, a graduate student advised by Ramprasad at the Georgia Tech School of Materials Science and Engineering. “We used Comet because it was fast and efficient at handling a large number of calculations.”

The new machine learning method developed by Ramprasad and colleagues produced results comparable to those of the conventional quantum mechanics technique, but several orders of magnitude faster.

“This unprecedented speed-up in computational capability will allow us to design electronic materials that are superior to what is currently out there,” Ramprasad said. “Basically, we can say, ‘here are defects with this material that will really diminish the efficiency of its electronic structure.’ Once we can address such aspects efficiently, we can better design electronic devices.”

While the study focused on aluminum and polyethylene, machine learning could be used to analyze the electronic structure of a wider range of materials. Beyond analyzing electronic structure, other aspects of material structure now analyzed by quantum mechanics could also be hastened by the machine learning approach, Ramprasad said.

“In part we selected aluminum and polyethylene because they are components of a capacitor,” he explained. “But we also demonstrated that one can use this method for vastly different materials, such as metals that are conductors and polymers that are insulators.”

The faster processing allowed by the machine learning method would also enable researchers to more quickly simulate how modifications to a material will impact its electronic structure, potentially revealing new ways to improve its efficiency.

Added Kamal: “Supercomputing systems allow high-throughput computing which enables us to create vast databases of knowledge about various material systems. This knowledge can then be utilized to find the best material for a specific application.”

Authors of the study, titled “Solving the Electronic Structure Problem with Machine Learning,” include Anand Chandrasekaran, Rohit Batra, Chiho Kim, and Lihua Chen, in addition to Ramprasad and Kamal. All are from Georgia Tech’s School of Materials Science and Engineering. The research was also supported by the Office of Naval Research.


