Controlling graphene; big data thinking; a quantum approach to big data.
Precisely controlling graphene molecules
Gardeners sometimes lay down sheets of plastic with strategically placed holes, letting plants grow through while keeping weeds from taking root. Researchers at UCLA’s California NanoSystems Institute have found that the same basic approach can be used to place molecules in precise patterns within tiny nanoelectronic devices, which could be useful in creating sensors small enough to record brain signals.
Led by Paul Weiss, a distinguished professor of chemistry and biochemistry, the UCLA researchers developed a sheet of graphene material with minuscule holes in it that they could then place on a gold substrate, a substance well suited for these devices, they said. The holes allow molecules to attach to the gold exactly where the scientists want them, creating patterns that control the physical shape and electronic properties of devices that are 10,000 times smaller than the width of a human hair.
The team wanted a mask that would act as a stencil, placing molecules only where they wanted them on the underlying gold substrate. They already knew how to attach molecules to gold as a first step toward making the patterns needed for the electronic function of nanodevices; the new step was preventing patterning on the gold wherever the graphene sat. Exact placement of molecules gives the team exact patterning, which they said is key to their goal of building nanoelectronic devices like biosensors.
Thinking big about big data
To Michael Jordan, the Pehong Chen Distinguished Professor in Electrical Engineering and Computer Science as well as Statistics at UC Berkeley, the smart way to extract and analyze key information embedded in mountains of “Big Data” is to ignore most of it. Zero in on collections of small amounts of data instead.
“You may be swamped with data, but you’re not swamped with relevant data. Even in a huge database, there is often only a small amount of it that is relevant,” he said. In a relatively simple example, he explained, “To choose a restaurant in a city I’m visiting, I don’t need to know about all the restaurants that are out there and everyone’s experiences with them. I only need to know about the restaurants that I’m likely to be interested in and reviews by people who share my tastes. The beauty of ‘Big Data’ is that the chance of finding a relevant subset of data for any particular person is high. The challenge is finding that relevant subset.”
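Jordan's restaurant example can be sketched in a few lines of code. The sketch below is hypothetical and not from the AMP Lab: the ratings, the agreement-based similarity measure, and the threshold are all made up for illustration, standing in for the far more sophisticated methods his group studies.

```python
# A minimal sketch of the "find the relevant subset" idea, using invented
# restaurant ratings. The similarity measure and threshold are hypothetical.

def similarity(a, b):
    """Fraction of commonly rated restaurants on which two users agree."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    agree = sum(1 for r in shared if a[r] == b[r])
    return agree / len(shared)

def relevant_reviews(me, everyone, threshold=0.5):
    """Ignore most of the data: keep only reviewers whose tastes match mine."""
    return {u: ratings for u, ratings in everyone.items()
            if u != "me" and similarity(me, ratings) >= threshold}

ratings = {
    "me":    {"Taqueria": 5, "Bistro": 2},
    "alice": {"Taqueria": 5, "Bistro": 2, "Noodle Bar": 4},  # shares my tastes
    "bob":   {"Taqueria": 1, "Bistro": 5, "Noodle Bar": 1},  # opposite tastes
}

subset = relevant_reviews(ratings["me"], ratings)
print(sorted(subset))  # → ['alice']
```

Only the like-minded reviewer survives the filter; the rest of the "huge database" is discarded before any recommendation is computed.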
Jordan has therefore been working on ideas that span computer science and statistics, two fields that have historically proceeded along separate paths. He sees them as natural allies that Big Data has drawn together, and bringing the two perspectives to bear on the same problems is a win-win.
For the past few years, Jordan has been developing this and other strategies in UC Berkeley’s Algorithms, Machines and People (AMP) Lab. He’s a founder and co-director of the three-year-old lab, a small, highly collaborative team of a half dozen computer and statistical scientists and their graduate students. The team has selected and tackled a few big challenges that had blocked access to efficient, accurate analysis of data in the real world. AMP Lab software platforms have already been adopted by hundreds of companies as well as researchers — from neuroscientists to Netflix.
A quantum big data approach
To tackle big data problems beyond the reach of even modern supercomputers, researchers at MIT, the University of Waterloo, and the University of Southern California have developed a quantum computing approach.
They explained that algebraic topology is key to the new method; it helps reduce the impact of the inevitable distortions that arise whenever someone collects data about the real world. In a topological description, basic features of the data (How many holes does it have? How are the different parts connected?) are considered the same no matter how much they are stretched, compressed, or distorted. It is often these fundamental topological attributes that matter when trying to reconstruct the underlying real-world patterns that the data are supposed to represent.
Interestingly, it doesn’t matter what kind of dataset is being analyzed. The topological search for connections and holes works whether the hole is a literal physical one or a gap in a logical argument that the data represent; the method finds both kinds.
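One such topological feature, the number of connected components of a point cloud, can be illustrated with a toy computation. The points and the linking scale below are made up; real topological data analysis (persistent homology) tracks features like this, and higher-dimensional holes, across every scale at once, which is the computationally expensive step the quantum approach targets.

```python
# Toy illustration: count connected components (the zeroth Betti number)
# of a point cloud by linking points closer than a chosen scale.
from math import dist

def connected_components(points, scale):
    """Union-find: merge points within `scale` of each other, count groups."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if dist(points[i], points[j]) <= scale:
                parent[find(i)] = find(j)

    return len({find(i) for i in range(len(points))})

# Two clusters: stretching or squeezing either cluster leaves the count at 2.
cloud = [(0, 0), (0.5, 0.2), (0.3, 0.6),   # cluster A
         (5, 5), (5.4, 5.1)]               # cluster B

print(connected_components(cloud, scale=1.0))  # → 2
```

The count is insensitive to smooth distortion of the data, which is exactly the robustness the topological description provides; the pairwise-distance loop also hints at why the cost grows so quickly with dataset size.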
Using conventional computers, that approach is too demanding for all but the simplest situations. Topological analysis “represents a crucial way of getting at the significant features of the data, but it’s computationally very expensive,” which is where quantum mechanics kicks in. The new quantum-based approach could exponentially speed up such calculations.