Quantum computer coding in silicon; supercomputing feats; revolutionizing the camera.
Silicon-based quantum computer coding
With the goal of removing lingering doubts that quantum computers can become a reality, researchers at the University of New South Wales (UNSW) have demonstrated – with what they say is the highest score ever obtained in such a test – that a quantum version of computer code can be written and manipulated using two quantum bits in a silicon microchip, showing that silicon can be the foundation for a powerful quantum computer.
The quantum code is built on quantum entanglement, a phenomenon that allows seemingly counterintuitive effects such as the measurement of one particle instantly affecting another – even if the two are at opposite ends of the universe, the researchers explained.
Professor Andrea Morello, of the School of Electrical Engineering & Telecommunications at UNSW and Program Manager in the Centre for Quantum Computation & Communication Technology, who led the research, said, “This effect is famous for puzzling some of the deepest thinkers in the field, including Albert Einstein, who called it ‘spooky action at a distance.’ Einstein was skeptical about entanglement, because it appears to contradict the principle of ‘locality’, which means that objects cannot be instantly influenced from a distance.”
Since then, physicists have struggled to establish a clear boundary between our everyday world – which is governed by classical physics – and the strangeness of the quantum world, the researchers said. For the past 50 years, the best guide to that boundary has been a theorem called Bell’s Inequality, which states that no local description of the world can reproduce all of the predictions of quantum mechanics.
Verifying that two particles are actually entangled demands a very stringent test known as the ‘Bell test’, named for John Bell, the British physicist who devised the theorem in 1964. The key aspect of the Bell test is that it is extremely unforgiving: any imperfection in the preparation, manipulation and read-out protocol will cause the particles to fail the test – yet the UNSW researchers’ qubits passed.
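For context, the most widely used experimental form of Bell’s inequality is the CHSH variant (the source does not specify which variant the UNSW team used); here E(a,b) denotes the correlation measured between the two particles for detector settings a and b:

```latex
% CHSH form of Bell's inequality -- an illustrative example; the source
% does not name the exact variant used in the UNSW experiment.
\[
  S = E(a,b) - E(a,b') + E(a',b) + E(a',b'),
  \qquad
  |S| \le 2 \ \text{(any local theory)},
  \qquad
  |S| \le 2\sqrt{2} \ \text{(quantum mechanics)}
\]
```

Any local hidden-variable theory keeps |S| at or below 2, while entangled quantum particles can push it as high as 2√2 ≈ 2.83, so a measured value above 2 certifies entanglement that no classical description can mimic.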
The significance of the UNSW experiment is that creating these two-particle entangled states is tantamount to writing a type of computer code that does not exist in everyday computers. It therefore demonstrates the ability to write a purely quantum version of computer code, using two quantum bits in a silicon microchip – a key plank in the quest for the super-powerful quantum computers of the future.
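As a loose illustration of what “writing quantum code” means, the sketch below builds the textbook two-qubit entangled (Bell) state in a toy NumPy state-vector simulator. It is a hypothetical illustration only, not the UNSW team’s actual control sequence on silicon qubits:

```python
import numpy as np

# Toy state-vector sketch of two-qubit entanglement (a Bell state).
# Illustrative only -- NOT the UNSW team's pulse sequence.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)

# CNOT with qubit 0 as control, qubit 1 as target
# (basis ordering |00>, |01>, |10>, |11>)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1, 0, 0, 0], dtype=complex)  # start in |00>
state = np.kron(H, I) @ state                  # superpose qubit 0
state = CNOT @ state                           # entangle the pair

print(state)  # (|00> + |11>)/sqrt(2): measuring one qubit fixes the other
```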
Supercomputing feats
Two research groups from ETH Zurich competed in the finals for the high-performance computing-focused Gordon Bell Prize, quite an achievement considering only five teams were preselected worldwide. Simulations on supercomputers support experimental research in many scientific fields, but the calculations are normally hugely complex and time-consuming, even for the most powerful supercomputers. Examples include the simulation of nanoscale electronic components such as transistors, and the simulation of blood flow through microscopic channels to separate metastasis-building tumor cells from red blood cells.
In order to simulate physical processes in a reasonable time, it is essential to have not only the most powerful supercomputers but, above all, highly efficient software. Both of the ETH teams used the CSCS supercomputer Piz Daint to refine and optimize their simulation software. The simulations subsequently carried out on Titan, one of the fastest supercomputers in the world, at the Oak Ridge Leadership Computing Facility of Oak Ridge National Laboratory in Tennessee, were what earned the two ETH teams their nominations.
ETH professors Mathieu Luisier of the Institute for Integrated Systems and Joost VandeVondele, head of the Nanoscale Simulations Group, and their teams combined two software programs and developed a new algorithm to maximize throughput on hybrid computer systems consisting of conventional CPUs and GPUs. This allowed them not only to cut the time needed to simulate nanoscale devices by a factor of 50, but also to increase the size of the simulated nanocomponents to over 50,000 atoms – a number corresponding to the actual size of a nanodevice. Until now, they said, even the most CPU-intensive models typically could only simulate systems of at most 1,000 atoms.
When simulating the behavior of nanocomponents, the quantum effects that occur in these tiny components have to be taken into account. To this end, Luisier has been working for over ten years on software called OMEN; for this project, he joined forces with VandeVondele, who helped write and develop the CP2K code.
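The core idea behind hybrid CPU/GPU throughput optimization is to keep both processor types busy simultaneously rather than running their work back to back. The deliberately generic sketch below illustrates that overlap; the function names and timings are hypothetical and do not reflect the actual OMEN/CP2K algorithm:

```python
from concurrent.futures import ThreadPoolExecutor
import time

# Hypothetical sketch of CPU/GPU overlap: dispatch "GPU" work
# asynchronously, then do "CPU" work while it runs, instead of
# idling one processor type while the other computes.

def cpu_task(item):
    time.sleep(0.01)          # stand-in for CPU-bound work (e.g., setup)
    return f"cpu:{item}"

def gpu_task(item):
    time.sleep(0.02)          # stand-in for a GPU kernel launch + wait
    return f"gpu:{item}"

items = range(8)
with ThreadPoolExecutor() as pool:
    gpu_futures = [pool.submit(gpu_task, i) for i in items]  # async dispatch
    cpu_results = [cpu_task(i) for i in items]               # overlapped CPU work
    gpu_results = [f.result() for f in gpu_futures]          # collect GPU results

print(cpu_results, gpu_results)
```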
The second ETH research team nominated for the Gordon Bell Prize is led by ETH Professor Petros Koumoutsakos of the Computational Science & Engineering Laboratory (CSElab). In collaboration with researchers from the Università della Svizzera italiana, the United States (Brown University, NVIDIA) and Italy (CNR and University of Rome), the team used the Titan supercomputer to run state-of-the-art simulations of the flow of micron-sized red blood cells and tumor cells through microfluidic channels. According to the researchers, the geometric detail and the number of cells simulated were improved by two orders of magnitude, redefining the boundaries for simulating flows through micron-scale devices. This allowed the researchers to emulate the laboratory (lab-on-a-chip) experiments of other researchers investigating how tumor cells could be filtered out of the blood to enable the early diagnosis of metastatic cancer.
The “in silico lab-on-a-chip”, as Koumoutsakos calls the simulation, reproduces and helps further optimize laboratory experiments, and aids in designing novel lab-on-a-chip devices that are essential to researchers in the pharmaceutical industry.
Revolutionizing the camera
Marking a breakthrough in camera technology, Rice University researchers have developed patented prototypes of FlatCam, which consists of little more than a thin sensor chip with a mask that replaces lenses in a traditional camera.
The researchers assert that what makes this practical are sophisticated computer algorithms that process what the sensor detects and convert the sensor measurements into images and videos.
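FlatCam’s published design uses a separable coded mask, so the sensor reading Y relates to the scene X roughly as Y = Φ_L·X·Φ_R^T, and the scene is recovered by solving a regularized least-squares problem. The NumPy sketch below illustrates that recovery under simplified assumptions (random binary masks, a plain Tikhonov/SVD solver, toy sizes); it is not Rice’s actual reconstruction code:

```python
import numpy as np

# Toy model of lensless-camera image recovery with a separable mask:
#   Y = PhiL @ X @ PhiR.T + noise
# Masks, sizes, and the simple ridge solver are illustrative assumptions.

rng = np.random.default_rng(0)
n = 32                                        # scene is n x n
m = 48                                        # sensor is m x m (overdetermined)

PhiL = rng.choice([0.0, 1.0], size=(m, n))    # binary mask, row mixing
PhiR = rng.choice([0.0, 1.0], size=(m, n))    # binary mask, column mixing
X = rng.random((n, n))                        # unknown scene

Y = PhiL @ X @ PhiR.T + 0.01 * rng.standard_normal((m, m))

def ridge_pinv(Phi, lam=1e-2):
    # Tikhonov-regularized pseudo-inverse via SVD: V diag(s/(s^2+lam)) U^T
    U, s, Vt = np.linalg.svd(Phi, full_matrices=False)
    return Vt.T @ np.diag(s / (s**2 + lam)) @ U.T

# Invert the separable model: X_hat = PhiL^+ @ Y @ (PhiR^+)^T
X_hat = ridge_pinv(PhiL) @ Y @ ridge_pinv(PhiR).T

print("relative error:", np.linalg.norm(X_hat - X) / np.linalg.norm(X))
```

Because the mask is separable, the reconstruction only needs two small matrix pseudo-inverses rather than one enormous one, which is what keeps the computation tractable.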
FlatCams can be fabricated like microchips, with the precision, speed, and reduced costs that entails. Without lenses, the most recent prototype is thinner than a dime.
The researchers explained that as traditional cameras get smaller, their sensors also get smaller, meaning they collect very little light; a camera’s low-light performance is tied to the surface area of its sensor. Unfortunately, since all camera designs are basically cubes, surface area is tied to thickness. The FlatCam design decouples the two parameters, making it possible to exploit the enhanced light-collecting ability of a large sensor in a very thin device.