Photonic-electronic hardware processes 3D data; HSM built with open tools; erasing quantum errors.
Researchers from the University of Oxford, University of Muenster, University of Heidelberg, and University of Exeter are developing integrated photonic-electronic hardware capable of processing three-dimensional data, which the team claims boosts data processing parallelism for AI tasks.
The researchers added an extra parallel dimension to their previously developed photonic matrix-vector multiplier chips by using multiple radio frequencies to encode the data.
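To see roughly how radio-frequency encoding adds a parallel dimension, note that a matrix-vector multiplier is linear: several input vectors, each modulated onto its own RF carrier, can pass through the same weight matrix at once and be separated again by demodulation. The sketch below is a minimal numerical illustration of that principle, not the team's actual signal chain; the carrier frequencies, sample rate, and core size are assumptions for illustration.

```python
import numpy as np

# Minimal sketch of RF-multiplexed matrix-vector multiplication.
# All parameters are illustrative assumptions, not values from the paper.
rng = np.random.default_rng(0)
n_in, n_out = 6, 6                        # core size (illustrative)
carriers_hz = [1e6, 2e6, 3e6]             # assumed RF subcarriers
fs, n_samples = 20e6, 20_000              # assumed sample rate, 1 ms window
t = np.arange(n_samples) / fs

W = rng.normal(size=(n_out, n_in))                    # weights set in the core
inputs = rng.normal(size=(len(carriers_hz), n_in))    # one vector per carrier

# Each input vector modulates its own carrier; the composite signal passes
# through the same linear weight matrix for all carriers simultaneously.
composite = sum(np.outer(x, np.cos(2 * np.pi * f * t))
                for x, f in zip(inputs, carriers_hz))     # shape (n_in, T)
core_output = W @ composite                               # shape (n_out, T)

# Demodulating against each carrier recovers that vector's product W @ x,
# because the carriers are mutually orthogonal over the window.
for x, f in zip(inputs, carriers_hz):
    ref = np.cos(2 * np.pi * f * t)
    recovered = core_output @ ref / np.sum(ref ** 2)
    assert np.allclose(recovered, W @ x)
```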
“We previously assumed that using light instead of electronics could increase parallelism only by the use of different wavelengths but then we realized that using radio frequencies to represent data opens up yet another dimension, enabling superfast parallel processing for emerging AI hardware,” said Bowei Dong of the Department of Materials, University of Oxford.
As a test case, the team applied the hardware to assessing the risk of sudden death from electrocardiograms of heart disease patients. They analyzed 100 electrocardiogram signals simultaneously, identifying the risk of sudden death with 93.5% accuracy.
The researchers further estimated that even at a moderate scale of 6 inputs × 6 outputs, the approach could outperform state-of-the-art electronic processors, potentially providing a 100-fold improvement in energy efficiency and compute density. The team anticipates further gains in computing parallelism by exploiting additional degrees of freedom of light, such as polarization and mode multiplexing.
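The parallelism argument is multiplicative: each additional degree of freedom multiplies the number of multiply-accumulate operations a single pass through the core delivers. A back-of-envelope count, using only the 6 × 6 core size mentioned above and assumed channel counts for the other dimensions:

```python
# Back-of-envelope parallelism count. Only the 6 x 6 core size comes from
# the article; the channel counts below are assumptions for illustration.
n_inputs, n_outputs = 6, 6       # spatial dimension of the core
n_wavelengths = 4                # assumed WDM channels
n_rf_carriers = 4                # assumed RF subcarriers per wavelength

# One optical pass performs a multiply-accumulate per weight, replicated
# across every wavelength / RF-carrier combination.
macs_per_pass = n_inputs * n_outputs * n_wavelengths * n_rf_carriers
print(macs_per_pass)             # 576 concurrent MACs from a 6 x 6 core
```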
Dong, B., Aggarwal, S., Zhou, W. et al. Higher-dimensional processing using a photonic tensor core with continuous-time data. Nat. Photon. (2023). https://doi.org/10.1038/s41566-023-01313-x
The HEP research consortium led by the Leibniz Institute for High Performance Microelectronics used open EDA tools to define, design, and manufacture a prototype hardware security module (HSM) chip within two years. The HSM includes a crypto accelerator and tamper-resistant security functions.
The development tools used in the process were integrated into a common development environment and expanded to add missing functionality. This included extending the open hardware description language SpinalHDL to enable semi-automated implementation of security properties, formally verifying the RISC-V-based VexRiscv processor, and developing an open-source crypto accelerator.
The team also developed a semi-automated, open masking tool to prevent cryptographic calculations from being observed through side channels.
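Masking, in rough outline, splits every secret intermediate value into random shares so that no single observable signal correlates with the secret. The sketch below is a simplified first-order Boolean masking example in Python, not the project's tool or its hardware flow; the function names are illustrative.

```python
import secrets

# First-order Boolean masking sketch: a secret value x is represented by two
# shares (x0, x1) with x = x0 ^ x1, so neither share alone reveals x.
def mask(x: int, bits: int = 8) -> tuple[int, int]:
    r = secrets.randbits(bits)
    return r, x ^ r

def unmask(shares: tuple[int, int]) -> int:
    x0, x1 = shares
    return x0 ^ x1

# Linear operations (XOR) act share-wise and never recombine the secret.
def masked_xor(a, b):
    return a[0] ^ b[0], a[1] ^ b[1]

# Nonlinear operations (AND) need fresh randomness to keep the shares
# uniform; this is a standard two-share masked AND gadget.
def masked_and(a, b, bits: int = 8):
    r = secrets.randbits(bits)
    c0 = (a[0] & b[0]) ^ r
    c1 = (a[0] & b[1]) ^ (a[1] & b[0]) ^ (a[1] & b[1]) ^ r
    return c0, c1

x, y = mask(0x5A), mask(0x3C)
assert unmask(masked_xor(x, y)) == 0x5A ^ 0x3C
assert unmask(masked_and(x, y)) == 0x5A & 0x3C
```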
The work lays the foundation for the first European PDK specifically designed for open tools, according to the researchers, who adapted the OpenLane open toolchain, which converts a hardware description into a manufacturable chip layout, for a European fab process.
The manufactured security chip works, but fully open security products still lack an open non-volatile memory and an open physical random number generator. The project partners are working on solutions for both. The code for deploying the design on an FPGA has been made publicly available.
Fabian Buschkowski et al., EasiMask: Towards Efficient, Automated, and Secure Implementation of Masking in Hardware, 2023 Design, Automation & Test in Europe Conference & Exhibition (DATE) (2023). https://dx.doi.org/10.23919/DATE56975.2023.10137330
Arnd Weber et al., Verified Value Chains, Innovation and Competition, 2023 IEEE International Conference on Cyber Security and Resilience (CSR) (2023). https://dx.doi.org/10.1109/CSR57506.2023.10224911
Researchers from Caltech demonstrated a way to pinpoint and correct quantum computing mistakes known as “erasure” errors.
“It’s normally very hard to detect errors in quantum computers, because just the act of looking for errors causes more to occur,” said Adam Shaw, a graduate student at Caltech. “But we show that with some careful control, we can precisely locate and erase certain errors without consequence, which is where the name erasure comes from.”
The team focused on quantum computers based on arrays of neutral atoms. Specifically, they manipulated individual alkaline-earth neutral atoms confined inside “tweezers” made of laser light. The atoms were excited to high-energy Rydberg states, in which neighboring atoms start interacting.
“The atoms in our quantum system talk to each other and generate entanglement,” said Pascal Scholl, a former postdoctoral scholar at Caltech now at Pasqal. “However, nature doesn’t like to remain in these quantum entangled states. Eventually, an error happens, which breaks the entire quantum state. These entangled states can be thought of as baskets full of apples, where the atoms are the apples. With time, some apples will start to rot, and if these apples are not removed from the basket and replaced by fresh ones, all the apples will rapidly become rotten. It is not clear how to fully prevent these errors from happening, so the only viable option nowadays is to detect and correct them.”
The new error-catching system is designed so that erroneous atoms fluoresce, or light up, when hit with a laser. “We have images of the glowing atoms that tell us where the errors are, so we can either leave them out of the final statistics or apply additional laser pulses to actively correct them,” Scholl added.
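A toy model shows why flagged errors are far more benign than silent ones: if most failures announce themselves, those shots can simply be dropped from the final statistics, leaving only the rare unflagged failures. The Python sketch below uses made-up rates, not the experiment's measured values.

```python
import random

# Toy model of erasure conversion: each entangling attempt fails with
# probability p_err, and a fraction f_flagged of failures is "converted"
# into a detectable erasure (the faulty atom fluoresces in the image).
# Rates are illustrative only, not the paper's measured values.
random.seed(1)
p_err, f_flagged, shots = 0.01, 0.9, 200_000

kept = kept_bad = 0
for _ in range(shots):
    failed = random.random() < p_err
    flagged = failed and random.random() < f_flagged
    if flagged:
        continue           # erasure detected: drop the shot from the statistics
    kept += 1
    kept_bad += failed     # silent failures still contaminate the kept data

print(f"raw failure rate:         {p_err:.4f}")
print(f"rate after postselection: {kept_bad / kept:.4f}")  # ~ p_err * (1 - f_flagged)
```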
By locating and removing errors in their Rydberg atom system, the team claims it can improve the overall rate of entanglement. In the study, only one in 1,000 pairs of atoms failed to become entangled, a factor-of-10 improvement over what was achieved previously.
Scholl, P., Shaw, A.L., Tsai, R.B.-S. et al. Erasure conversion in a high-fidelity Rydberg quantum simulator. Nature 622, 273–278 (2023). https://doi.org/10.1038/s41586-023-06516-4