How Hardware Can Bias AI Data


Clean data is essential to good results in AI and machine learning, but data can become biased and less accurate at multiple stages in its lifetime—from the moment it is generated all the way through to when it is processed—and this can happen in ways that are not always obvious and are often difficult to discern. Blatant data corruption produces erroneous results that are relatively easy to ident... » read more

Nvidia’s Top Technologists Discuss The Future Of GPUs


Semiconductor Engineering sat down to discuss the role of the GPU in artificial intelligence, autonomous and assisted driving, advanced packaging and heterogeneous architectures with Bill Dally, Nvidia’s chief scientist, and Jonah Alben, senior vice president of Nvidia’s GPU engineering, at IEEE’s Hot Chips 2019 conference. What follows are excerpts of that conversation. SE: There are ... » read more

System Bits: Aug. 5


Algorithm could advance quantum computing Scientists at the Los Alamos National Laboratory report the development of a quantum computing algorithm that promises to provide a better understanding of the quantum-to-classical transition, enabling model systems for biological proteins and other advanced applications. “The quantum-to-classical transition occurs when you add more and more parti... » read more

EUV, Deep Learning Issues In Mask Making


Semiconductor Engineering sat down to discuss extreme ultraviolet (EUV) lithography, photomask technologies and machine learning issues with Emily Gallagher, principal member of the technical staff at Imec; Harry Levinson, principal at HJL Lithography; Chris Spence, vice president of advanced technology development at ASML; Banqiu Wu, senior director of process development at Applied Materials;... » read more

Rethinking What Goes On A Chip


There are hints across the chip industry that chipmakers are beginning to reexamine one of the basic concepts of chip design. For more than 50 years, progress in semiconductors was measured by the ability to double the density of transistors on a piece of silicon. While that approach continues to be useful, the power and performance benefits have been dwindling for the past couple of nodes. ... » read more

Factoring Reliability Into Chip Manufacturing


Making chips that can last two decades is possible, even when they are developed at advanced process nodes and subjected to extreme environmental conditions, such as under the hood of a car or on top of a light pole. But doing that at the same price point as chips that go into consumer electronics, which are designed to last two to four years, is a massively complex challenge. Until a couple of y... » read more

Deep Learning Models With MATLAB And Cortex-A


Today, I’ve teamed up with Ram Cherukuri of MathWorks to provide an overview of the MathWorks toolchain for machine learning (ML) and the deployment of embedded ML inference on Arm Cortex-A using the Arm Compute Library. MathWorks enables engineers to get started quickly and makes machine learning possible without having to become an expert. If you’re an algorithm engineer interested ... » read more

Week in Review: IoT, Security, Auto


Products/Services Achronix Semiconductor selected the Rambus GDDR6 PHY for its next-generation Speedster7t line of field-programmable gate arrays. The Rambus GDDR6 PHY is used in advanced driver-assistance systems, artificial intelligence, graphics, machine learning, and networking applications. Arm and Marvell Technology Group will work together on design and development of Marvell’s nex... » read more

DAC 2019: Day 3


Two keynotes got day three of DAC started. The first was by John Cohn of the Massachusetts Institute of Technology & IBM Watson AI Lab. "I am a nerd. Look back 100 years in processing. We have gone from mechanical computing to where we are today, but it has not been a smooth curve. There are smooth places and then discontinuities. This is when what you were working on no longer works. How we make tho... » read more

System Bits: April 8


Computers trained to design materials Researchers in the University of Missouri’s College of Engineering are applying deep learning technology to educate high-performance computers in the field of materials science, with the goal of having those computers design billions of potential materials. “You can train a computer to do what it would take many years for people to otherwise do,” ... » read more