Five DAC Keynotes


The end of Moore's Law may be about to usher in a new golden age for design, one fueled largely by artificial intelligence and machine learning. But design will become task-, application- and domain-specific, and will require that we think about product lifecycles in a different way. In the future, we also will have to design for augmentation of experience, not just automation... » read more

Preparing For A 5G World


Semiconductor Engineering sat down to talk about challenges and progress in 5G with Yorgos Koutsoyannopoulos, president and CEO of Helic; Mike Fitton, senior director of strategic planning and business development at Achronix; Sarah Yost, senior product marketing manager at National Instruments; and Arvind Vel, director of product management at ANSYS. What follows are excerpts of that conversat... » read more

Reconfigurable AI Building Blocks For SoCs And MCUs


FPGA chips are in use in many AI applications today, including cloud data centers. Embedded FPGA (eFPGA) is now being used for AI applications as well. Our first public customer doing AI with EFLX eFPGA is Harvard University, which will present a paper at Hot Chips on August 20th on edge AI processing using EFLX: "A 16nm SoC with Efficient and Flexible DNN Acceleration for Intelligent IoT Devi... » read more

7nm Design Challenges


Ty Garibay, CTO at ArterisIP, talks about the challenges of moving to 7nm, who’s likely to head there, how long it will take to develop chips at that node, and why it will be so expensive. This also raises questions about whether chips will begin to disaggregate at 7nm and 5nm. https://youtu.be/ZqCAbH678GE » read more

Machine Learning’s Limits


Semiconductor Engineering sat down with Rob Aitken, an Arm fellow; Raik Brinkmann, CEO of OneSpin Solutions; Patrick Soheili, vice president of business and corporate development at eSilicon; and Chris Rowen, CEO of Babblelabs. What follows are excerpts of that conversation. To view part one, click here. SE: How much of what goes wrong in machine learning depends on the algorithm being wrong... » read more

Syntiant: Analog Deep Learning Chips


Startup Syntiant Corp. is an Irvine, Calif.-based semiconductor company led by former top Broadcom engineers with experience both in innovative design and in producing chips in the billions of units, according to company CEO Kurt Busch. The chip they'll be building is an inference accelerator designed to run deep-learning processes 50x more efficiently than traditional stored-prog... » read more

IBM Takes AI In Different Directions


Jeff Welser, vice president and lab director at IBM Research Almaden, sat down with Semiconductor Engineering to discuss what's changing in artificial intelligence and what challenges still remain. What follows are excerpts of that conversation. SE: What's changing in AI and why? Welser: The most interesting thing in AI right now is that we've moved from narrow AI, where we've proven you... » read more

Deep Learning Neural Networks Drive Demands On Memory Bandwidth


A deep neural network (DNN) is a system modeled on our current understanding of biological neural networks in the brain. DNNs are finding use in many applications, advancing at a fast pace, pushing the limits of existing silicon, and impacting the design of new computing architectures. Figure 1 shows a very basic form of neural network that has several nodes in each layer that ... » read more
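
To make the bandwidth pressure concrete, here is a minimal back-of-the-envelope sketch (not taken from the article; the layer sizes, weight precision and inference rate below are illustrative assumptions): even a small fully connected network must fetch its entire weight set for every inference, and that traffic scales directly with the inference rate whenever the weights do not fit in on-chip memory.

    # Illustrative sketch: weight traffic and memory bandwidth for a tiny
    # fully connected DNN. All numbers are assumptions, not from the article.

    layer_sizes = [784, 256, 128, 10]   # nodes per layer (MNIST-style example)
    bytes_per_weight = 2                # assuming 16-bit weights

    # Each pair of adjacent layers contributes an (n_in x n_out) weight matrix.
    weights = sum(n_in * n_out for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))
    weight_bytes = weights * bytes_per_weight

    inferences_per_second = 1000        # hypothetical real-time workload
    bandwidth = weight_bytes * inferences_per_second  # bytes/s if weights stream from DRAM

    print(f"weights per inference : {weights:,}")
    print(f"weight traffic        : {weight_bytes / 1e3:.1f} kB per inference")
    print(f"required bandwidth    : {bandwidth / 1e6:.1f} MB/s at {inferences_per_second} inf/s")

Even this toy network needs roughly half a megabyte of weight reads per inference; production DNNs are orders of magnitude larger, which is where the bandwidth demands the article discusses come from.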

Where The Rubber Hits The Road: Implementing Machine Learning On Silicon


Machine learning (ML) is everywhere these days. The common thread between advanced driver-assistance systems (ADAS) vision applications in our cars and the voice (and now facial) recognition applications in our phones is that ML algorithms are doing the heavy lifting, or more accurately, the inferencing. In fact, neural networks (NN) can even be used in application spaces such as file compressi... » read more

Finding And Fixing ML’s Flaws


OneSpin CEO Raik Brinkmann sat down with Semiconductor Engineering to discuss how to make machine learning more robust, predictable and consistent, and new ways to identify and fix problems that may crop up as these systems are deployed. What follows are excerpts of that conversation. SE: How do we make sure devices developed with machine learning behave as they're supposed to, and how do we... » read more
