Scalable, Cloud-Ready IC Validator Solution For Advanced DRC Nodes


As we move to a data-centric world, semiconductor companies across the globe are working at a furious pace to develop and manufacture artificial intelligence (AI) chips. AI is built on algorithms that mimic a human's ability to learn and decide. For example, AI can be used to interpret and understand an image, helping a doctor make a better diagnosis for a patient. This requires chips to... » read more

Architecting For AI


Semiconductor Engineering sat down to talk about what is needed today to enable artificial intelligence training and inferencing with Manoj Roge, vice president, strategic planning at Achronix; Ty Garibay, CTO at Arteris IP; Chris Rowen, CEO of Babblelabs; David White, distinguished engineer at Cadence; Cheng Wang, senior VP engineering at Flex Logix; and Raik Brinkmann, president and CEO of O... » read more

Machine Learning’s Limits


Semiconductor Engineering sat down with Rob Aitken, an Arm fellow; Raik Brinkmann, CEO of OneSpin Solutions; Patrick Soheili, vice president of business and corporate development at eSilicon; and Chris Rowen, CEO of Babblelabs. What follows are excerpts of that conversation. To view part one, click here. SE: How much of what goes wrong in machine learning depends on the algorithm being wrong... » read more

IBM Takes AI In Different Directions


Jeff Welser, vice president and lab director at IBM Research Almaden, sat down with Semiconductor Engineering to discuss what's changing in artificial intelligence and what challenges still remain. What follows are excerpts of that conversation. SE: What's changing in AI and why? Welser: The most interesting thing in AI right now is that we've moved from narrow AI, where we've proven you... » read more

Where The Rubber Hits The Road: Implementing Machine Learning On Silicon


Machine learning (ML) is everywhere these days. The common thread between advanced driver-assistance systems (ADAS) vision applications in our cars and the voice (and now facial) recognition applications in our phones is that ML algorithms are doing the heavy lifting, or more accurately, the inferencing. In fact, neural networks (NN) can even be used in application spaces such as file compressi... » read more

When AI Goes Awry


The race is on to develop intelligent systems that can drive cars, diagnose and treat complex medical conditions, and even train other machines. The problem is that no one is quite sure how to diagnose latent or less-obvious flaws in these systems—or better yet, to prevent them from occurring in the first place. While machines can do some things very well, it's still up to humans to devise... » read more

What’s Next In Neuromorphic Computing


To integrate devices into functioning systems, it's necessary to consider what those systems are actually supposed to do. Regardless of the application, machine learning tasks involve a training phase and an inference phase. In the training phase, the system is presented with a large dataset and learns how to "correctly" analyze it. In supervised learning, the data... » read more

Customizing Power And Performance


Designing chips is getting more difficult, and not just for the obvious technical reasons. The bigger issue revolves around what these chips are going to be used for, and how they will be used, both by the end user and in the context of other electronics. This was a pretty simple decision when hardware was developed somewhat independently of software, such as in the PC era. Technology generally d... » read more

Bridging Machine Learning’s Divide


There is a growing divide between those researching machine learning (ML) in the cloud and those trying to perform inferencing using limited resources and power budgets. Researchers are using the most cost-effective hardware available to them, which happens to be GPUs filled with floating point arithmetic units. But this is an untenable solution for embedded infere... » read more

Move Data Or Process In Place?


Should data move to available processors or should processors be placed close to memory? That is a question the academic community has been looking at for decades. Moving data is one of the most expensive and power-consuming tasks, and is often the limiter to system performance. Within a chip, Moore's Law has enabled designers to physically move memory closer to processing, and that has rema... » read more
