Why Scaling Must Continue


The entire semiconductor industry has realized that the economics of scaling logic are gone. By any metric, whether price per transistor, price per watt, or price per unit area of silicon, the economics are no longer in the plus column. So why continue? The answer is more complicated than it first appears. This isn't just about inertia and continuing to miniaturize what was proven in t... » read more

Powering The Edge: Driving Optimal Performance With the Arm ML Processor


On-device machine learning (ML) processing is already happening in more than 4 billion smartphones. As the adoption of connected devices continues to grow exponentially, the resulting data explosion means cloud processing could soon become an expensive and high-latency luxury. The Arm ML processor is defining the future of ML inference at the edge, allowing smart devices to make independent... » read more

Where Should Auto Sensor Data Be Processed?


Fully autonomous vehicles are coming, but not as quickly as the initial hype suggested, because a long list of technological issues still needs to be resolved. One of the most basic is how to process the tremendous amount of data coming from the variety of sensors in a vehicle, including cameras, radar, LiDAR, and sonar. That data is the dig... » read more

Semiconductor’s Dinosaurs


Dinosaurs once ruled this planet. They existed in every shape and form – some large, others tiny. Each adapted to its own specific environment. Some stayed on land, others went to sea, and yet another group took to the skies. They looked invincible, the pinnacle of the food chain. Then a cataclysmic event happened, and dinosaurs went into a fairly rapid decline.... » read more

Power Is Limiting Machine Learning Deployments


The total amount of power consumed for machine learning tasks is staggering. Until a few years ago we did not have computers powerful enough to run many of the algorithms, but the repurposing of the GPU gave the industry the horsepower it needed. The problem is that the GPU is not well suited to the task, and most of the power consumed is wasted. While machine learning has provided many ... » read more

What’s Powering Artificial Intelligence


To scale artificial intelligence (AI) and machine learning (ML), hardware and software developers must enable AI/ML performance across a vast array of devices. This requires balancing the need for functionality alongside security, affordability, complexity and general compute needs. Fortunately, there’s a solution hiding in plain sight. To read more, click here (scroll down to "Download No... » read more

Speeding Up AI


Robert Blake, president and CEO of Achronix, sat down with Semiconductor Engineering to talk about AI, which processors work best where, and different approaches to accelerate performance. SE: How is AI affecting the FPGA business, given the constant changes in algorithms and the proliferation of AI almost everywhere? Blake: As we talk to more and more customers deploying new products and... » read more

Building An Efficient Inferencing Engine In A Car


David Fritz, who heads corporate strategic alliances at Mentor, a Siemens Business, talks about how to speed up inferencing by taking the input from sensors and quickly classifying the output, but also doing that with low power. » read more

CEO Outlook: It Gets Much Harder From Here


Semiconductor Engineering sat down to discuss what's changing across the semiconductor industry with Wally Rhines, CEO emeritus at Mentor, a Siemens Business; Jack Harding, president and CEO of eSilicon; John Kibarian, president and CEO of PDF Solutions; and John Chong, vice president of product and business development for Kionix. What follows are excerpts of that discussion, which was held in... » read more

Complexity’s Impact On Security


Ben Levine, senior director of product management for Rambus’ Security Division, explains why security now depends on the growing number of components in a design and on the interactions between those components. This is particularly problematic with AI chips, on both the training and inferencing sides, because security problems on the training side can alter the models used for AI inferencing. » read more
