Low-Power Deep Learning Implementation For Automotive ICs


Automotive applications for high-performance, low-power embedded vision processors abound, from in-car driver drowsiness detection to a self-driving car ‘seeing’ the road ahead, with pedestrians, oncoming cars, or the occasional animal crossing it. Implementing deep learning in these types of applications requires a lot of processing power at the lowest possible...

The Power Of Speech


With the widespread use of voice-activated virtual assistants, such as Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana, and the Google Assistant, voice has become an everyday way to interact with electronics. We’re talking to our devices more than ever, using speech to initiate searches, issue commands, and even make purchases. There are a number of reasons why using your voice to ...

SM2: A Deep Neural Network Accelerator In 28nm


Deep learning algorithms present an exciting opportunity for efficient VLSI implementations due to several useful properties: (1) an embarrassingly parallel dataflow graph, (2) significant sparsity in model parameters and intermediate results, and (3) resilience to noisy computation and storage. Exploiting these characteristics can offer significantly improved performance and energy efficiency...
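As a rough illustration of the second property (a sketch, not taken from the paper: the matrix sizes and the 80% pruning rate are arbitrary assumptions), the share of zero-valued weights in a pruned layer bounds how many multiply-accumulate (MAC) operations an accelerator can skip without changing the result:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy weight matrix for one layer; in a real design these come from a
# trained, pruned model. Here we zero out ~80% of entries to mimic pruning.
weights = rng.standard_normal((64, 64))
weights[rng.random(weights.shape) < 0.8] = 0.0

sparsity = np.mean(weights == 0.0)
x = rng.standard_normal(64)

# Dense matrix-vector product: every MAC is performed.
dense_macs = weights.size

# A sparsity-aware datapath only spends MACs on nonzero weights;
# the numerical result is identical either way.
nonzero_macs = np.count_nonzero(weights)
y = weights @ x

print(f"sparsity: {sparsity:.0%}, MACs needed: {nonzero_macs} of {dense_macs}")
```

With the zero MACs skipped, both compute energy and memory traffic scale with the nonzero count rather than the full matrix size, which is where the efficiency gain comes from.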

Why EDA Needs To Change


Why is it taking so long for machine learning to have an impact within EDA? Most of the time when I talk to the experts in the field, I hear about why designs are so different from other machine learning applications, and I know that is true. Many of you reading this may not be aware that I was a developer of EDA tools for more than 35 years before I ended up writing...

RISC-V Gains Its Footing


The RISC-V instruction-set architecture, which started as a UC Berkeley project to improve energy efficiency, is gaining steam across the industry. The RISC-V Foundation's member roster gives an indication of who is behind this effort: members include Google, Nvidia, Qualcomm, Rambus, Samsung, NXP, Micron, IBM, GlobalFoundries, UltraSoC, and Siemens, among many others. One of the key markets for...

Who Will Regulate Technology?


Outside regulation and technological innovation don't mix well, particularly when it comes to modern electronics, but the potential for that kind of oversight is rising. In the past, most of the problems involving regulation stemmed from a lack of understanding about technology and science. This is hardly a new phenomenon; it dates back centuries. Galileo was forced to recant helio...

Regain Your Power With Machine Learning


It wasn’t too long ago that machine learning (ML) seemed like a fascinating research topic. In no time at all, however, it has made a swift transition from a far-off world to a common presence in news, billboards, workplaces, and homes. The concept itself is not new; what has caused it to take off is the rapid growth of data in many applications, combined with greater computational power. Closer...

Debugging Debug


There appears to be an unwritten law about the time spent in debug: it is a constant. It could be that all gains made by improvements in tools and methodologies are offset by increases in complexity, or that the debug process causes design teams to be more conservative. It could be that no matter how much time is spent on debug, the only thing accomplished is to move bugs to places that are less...

Deconstructing Deep Learning


I discuss AI and deep learning a lot these days. The discussion usually comes back to “what is a deep learning chip?” These devices are basically hardware implementations of neural networks. While neural nets have been around for a while, what’s new is the performance that advanced semiconductor technology brings to the party. Applications that function in real time are now possible. But wh...
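To make "hardware implementations of neural networks" concrete, here is a minimal sketch (my own illustration, with arbitrary layer sizes, not from the article) of what such a chip accelerates: an inference pass is little more than layers of multiply-accumulates followed by a simple nonlinearity, repeated massively in parallel:

```python
import numpy as np

def relu(v):
    # The nonlinearity applied after each layer's multiply-accumulates.
    return np.maximum(v, 0.0)

def forward(x, layers):
    """One inference pass. Each layer is a (weights, bias) pair; every
    output element is a multiply-accumulate over the inputs, which is
    exactly the operation a deep learning chip parallelizes in silicon."""
    for w, b in layers:
        x = relu(w @ x + b)
    return x

rng = np.random.default_rng(1)
layers = [
    (rng.standard_normal((16, 8)), np.zeros(16)),   # 8 inputs -> 16 units
    (rng.standard_normal((4, 16)), np.zeros(4)),    # 16 units -> 4 outputs
]
out = forward(rng.standard_normal(8), layers)
print(out.shape)  # (4,)
```

Real-time performance comes from executing these MACs concurrently in dedicated arithmetic arrays rather than sequentially on a general-purpose core.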

Using Data Mining Differently


The semiconductor industry generates a tremendous quantity of data, but until very recently engineers had to sort through it on their own to spot patterns, trends, and aberrations. That's beginning to change as chipmakers develop their own solutions or partner with others to mine this data effectively. Adding some structure and automation around all of this data is long overdue. Data mining h...
