Making Better Use Of Memory In AI


Steven Woo, Rambus fellow and distinguished inventor, talks about using number formats to extend memory bandwidth, what the impact can be on fractional precision, how modifications of precision can play into that without sacrificing accuracy, and what role stochastic rounding can play. » read more
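The teaser mentions stochastic rounding as a way to reduce precision without sacrificing accuracy. As a rough illustration of the idea (not drawn from the interview itself), the sketch below rounds a value up or down at random with probability proportional to its fractional part, so the expected value of the rounded result equals the original value and rounding errors do not accumulate a systematic bias:

```python
import random

def stochastic_round(x: float) -> int:
    """Round x to a neighboring integer at random, with probability
    proportional to its distance from each neighbor. E[result] == x,
    so repeated low-precision updates stay unbiased on average."""
    lower = int(x // 1)              # floor of x
    frac = x - lower                 # fractional part in [0, 1)
    return lower + (1 if random.random() < frac else 0)

# Averaged over many trials, the mean approaches the true value:
random.seed(0)
trials = [stochastic_round(2.3) for _ in range(100_000)]
print(sum(trials) / len(trials))   # close to 2.3
```

This is why stochastic rounding is attractive for low-precision training formats: round-to-nearest always loses small gradient updates in the same direction, while stochastic rounding preserves them in expectation.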

Breaking Down The AI Memory Wall


Over the past few decades, the semiconductor industry has witnessed the rapid evolution of memory technology as new memories helped to usher in new usage models that characterized each decade. For example, synchronous memory helped drive the personal computer (PC) revolution in the 1990s, and this was quickly followed by specialized graphics memory for graphics processing units (GPUs) in game consoles in the 2000s. When sm... » read more

Reducing Software Power


With the slowdown of Moore's Law, every decision made in the past must be re-examined to get more performance or lower power for a given function. So far, software has remained relatively unaffected, but it could be an untapped area for optimization and enable significant power reduction. The general consensus is that new applications such as artificial intelligence and machine learning, whe... » read more

What’s Powering Artificial Intelligence?


While artificial intelligence (AI) and machine learning (ML) applications soar in popularity, many organizations are questioning where ML workloads should be performed. Should they be done on a central processor (CPU), a graphics processor (GPU), or a neural processor (NPU)? The choice most teams are making today will surprise you. To scale artificial intelligence (AI) and machine learning (... » read more

The Critical But Less Obvious Risks In AI


AI has been the subject of intense debate since it was first introduced back in the mid-1950s, but the real threat is a lot more mundane and potentially even more serious than the fear-inducing picture painted by its critics. Replacing jobs with technology has been a controversial subject for more than a century. AI is a relative newcomer in that debate. While the term "artificial intelligen... » read more

How Hardware Can Bias AI Data


Clean data is essential to good results in AI and machine learning, but data can become biased and less accurate at multiple stages in its lifetime—from the moment it is generated all the way through to when it is processed—and it can happen in ways that are not always obvious and are often difficult to discern. Blatant data corruption produces erroneous results that are relatively easy to ident... » read more

Nvidia’s Top Technologists Discuss The Future Of GPUs


Semiconductor Engineering sat down to discuss the role of the GPU in artificial intelligence, autonomous and assisted driving, advanced packaging and heterogeneous architectures with Bill Dally, Nvidia’s chief scientist, and Jonah Alben, senior vice president of Nvidia’s GPU engineering, at IEEE’s Hot Chips 2019 conference. What follows are excerpts of that conversation. SE: There are ... » read more

Week in Review – IoT, Security, Autos


Products/Services

Rambus entered an exclusive agreement to acquire the Silicon IP, Secure Protocols, and Provisioning business from Verimatrix, formerly known as Inside Secure. Financial terms were not revealed. The transaction is expected to close this year. Rambus will use the Verimatrix offerings in such demanding applications as artificial intelligence, automotive, the Internet of Things, ... » read more

Autonomous Vehicles Are Reshaping The Tech World


The effort to build cars that can drive themselves is reshaping the automotive industry and its supply chain, impacting everything from who defines safety to how to ensure quality and reliability. Automakers, which hardly knew the names of their silicon suppliers a couple of years ago, are now banding together in small groups to share the costs and solve technical challenges that are well be... » read more

Machine Learning For Autonomous Drive


Advances in Artificial Intelligence (AI) and Machine Learning (ML) are arguably the biggest technical innovation of the last decade. Although the algorithms for AI have been in existence for many years, the recent explosion of data and faster compute made it possible to apply those algorithms to solve many real-life use cases. One of the most prominent of these use cases is fully aut... » read more
