The Next Disruption


Machine learning (ML) is an inherently disruptive technology because algorithm architectures are evolving rapidly and are highly compute-intensive, requiring innovative silicon for acceptable performance. This blog looks at where we’ve been and where ML is going – into another market ready for disruption. ML started in the data center. In the early days of the ML explosion – a mere 8 o... » read more

MIPI DSI-2 & VESA Video Compression Enable Next-Generation Displays


By Joseph Rodriguez and Simon Bussières It is hard to believe, but it has been 20 years since MIPI Alliance was founded. The organization was originally formed to standardize the video interface technologies for cameras and displays in phones, with the MIPI acronym standing for Mobile Industry Processor Interface. As the mobile industry has evolved, MIPI Alliance has evolved wi... » read more

Disaggregating And Extending Operating Systems


The push toward disaggregation and customization in hardware is starting to be mirrored on the software side, where operating systems are becoming smaller and more targeted, supplemented with additional software that can be optimized for different functions. There are two main causes for this shift. The first is rising demand for highly optimized and increasingly heterogeneous designs, which... » read more

Chiplets Taking Root As Silicon-Proven Hard IP


Chiplets are all the rage today, and for good reason. With the various ways to design a semiconductor-based system today, IP reuse via chiplets appears to be an effective and feasible solution, and a potentially low-cost alternative to shrinking everything to the latest process node. To enable faster time to market, common IP or technology that already has been silicon-proven can be utilized... » read more

Everything, Everywhere, All At Once: Big Data Reimagines Verification Predictability And Efficiency


Big data is a term that has been around for many years. The list of applications for big data is endless, but the process stays the same: capture, process, and analyze. With new, enabling verification solutions, big data technologies can improve your verification process efficiency and predict your next chip sign-off. By providing a big data infrastructure, with state-of-the-art technologies... » read more

How To Raise Reliability, Availability, And Serviceability Levels For HPC SoCs


By Charlie Matar, Rita Horner, and Pawini Mahajan While once the domain of large data centers and supercomputers, high-performance computing (HPC) has become rather ubiquitous and, in some cases, essential in our everyday lives. Because of this, reliability, availability, and serviceability, or RAS, is a concept that more HPC SoC designers should familiarize themselves with. RAS may sound... » read more

Simulating Reality: The Importance Of Synthetic Data In AI/ML Systems For Radar Applications


Artificial intelligence and machine learning (AI/ML) are driving the development of next-generation radar perception. However, these AI/ML-based perception models require enough data to learn patterns and relationships to make accurate predictions on new, unseen data and scenarios. In the field of radar applications, the data used to train these models is often collected from real-world meas... » read more

Secure Device Updates On Matter


There are many who share the Arm vision of smart connected devices enabling rapid innovation in our work and home in the coming years. Such connectivity promises to yield new applications for solving problems and improving lives. But onlookers are keen to see how the industry resolves a large obstacle to the next phase of digital transformation: how to keep these smart devices securely upd... » read more

Unified AI/ML Solution Helps Accelerate Verification Curve


With surging usage requirements and increasing customer demands, hardware design is quickly becoming more complex. Rapidly changing market trends, with a greater focus on technologies such as electric vehicles, drive demand for efficient power management and high-performance processing. Verification throughput continues to be a bottleneck as SoC designs increase in size, and so d... » read more

Improving Verification Predictability And Efficiency Using Big Data


Big data is a term that has been around for decades. It was initially defined as data sets captured, managed, and processed in a tolerable amount of time, beyond the ability of normal software tools. The only constant in big data’s size over this time is that it has been a moving target, driven by improvements in parallel processing power and cheaper storage capacity. Today most of the industry... » read more
