In Memory And Near-Memory Compute


Steven Woo, Rambus fellow and distinguished inventor, talks about the amount of power required to store data and to move it out of memory to where processing is done. Addressing this can include changes to memory, but it also can include rethinking compute architectures from the ground up to achieve up to 1 million times better performance in highly specialized systems. » read more

Week in Review: IoT, Security, Auto


Products/Services Arm rolled out its Flexible Access program, which offers system-on-a-chip design teams the capability to try out the company's semiconductor intellectual property, along with IP from Arm partners, before committing to a license, paying only for what they use in production. The new engagement model is expected to prove useful for Internet of Things design projects and... » read more

Blog Review: July 17


Mentor's John McMillan takes a look at the three general classes that have been established by IPC-2221B to reflect progressive increases in sophistication, functional performance requirements, and testing/inspection frequency for PCBs. Synopsys' Dinesh Siwal and Thenmozhy Kaliyamurthy point out the new features and improvements in DisplayPort 2.0, including greater speeds, better power effi... » read more

GDDR Accelerates Artificial Intelligence And Machine Learning


The origins of modern graphics double data rate (GDDR) memory can be traced back to GDDR3 SDRAM. Designed by ATI Technologies, GDDR3 made its first appearance in Nvidia's GeForce FX 5700 Ultra card, which debuted in 2004. Offering reduced latency and high bandwidth for GPUs, GDDR3 was followed by GDDR4, GDDR5, GDDR5X and the latest generation of GDDR memory, GDDR6. GDDR6 SGRAM supports a ma... » read more

Low-Power Design Becomes Even More Complex


Throughout the SoC design flow, there has been a tremendous amount of research done to ease the pain of managing a long list of power-related issues. And while headway has been made, the addition of new application areas such as AI/ML/DL, automotive and IoT has raised as many new problems as have been solved. The challenges are particularly acute at leading-edge nodes where devices are power... » read more

Protecting Computing Systems In A Post-Meltdown/Spectre World


When Jann Horn of Google’s Project Zero posted a detailed blog titled “Reading privileged memory with a side-channel,” it set off a firestorm of activity as the post confirmed that secret information inside a computer could be accessed via two different attacks, Meltdown and Spectre. Essentially, both attacks utilize CPU data cache timing to efficiently exploit and leak informatio... » read more

Blog Review: July 10


Synopsys' Eric Huang takes a look at how backward compatibility with USB 2.0 is provided when the I/O voltages of new nodes can't support 3.3V signaling, and how eUSB2 can boost the signal and provide support for external or legacy peripherals. In a video, Mentor's Colin Walls explains endianness in embedded systems with a look at what it is, when it matters, and how to accommodate it in code. ... » read more

Blog Review: July 3


Cadence's Paul McLellan digs into 5G with a two-part post explaining the basics of the technology, what makes it so different from 4G, and the challenges ahead, including the limitations of mmWave. Synopsys' Vikramjeet Bamel and Pankaj Sharma note the features that make GDDR6 a dominant memory in the high-performance segment and allow it to expand beyond graphics to automotive, AI, and AR/... » read more

Understanding The Importance Of Silicon Security


Vulnerabilities like Meltdown, Spectre and Foreshadow are understandably considered quite serious by the semiconductor industry, because they can be exploited by a determined attacker to access sensitive data that should be securely locked down but isn't. Consider a cloud-based server running multiple applications that process and store sensitive data. Vulnerabilities lik... » read more

Machine Learning Inferencing Moves To Mobile Devices


It may sound retro for a developer with access to hyperscale data centers to discuss apps that can be measured in kilobytes, but the emphasis increasingly is on small, highly capable devices. In fact, Google staff research engineer Pete Warden points to a new app that uses less than 100 kilobytes of RAM and storage, creates an inference model smaller than 20KB, and is capable of proce... » read more
