GDDR Accelerates Artificial Intelligence And Machine Learning


The origins of modern graphics double data rate (GDDR) memory can be traced back to GDDR3 SDRAM. Designed by ATI Technologies, GDDR3 made its first appearance in Nvidia's GeForce FX 5700 Ultra card, which debuted in 2004. Offering reduced latency and high bandwidth for GPUs, GDDR3 was followed by GDDR4, GDDR5, GDDR5X and the latest generation of GDDR memory, GDDR6. GDDR6 SGRAM supports a ma... » read more
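As a rough illustration of why per-pin data rate matters for GPU memory, peak interface bandwidth can be estimated as data rate per pin times bus width. The figures below are hypothetical examples for the sake of the arithmetic, not specifications quoted from the article:

```python
# Back-of-the-envelope peak bandwidth estimate for a GDDR-style memory interface.
# The data rate and bus width here are illustrative assumptions, not spec values.

def peak_bandwidth_gbytes_per_s(data_rate_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: per-pin rate (Gbps) x bus width (bits) / 8 bits per byte."""
    return data_rate_gbps_per_pin * bus_width_bits / 8

# Example: a 16 Gbps-per-pin device on a 256-bit bus.
print(peak_bandwidth_gbytes_per_s(16, 256))  # 512.0 GB/s
```

The same formula applies across GDDR generations; what changes from GDDR3 to GDDR6 is chiefly the achievable per-pin data rate.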

Low-Power Design Becomes Even More Complex


Throughout the SoC design flow, a tremendous amount of research has been done to ease the pain of managing a long list of power-related issues. And while headway has been made, the addition of new application areas such as AI/ML/DL, automotive and IoT has raised as many new problems as it has solved. The challenges are particularly acute at leading-edge nodes, where devices are power... » read more

What’s Powering Artificial Intelligence


To scale artificial intelligence (AI) and machine learning (ML), hardware and software developers must enable AI/ML performance across a vast array of devices. This requires balancing the need for functionality alongside security, affordability, complexity and general compute needs. Fortunately, there’s a solution hiding in plain sight. To read more, click here (scroll down to "Download No... » read more

System Bits: July 15


Automating bridge inspections with robotics
The University of Waterloo has developed robotics that could be used for the automated inspection of bridges, helping ensure such critical infrastructure is safe and sound. The technology promises to make bridge inspection cheaper and easier. The system collects data for defect detection and analysis through a combination of autonomous robots, cameras,... » read more

How To Improve ML Power/Performance


Raymond Nijssen, vice president and chief technologist at Achronix, talks about the shift from brute-force performance to more power efficiency in machine learning processing, the new focus on enough memory bandwidth to keep MAC functions busy, and how dynamic range, precision and locality can be modified to improve speed and reduce power. » read more
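The trade-off between precision and power that the talk describes can be sketched with a generic reduced-precision example. This is not Achronix's specific technique, just a common illustration: quantizing float32 weights to int8 cuts memory traffic by 4x at the cost of a small, bounded rounding error.

```python
import numpy as np

# Generic illustration of precision reduction for inference (a sketch, not any
# vendor's method): symmetric int8 quantization of float32 weights.

def quantize_int8(weights: np.ndarray):
    """Map float32 weights onto int8 using a single symmetric scale factor."""
    scale = float(np.abs(weights).max()) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights from the int8 representation."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal(1024).astype(np.float32)
q, scale = quantize_int8(w)

print(q.nbytes, w.nbytes)  # 1024 4096 -- int8 storage is 1/4 of float32
# Per-element error is bounded by half a quantization step.
print(float(np.abs(w - dequantize(q, scale)).max()) <= 0.5 * scale + 1e-6)
```

Shrinking the numeric format reduces both the memory bandwidth needed to keep MAC units busy and the energy per operation, which is why dynamic range and precision are the usual first levers for inference efficiency.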

Week in Review: IoT, Security, Auto


Products/Services
Arteris IP reports that Bitmain licensed the Arteris Ncore Cache Coherent Interconnect intellectual property for use in its next-generation Sophon Tensor Processing Unit system-on-a-chip devices for the scalable hardware acceleration of artificial intelligence and machine learning algorithms. "Our choice of interconnect IP became more important as we continued to increase t... » read more

Inferencing Efficiency


Geoff Tate, CEO of Flex Logix, talks with Semiconductor Engineering about how to measure efficiency in inferencing chips, how to achieve the most throughput for the lowest cost, and what the benchmarks really show. » read more

June Startup Funding


During the month of June, 15 startups brought in funding rounds of $100 million or more, as investors continued to chase deals in cybersecurity, automotive technology, semiconductors, and a variety of services. There were no billion-dollar deals as spring slid into summer, yet those 15 companies together raised a total of about $3.13 billion. Aurora Innovation, the developer... » read more

System Bits: July 10


Light waves run on silicon-based chips
Researchers at the University of Sydney's Nano Institute and the Singapore University of Technology and Design collaborated on manipulating light waves on silicon-based microchips so that data remains coherent as it travels thousands of miles over fiber-optic cables. Such waves—whether a tsunami or a photonic packet of information—are known as solitons. The... » read more

HW/SW Design At The Intelligent Edge


Adding intelligence to the edge is a lot more difficult than it might first appear, because it requires an understanding of what gets processed where, based on assumptions about what the edge will actually look like over time. What exactly falls under the heading of Intelligent Edge varies from one person to the next, but all agree it goes well beyond yesterday's simple sensor-based IoT dev... » read more
