Synthesizing Hardware From Software


The ability to automatically generate optimized hardware from software was one of the primary goals of system-level design automation that was never fully achieved. The question now is whether that will ever happen, and whether it is just a matter of having the right technology or motivation to make it possible. While high-level synthesis (HLS) did come out of this work and has proven to be...

Machine Learning Inferencing Moves To Mobile Devices


It may sound retro for a developer with access to hyperscale data centers to discuss apps that can be measured in kilobytes, but the emphasis increasingly is on small, highly capable devices. In fact, Google staff research engineer Pete Warden points to a new app that uses less than 100 kilobytes of RAM and storage, creates an inference model smaller than 20KB, and is capable of proce...

Machine Learning Drives High-Level Synthesis Boom


High-level synthesis (HLS) is experiencing a new wave of popularity, driven by its ability to handle machine-learning matrices and iterative design efforts. The obvious advantage of HLS is the boost in productivity designers get from working in C, C++ and other high-level languages rather than RTL. The ability to design a layout that should work, and then easily modify it to test other confi...

Accelerating Endpoint Inferencing


Chipmakers are getting ready to debut inference chips for endpoint devices, even though the rest of the machine-learning ecosystem has yet to be established. Whatever infrastructure does exist today is mostly in the cloud, on edge-computing gateways, or in company-specific data centers, which most companies continue to use. For example, Tesla has its own data center. So do most major carmake...

Driving AI, ML To New Levels On MCUs


One of the most dramatic recent impacts of technology has been the implementation of artificial intelligence and machine learning on small edge devices, the kind that are forming the backbone of the Internet of Things. At first, this happened through sheer engineering willpower and innovation. But as the drive toward a world of a trillion connected devices accelerates, we must find wa...

April’19 Startup Funding: Corporate Gushers


It was another rich month for startups, large and small. In April’s top 11 funding rounds, five were investments by big corporations or corporate venture capital funds—an investor consortium led by the SoftBank Vision Fund, PayPal, Ford Motor, NTT DoCoMo, and HAPSMobile, a joint venture of SoftBank Group and AeroVironment. Those 11 investments totaled $3.74 billion. Intel Capital was als...

Big Shift In Multi-Core Design


Hardware and software engineers have a long history of working independently of each other, but that insular behavior is changing in emerging areas such as AI, machine learning and automotive as the emphasis shifts to the system level. As these new markets consume more semiconductor content, they are having a big impact on the overall design process. The starting point in many of these desig...

Utilizing More Data To Improve Chip Design


Just about every step of the IC tool flow generates some amount of data. But certain steps generate a mind-boggling amount of data, not all of which is of equal value. The challenge is figuring out what's important for which parts of the design flow. That determines what to extract and loop back to engineers, and when that needs to be done in order to improve the reliability of increasingly com...

The Automation Of AI


Semiconductor Engineering sat down to discuss the role that EDA has in automating artificial intelligence and machine learning with Doug Letcher, president and CEO of Metrics; Daniel Hansson, CEO of Verifyter; Harry Foster, chief scientist verification for Mentor, a Siemens Business; Larry Melling, product management director for Cadence; Manish Pandey, Synopsys fellow; and Raik Brinkmann, CEO ...

Can Debug Be Tamed?


Debug consumes more time than any other aspect of the chip design and verification process, and it adds uncertainty and risk to semiconductor development because there are always lingering questions about whether enough bugs were caught in the allotted amount of time. Recent figures suggest that the problem is getting worse, too, as complexity and demand for reliability continue to rise. The...
