Using Data Mining Differently

The semiconductor industry generates a tremendous quantity of data, but until very recently engineers had to sort through it on their own to spot patterns, trends and aberrations. That's beginning to change as chipmakers develop their own solutions or partner with others to effectively mine this data. Adding some structure and automation around all of this data is long overdue. Data mining h... » read more
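The teaser stops before naming any technique, but spotting aberrations in large test datasets is commonly done with simple robust statistics. Below is a hedged sketch using a modified z-score; the function name, the threshold, and the leakage-current numbers are all hypothetical, chosen only for illustration.

```python
import numpy as np

def flag_aberrations(measurements, thresh=3.5):
    """Flag values far from the population using a median/MAD
    modified z-score (robust to the outlier it is hunting)."""
    vals = np.asarray(measurements, dtype=float)
    med = np.median(vals)
    mad = np.median(np.abs(vals - med))
    if mad == 0:                          # window of identical values
        return np.flatnonzero(vals != med)
    score = 0.6745 * (vals - med) / mad   # modified z-score
    return np.flatnonzero(np.abs(score) > thresh)

# Hypothetical leakage-current readings (uA) from one test lot
lot = [1.02, 0.98, 1.01, 0.99, 1.03, 4.70, 1.00]
print(flag_aberrations(lot))  # -> [5]
```

The median/MAD form is used here instead of mean/standard deviation because in small samples a single large outlier inflates the standard deviation enough to hide itself from a plain z-score test.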

Data Buffering’s Role Grows

Data buffering is gaining ground as a way to speed up the processing of increasingly large quantities of data. In simple terms, a data buffer is an area of physical memory storage that temporarily stores data while it is being moved from one place to another. This becomes increasingly necessary in data centers, autonomous vehicles, and for ... » read more
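The teaser defines a buffer only in general terms; as a concrete sketch, here is a minimal fixed-capacity ring buffer in Python. The `RingBuffer` class and its `write`/`read` API are hypothetical names for illustration, not anything from the article.

```python
from collections import deque

class RingBuffer:
    """Minimal fixed-capacity buffer: a producer appends data while a
    consumer drains it, decoupling the two transfer rates."""

    def __init__(self, capacity):
        self._items = deque(maxlen=capacity)  # oldest entries drop when full

    def write(self, item):
        self._items.append(item)

    def read(self):
        return self._items.popleft() if self._items else None

# A producer (say, a sensor or NIC) bursts faster than the consumer
# drains; the buffer absorbs the burst.
buf = RingBuffer(capacity=4)
for sample in range(6):
    buf.write(sample)                   # samples 0 and 1 are overwritten
print([buf.read() for _ in range(4)])   # -> [2, 3, 4, 5]
```

A `deque` with `maxlen` gives drop-oldest semantics; a real driver-level buffer would more likely apply backpressure or signal overflow instead of silently discarding data.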

How To Deal With The Flood Of Analog Data

Analog data from a variety of sensors and other devices poses a huge challenge. Here are three approaches to overcoming the problems that big analog data can cause.
Approach 1: Analyze at the Edge
A lot of data can be collected at the point of capture, but most of it is uninteresting. You can save and analyze it all, or you can take advantage of intelligent embedded software that constantly meas... » read more
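The excerpt is cut off, but the edge-analysis idea it describes, discarding uninteresting samples at the point of capture, can be sketched briefly. This is an assumed illustration, not the article's embedded software; the sliding window and the 3-sigma threshold are choices made here for the example.

```python
from collections import deque
from statistics import mean, pstdev

def edge_filter(samples, window=50, threshold=3.0):
    """Keep only samples that deviate sharply from the recent average,
    discarding the 'uninteresting' bulk at the point of capture."""
    recent = deque(maxlen=window)
    kept = []
    for x in samples:
        if len(recent) == window:
            mu, sigma = mean(recent), pstdev(recent)
            if abs(x - mu) > threshold * sigma:  # sigma == 0 flags any change
                kept.append(x)                   # an excursion worth uploading
        recent.append(x)
    return kept

# Steady 1.0 V signal with one transient spike
signal = [1.0] * 100 + [5.0] + [1.0] * 100
print(edge_filter(signal))  # -> [5.0]
```

Recomputing the statistics over the window on every sample is O(window); production edge software would maintain running sums instead, but the filtering principle is the same.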

Full-Chip Power Integrity And Reliability Signoff

As designs increase in complexity to cater to the insatiable need for more compute power, driven by AI applications ranging from data centers to self-driving cars, designers are constantly faced with the challenge of meeting elusive power, performance, and area (PPA) targets. PPA over-design has repercussions, resulting in increased product cost as well as pote... » read more

Let’s Be Smart About Artificial Intelligence

Technology visionaries no less than Stephen Hawking and Elon Musk have called artificial intelligence (AI) the greatest threat facing the future of mankind. But unless we all wind up running for our lives from a “Terminator” killing machine, don’t the benefits of AI far outweigh the downsides? Looking past purely mathematical calculators from the abacus to Charles Babbage’s difference ... » read more

How Neural Networks Think (MIT)

Source: MIT’s Computer Science and Artificial Intelligence Laboratory; David Alvarez-Melis and Tommi S. Jaakkola. Technical paper link; MIT article.
General-purpose neural net training
Artificial-intelligence research has been transformed by machine-learning systems called neural networks, which learn how to perform tasks by analyzing huge volumes of training data, reminded MIT research... » read more
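As a concrete picture of what "learning by analyzing huge volumes of training data" means in practice, here is a minimal gradient-descent training loop for a tiny two-layer network in Python/NumPy. It is a generic textbook sketch, not the interpretability method from the MIT paper.

```python
import numpy as np

# Toy dataset: XOR, a classic task a linear model cannot learn.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # hidden layer
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.1
for step in range(5000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of squared error, by the chain rule
    grad_p = (p - y) * p * (1 - p)
    grad_W2 = h.T @ grad_p
    grad_h = grad_p @ W2.T * (1 - h**2)
    grad_W1 = X.T @ grad_h
    # Gradient-descent update
    W2 -= lr * grad_W2; b2 -= lr * grad_p.sum(0)
    W1 -= lr * grad_W1; b1 -= lr * grad_h.sum(0)

print(np.round(p.ravel(), 2))  # approaches [0, 1, 1, 0]
```

Real systems differ mainly in scale: millions of parameters, stochastic minibatches of the training data, and automatic differentiation in place of hand-written gradients.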

Targeting And Tailoring eFPGAs

Robert Blake, president and CEO of Achronix, sat down with Semiconductor Engineering to discuss what's changing in the embedded FPGA world, why new levels of customization are so important, and the difficulty of implementing embedded programmability. What follows are excerpts of that discussion. SE: There are numerous ways you can go about creating a chip these days, but many of the prot... » read more

Thinking Much Bigger

For the better part of the past decade, the focus has been on integrating an increasing number of smaller components on a piece of silicon. It's time to start thinking much bigger. While there is still plenty of work to be done building more powerful processors, or networks of connected processors on a chip or in a package, new opportunities are opening up in markets such as automotive, medic... » read more

The Week In Review: IoT

Finance
Santa Monica, Calif.-based Sixgill reports raising $27.9 million in its Series B round of private financing, led by DRW Venture Capital. Mobile Financial Partners participated in the round. The startup last year raised $6 million in its Series A funding, also led by DRW. The company offers the Sixgill Sense sensor data services platform, addressing applications in the Internet of Thing... » read more

Executive Insight: Aart de Geus

Aart de Geus, chairman and co-CEO of Synopsys, sat down with Semiconductor Engineering to discuss machine learning and big data, the race toward autonomous vehicles, systems vs. chips, software vs. hardware, and the future of EDA. What follows are excerpts of that conversation. SE: The whole tech world is buzzing over data and how it gets used in areas such as... » read more
