System Bits: Dec. 5


[Photo: Vivienne Sze, an associate professor of electrical engineering and computer science at MIT. Source: MIT] Building deep learning hardware A new course at MIT is bringing together both electrical engineering and computer science to educate students in the highly sought-after field of deep learning. Vivienne Sze, an assoc... » read more

The Week in Review: IoT


Products/Services At this week’s AWS re:Invent conference in Las Vegas, Nevada, Amazon Web Services introduced a number of products and services for the Internet of Things, machine learning, and other areas. These include Amazon FreeRTOS (an operating system for IoT microcontrollers), AWS IoT Device Defender (security management), AWS IoT 1-Click, AWS IoT Device Management, AWS IoT Analytics... » read more

One-On-One: Mike Muller


Arm CTO Mike Muller sat down with Semiconductor Engineering to discuss a wide range of technology and market shifts, including the impact of machine learning, where new market opportunities will show up and how the semiconductor industry will need to change to embrace them. What follows are excerpts of that conversation. SE: It's getting to the point where instead of just developing chips, w... » read more

Week In Review: Design


Acquisitions Marvell signed a definitive agreement to buy Cavium for roughly $6 billion. The deal is expected to close in mid-2018. The Cavium deal fits squarely on the cloud side and gives Marvell a much bigger reach into enterprise networking and infrastructure, as well as some developing markets. Siemens paid an undisclosed price to buy Solido Design Automation, which tracks variation i... » read more

Quantum Madness


The race is on to commercialize quantum computing for everything from autonomous vehicles to supercomputers for hire. IBM has been working on a 50-qubit computer. Intel and QuTech, its Dutch research partner, showed off a 17-qubit test chip last month. And Alphabet, Google's parent company, is developing a 20-qubit computer. These numbers sound paltry compared to the billions of transistors ... » read more

The Next Phase Of Machine Learning


Machine learning is all about doing complex calculations on huge volumes of data with increasing efficiency. With a growing stockpile of success stories, it has rapidly evolved from a rather obscure computer science concept into the go-to method for everything from facial recognition technology to autonomous cars. Machine learning can apply to every corporate fu... » read more

China’s Ambitious Automotive Plans


China has big plans for cars—and other related markets. After years of trailing behind Japanese, European and U.S.-based carmakers in automotive technology, reliability, status, and even market share within its own political borders, the country is making a concerted push into internally developed and manufactured assisted- and self-driving vehicles. The strategy plays out well for China o... » read more

Let’s Be Smart About Artificial Intelligence


Technology visionaries no less than Stephen Hawking and Elon Musk have called artificial intelligence (AI) the greatest threat facing the future of mankind. But unless we all wind up running for our lives from a “Terminator” killing machine, don’t the benefits of AI far outweigh the downsides? Looking past purely mathematical calculators from the abacus to Charles Babbage’s difference ... » read more

ADAS Design Shifts Toward Hardware


Autonomous driving will challenge system-level designers like never before with the simultaneous integration of three critical areas: supercomputing complexity, real-time embedded performance, and functional safety. To get there, developers will need to shift their focus from a software-centric approach toward custom hardware development to produce a system that meets the safety, cost, and powe... » read more

Move Data Or Process In Place?


Should data move to available processors or should processors be placed close to memory? That is a question the academic community has been looking at for decades. Moving data is one of the most expensive and power-consuming tasks, and is often the limiter to system performance. Within a chip, Moore's Law has enabled designers to physically move memory closer to processing, and that has rema... » read more
