The Week In Review: IoT

Memory: Kilopass Technology uncorked its new eNVM, which includes vertical layered thyristor DRAM technology. The key advantages, according to the company, are that it eliminates the need for DRAM refresh, can be manufactured using existing processes, and improves power and area efficiency. A full memory test chip is currently in the early stages of testing. A thyristor is basically a latch tech... » read more

Seeing The Future Of Vision

Vision systems have evolved from cameras that enable robots to “see” on a factory floor to a safety-critical element of the heterogeneous systems guiding autonomous vehicles, as well as other applications that call for parallel processing technology to quickly recognize objects, people, and the surrounding environment. Automotive electronics and mobile devices currently dominate embedded... » read more

IoT Has Always Been With Us

By most accounts, Kevin Ashton of the Auto-ID Center at the Massachusetts Institute of Technology coined the term “the Internet of Things” in 1999, referring to a system of ubiquitous sensors connecting the Internet with the physical world. We were well into the 21st century before the Internet of Things, as a marketing term or a short description of a certain technology, came to be wide... » read more

The Looming AI War

A recent spate of acquisitions and announcements in AI and machine learning is setting the stage for a colossal showdown across the tech industry. Among those vying for top spots are Samsung, Google, Apple, Microsoft and Amazon, each with a large enough revenue stream to support an M&A feeding frenzy and the sustained investments required to remain competitive. Consider the most recent a... » read more

Fear Of Machines

In the tech industry, the main concern over the past five decades has been what machines could not do. Now the big worry is what they can do. From the outset of the computer age, the biggest challenges were uptime, ease of use, reliability, and, as devices became more connected, the quality and reliability of that connection. As the next phase of machines begins, those problems have bee... » read more

The Week In Review: IoT

Security: The Industrial Internet Consortium this week unveiled the Industrial Internet Security Framework, a set of specifications for connected health-care devices and hospitals, intelligent transportation, smart electrical grids, smart factories, and other cyber-physical systems in the Internet of Things. AT&T, Fujitsu, Hitachi, Infineon Technologies, Intel, Microsoft, and Symantec are among... » read more

Grappling With Manufacturing Data

As complexity goes up with each new process node, so does the amount of data that is generated, from initial GDSII to photomasks, manufacturing, yield and post-silicon validation. But what happens to that data, and what gets shared, remain points of contention among companies across the semiconductor ecosystem. The problem is that to speed up the entire design-through-manufacturing process,... » read more

Executive Insight: Sundari Mitra

Sundari Mitra, co-founder and CEO of NetSpeed Systems, sat down with Semiconductor Engineering to discuss machine learning, shifting from a processor-centric to a memory-centric design, and what needs to change to make that all happen. What follows are excerpts of that conversation. SE: What is the biggest change you’re seeing? Mitra: We go through a cycl... » read more

The Week In Review: IoT

Analysis: After reading a blog post touting the Internet of Things for home security, Jon Hedren wrote this post detailing how IoT-based home systems can be easily compromised and could fail in multiple ways. “The IoT ‘dream’ as sold by the industry is pretty cool, but it’s still just a dream. For now, these devices remain generally shoddy, insecure, and easily breakable—and must be t... » read more

What’s Missing From Machine Learning

Machine learning is everywhere. It's being used to optimize complex chips, balance power and performance inside of data centers, program robots, and keep expensive electronics updated and operating. What's less obvious, though, is that there are no commercially available tools to validate, verify and debug these systems once machines evolve beyond the final specification. The expectation is th... » read more
