Manufacturing Bits: May 16


Musical learning chips

Imec has demonstrated a neuromorphic chip. The brain-inspired chip, based on OxRAM technology, is capable of self-learning and has been shown to compose music. Imec combined state-of-the-art hardware and software to design chips with the characteristics of a self-learning system. Imec’s goal is to design the process t... » read more

What’s Next In Neural Networking?


Faster chips, more affordable storage, and open libraries are giving neural networks new momentum, and companies are now figuring out how to optimize the technology across a variety of markets. The roots of neural networking stretch back to the late 1940s and Claude Shannon’s Information Theory, but until a few years ago the technology made relatively slow progress. The rush towar... » read more

Conflicting Goals In Data Centers


Two conflicting goals are emerging inside of data centers—speed at any cost, and the ability to extend hardware well beyond its expected lifetime to amortize that cost. Layered across both of those are concerns about how to move data back and forth more efficiently, how to secure it, and how to best integrate different generations of technology. But these widely different goals have create... » read more

AI Storm Brewing


AI is coming. Now what? The answer isn't clear, because after decades of research and development, AI is finally starting to become a force to reckon with. The proof is in the M&A activity underway right now. Big companies are willing to pay huge sums to get out in front of this shift. Here is a list of just some of the AI acquisitions announced or completed over the past few years: ... » read more

Software Modeling Goes Mainstream


Software modeling is finally beginning to catch on across a wide swath of chipmakers as they look beyond device scaling to improve performance, lower power, and ratchet up security. Software modeling in the semiconductor industry historically has been associated with hardware-software co-design, which has grown in fits and starts since the late 1990s. The largest chipmakers and systems compa... » read more

Bidding War On H-1B Visas?


Good help is hard to find. It's about to get harder—and more expensive. The U.S. tech industry's solution until now has been to leverage expertise from around the world, drawing top graduates and entry-level professionals under the H-1B visa program. Last year, 85,000 H-1B visas were issued, 20,000 of which are reserved for holders of a U.S. master's degree or higher. There are some exce... » read more

The Power And Limits Of Money


Wally Rhines, CEO of Mentor Graphics, sat down with Semiconductor Engineering to discuss how semiconductor engineering teams make their dollars work even when budgets are limited. The issue is as important as ever, given the industry's unrelenting margin and cost pressure and the growing competition for top talent. What follows are... » read more

Looking Back On IoT In 2016


The Internet of Things was going great guns for most of 2016. Until October 21, that is. That’s the date of the coordinated cyberattacks on Dyn, an Internet performance management services firm. The distributed denial-of-service attacks quickly affected Airbnb, Amazon, Facebook, Netflix, PayPal, Reddit, Twitter, and other popular websites. Dyn was able to fight off the aggressive att... » read more

Watch Out: Reality Is Set To Explode


Walk into any store right now (December 2016), and you can probably find a VR headset for $20. These will be a popular gift this holiday season, but they may ultimately hurt the market if consumers have bad experiences with bargain-basement VR headsets. There is considerable confusion about the virtual reality, augmented reality, and mixed reality industries. Sem... » read more

Fear Of Machines


In the tech industry, the main concern over the past five decades has been about what machines could not do. Now the big worry is what they can do. From the outset of the computer age, the biggest challenges were uptime, ease of use, reliability, and as devices became more connected, the quality and reliability of that connection. As the next phase of machines begins, those problems have bee... » read more
