Architecting Faster Computers


To create faster computers, the industry must take a major step back and re-examine choices that were made half a century ago. One of the most likely approaches involves dropping demands for determinism, and this is being attempted in several different forms. Since the establishment of the von Neumann architecture for computers, small, incremental improvements have been made to architectures...
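As a hedged aside (my illustration, not drawn from the article): one loose reading of "dropping demands for determinism" is trading a guaranteed exact answer for a faster probabilistic one. The short Python sketch below estimates the mean of a large dataset by sampling a random subset instead of scanning every element; the function names, sample size and dataset are invented for the example.

    # Illustrative only: swaps an exact, deterministic scan for a faster,
    # probabilistic estimate -- one loose interpretation of relaxed determinism.
    import random

    def exact_mean(values):
        """Deterministic: touches every element, always the same answer."""
        return sum(values) / len(values)

    def approx_mean(values, sample_size=1000, seed=None):
        """Non-deterministic: samples a subset; the answer varies run to run
        but is usually close, at a fraction of the work."""
        rng = random.Random(seed)
        sample = rng.sample(values, min(sample_size, len(values)))
        return sum(sample) / len(sample)

    if __name__ == "__main__":
        data = [i % 97 for i in range(1_000_000)]
        print("exact :", exact_mean(data))
        print("approx:", approx_mean(data))  # close, but not identical on every run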

It’s All About The Data


The entire tech industry has changed in several fundamental ways over the past year due to the massive growth in data. Individually, those changes are significant. Taken together, they will have a profound impact on the chip industry for the foreseeable future. The most obvious shift is the infusion of AI (and its subcategories, machine learning and deep learning) into different markets. ...

Performance To The People


Ever since the IoT became a household term, the almost universal assumption was that extremely low-power, simplistic devices would rule the edge. They would collect data, send it to the cloud, and the cloud would send back useful information. That's a great marketing concept for gateways and cloud services, but it's not scalable. Consumers don't just want to know when their heartbeat is irregular...
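To make the edge-versus-cloud trade-off concrete, here is a small hypothetical sketch (mine, not the article's): instead of streaming every raw heart-rate sample to the cloud, the device does the first pass of analysis locally and transmits only a compact summary. The detection rule, threshold and payload format are all invented for illustration.

    # Hypothetical edge-device sketch: analyze heart-rate samples locally and
    # send a small summary upstream instead of the full raw stream.
    import json
    import statistics

    def detect_irregular(samples_bpm, max_jump=25):
        """Flag beat-to-beat jumps larger than max_jump bpm (illustrative rule only)."""
        return any(abs(b - a) > max_jump for a, b in zip(samples_bpm, samples_bpm[1:]))

    def build_uplink_payload(samples_bpm):
        """Return a compact JSON summary for the cloud."""
        return json.dumps({
            "alert": detect_irregular(samples_bpm),
            "mean_bpm": round(statistics.mean(samples_bpm), 1),
            "n_samples": len(samples_bpm),
        })

    if __name__ == "__main__":
        window = [72, 74, 73, 71, 118, 75, 74]  # synthetic one-minute window
        print(build_uplink_payload(window))     # a few dozen bytes, not the full trace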

The Return Of Time Sharing


As early as the 1960s, it wasn't uncommon to hear predictions that transistors would one day be essentially free. Those were pretty bold statements at the time, considering that most computers cost around $1 million, required special rooms, and budding computer scientists usually had to sign up for one-hour time slots on mainframes—often in the middle of the night or on weekends. Still, those predictions ...

The Multiplier And The Singularity


In 1993, Vernor Vinge, a computer scientist and science fiction writer, described an event he called the Singularity—the point when machine intelligence matches and then surpasses human intelligence. Since then, top scientists, engineers and futurists have been asking just how far away we are from that event. In 2005, Ray Kurzweil published a book, "The Singularity Is Near," in which...

The Limits Of Parallelism


Parallelism used to be the domain of supercomputers working on weather simulations or plutonium decay. It is now part of the architecture of most SoCs. But just how efficient, effective and widespread has parallelism really become? There is no simple answer to that question. Even for a dual-core implementation of a processor on a chip, results can vary greatly by software application, operat...
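One classical lens for the question of how much parallelism really buys is Amdahl's Law, which caps the speedup by the serial fraction of the workload. The sketch below is my own illustration, not the article's analysis: it computes the theoretical best-case speedup for a given parallel fraction and core count, and shows why even a dual-core part can land anywhere between roughly 1x and 2x.

    # Amdahl's Law: speedup = 1 / ((1 - p) + p / n),
    # where p is the parallelizable fraction of the work and n is the core count.

    def amdahl_speedup(parallel_fraction, cores):
        """Theoretical best-case speedup; ignores synchronization and memory effects."""
        serial = 1.0 - parallel_fraction
        return 1.0 / (serial + parallel_fraction / cores)

    if __name__ == "__main__":
        for p in (0.5, 0.9, 0.99):
            print(f"p={p}: 2 cores -> {amdahl_speedup(p, 2):.2f}x, "
                  f"8 cores -> {amdahl_speedup(p, 8):.2f}x")

Even before cache contention, the operating system scheduler and memory bandwidth enter the picture, the serial fraction alone limits the gain, which is one reason measured results vary so widely.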

Reading About Quantum Computing


For the last several months, I’ve been working on a series of articles about quantum computing: how quantum computers are different from conventional computers, what materials systems might be appropriate for use in qubits, and, for the final article in the series, how one might actually build and program a quantum computer. Some of the subtopics are familiar ground for me, and probably for most...
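As a tiny, hedged illustration of the first subtopic (a qubit versus a classical bit; my own sketch, not part of the article series): the Python snippet below represents a single qubit as a two-component complex amplitude vector, applies a Hadamard gate to put it into an equal superposition, and reads out the measurement probabilities via the Born rule.

    # Minimal single-qubit sketch: state vector, Hadamard gate, measurement probabilities.
    import numpy as np

    ket0 = np.array([1.0, 0.0], dtype=complex)                   # the |0> basis state
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

    state = H @ ket0               # equal superposition of |0> and |1>
    probs = np.abs(state) ** 2     # Born rule: probability = |amplitude|^2

    print("amplitudes:", state)    # [0.707+0j, 0.707+0j]
    print("P(0), P(1):", probs)    # [0.5, 0.5] -- the value is defined only at readout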