Looking Beyond Technology

As the industry pushes into AI, the focus shifts to what can be done with computing. That opens up a whole new set of problems.


The semiconductor industry is beginning to make real progress in deep learning and artificial intelligence, opening up bigger opportunities across more markets than have ever existed in the history of technology. But before this revolution goes much further, the industry also needs to step back and establish a set of guidelines about how this technology will be used.

This is an entirely different kind of roadmap from the one that has driven chipmakers for the past five decades. Since the introduction of the first integrated circuit, the focus has been to shrink everything and make it run faster. The chip industry has achieved that many times over. As a point of comparison, a 1975 Cray-1 supercomputer ran at 80 million floating-point operations per second (80 megaFLOPS).

Apple’s A9X chip, released last fall, runs at 345.6 gigaFLOPS. It also uses far less power—it runs on a rechargeable battery—and fits in your pocket instead of requiring a separate room with a raised floor. And it costs about $3.5 million less than the water-cooled Cray. Yet in terms of raw speed, the A9X is a snail compared to China’s new Sunway TaihuLight supercomputer, which runs at 93 petaFLOPS.
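A quick back-of-the-envelope comparison puts those figures side by side (treating each number as a peak rate, as cited above; sustained performance on real workloads varies):

```python
# Peak throughput figures cited in this article.
cray_1 = 80e6          # Cray-1: 80 megaFLOPS
a9x = 345.6e9          # Apple A9X: 345.6 gigaFLOPS
taihulight = 93e15     # Sunway TaihuLight: 93 petaFLOPS

print(f"A9X vs. Cray-1:        {a9x / cray_1:,.0f}x")       # 4,320x
print(f"TaihuLight vs. A9X:    {taihulight / a9x:,.0f}x")   # ~269,097x
print(f"TaihuLight vs. Cray-1: {taihulight / cray_1:.1e}x") # ~1.2e9x
```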

All of this computing has occurred in a box, where performance and power can be measured. There are bragging rights that go along with this, and perhaps business deals, for whoever can develop the world’s fastest chips and systems, or the ones that use the least power. But this is increasingly beside the point. The technology is now good enough, from the perspective of power, performance, memory, and connectivity, to look beyond the technology itself. It’s no longer about the machine. It’s about what can be done with it as that compute power becomes even more distributed.

In the early 1990s, Digital Equipment Corp. had an artificial intelligence unit, as did all of the big computer companies of the time. But the conclusion they reached was that they had gone as far as they could with the technology available at that time. Engineers working in the group said they would have to wait for advances over the next decade to be able to continue their work.

It took more than a decade. In fact, it took a full quarter century before many of the concepts they were trying to develop began resurfacing in a big way. Machine learning, deep learning, and artificial intelligence are all slices of the same pie, and all of them are ill-defined at the moment. As a proof point, ask any two engineers what they mean by deep learning and you’ll almost certainly get different answers. There are enough crossovers and interactions to make the field look more like an amoeba than a neat Venn diagram. But the basic concept is the same—add enough intelligence into computing to recognize patterns and predict behavior based on different scenarios and probabilities.
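That core idea—recognize a pattern, then predict from it—does not require exotic machinery to illustrate. Here is a minimal sketch using a toy nearest-neighbor vote; the sensor readings, labels, and thresholds are all invented for illustration:

```python
from collections import Counter
import math

def predict(samples, labels, query, k=3):
    """Toy k-nearest-neighbor: 'recognize patterns' via distance,
    'predict behavior' via a majority vote among the closest samples."""
    dists = sorted((math.dist(s, query), lbl) for s, lbl in zip(samples, labels))
    votes = Counter(lbl for _, lbl in dists[:k])
    label, count = votes.most_common(1)[0]
    return label, count / k  # predicted label plus a crude probability

# Invented sensor readings: (temperature, vibration) -> machine state
samples = [(20, 0.10), (21, 0.20), (25, 0.15), (80, 0.90), (82, 0.80)]
labels  = ["ok", "ok", "ok", "fail", "fail"]
print(predict(samples, labels, query=(78, 0.85)))  # ('fail', ~0.67)
```

Real deep learning systems replace the hand-picked distance measure with layers of learned features, but the pattern-in, probability-out shape of the problem is the same.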

There is still plenty of work required to turn this into an everyday, every-device kind of technology, in which a person can issue commands in a regular conversational tone and be assured those commands are accurately relayed to a system that interprets what they are saying, regardless of background noise or regional accents. The tech industry is making significant progress in that area. But as this work continues, the industry needs to start thinking well beyond the hardware and software to what else can be done with them, and, perhaps even more important, what rules will be needed once these devices begin interacting.
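As a rough illustration of that interpretation step, the sketch below maps a noisy transcript to a known command using fuzzy string matching. The command vocabulary and the similarity cutoff are assumptions for this example; a production system would use trained acoustic and language models rather than string similarity, but the reject-when-uncertain behavior is the point:

```python
import difflib

# Hypothetical command vocabulary; the speech-to-text step is assumed
# to have already produced a (possibly garbled) transcript.
INTENTS = {
    "turn on the lights": "lights.on",
    "turn off the lights": "lights.off",
    "set temperature to seventy": "thermostat.set:70",
}

def interpret(transcript, cutoff=0.6):
    """Map a noisy transcript to the closest known command, or reject it.
    Returning None below the cutoff is safer than guessing wrong."""
    match = difflib.get_close_matches(transcript.lower(), INTENTS, n=1, cutoff=cutoff)
    return INTENTS[match[0]] if match else None

print(interpret("turn on teh lights"))      # 'lights.on'
print(interpret("open the pod bay doors"))  # None -> ask the user to repeat
```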

“There are no laws about what things can do to other things,” said Lucio Lanza, managing director of Lanza techVentures. “There is no set of rules. We have a lot of work to make a society of things consistent, and to define what are protocols of interaction. You don’t want a device to talk to a pacemaker. But who will define the layers? What is the ‘Bluebook’ for IoT and who’s going to write it?”
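At its simplest, such a “Bluebook” might boil down to a default-deny policy layer that every device-to-device interaction must pass through. The device classes and rules below are invented purely to illustrate the idea:

```python
# A hypothetical policy layer for a "society of things": before any
# device-to-device message is delivered, check whether the interaction
# between those two classes of device is explicitly permitted.
ALLOWED = {
    ("thermostat", "hvac_controller"),
    ("phone", "smart_lock"),
    ("hospital_gateway", "pacemaker"),  # only a vetted gateway, not any device
}

def may_interact(sender_class: str, receiver_class: str) -> bool:
    """Default-deny: an interaction is legal only if explicitly allowed."""
    return (sender_class, receiver_class) in ALLOWED

print(may_interact("phone", "smart_lock"))    # True
print(may_interact("smart_tv", "pacemaker"))  # False -> blocked by policy
```

Who defines and maintains that allowlist—and at which layer of the stack it lives—is exactly the open question Lanza is raising.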
