Rethinking Computing For The AI Age


Cisco estimates that global cloud IP traffic will nearly quadruple in the next five years. Information consumption is exploding, with artificial intelligence (AI) embedded into all of the devices and experiences surrounding us. However, we do not want that to come at the cost of our security and privacy. Talk about pressure. On you. Today, much of computing is done in the cloud for things that you are... » read more
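
For context on what that forecast implies, here is a back-of-the-envelope sketch of the compound annual growth rate, assuming "nearly quadruple" means roughly a 4x increase over the five-year window:

```python
# Rough check: what yearly growth rate turns 1x of cloud IP traffic
# into ~4x over five years? (Assumes "nearly quadruple" ~= a 4x multiple.)
growth_multiple = 4.0   # assumed five-year multiple
years = 5

cagr = growth_multiple ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")
# -> about 32% per year
```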

The Week In Review: IoT


Services: AT&T and IBM are expanding their joint Internet of Things effort to offer AT&T’s new IoT analytics capability, helping customers derive insights from their industrial IoT data. The capability brings together AT&T’s M2X, Flow Designer, Control Center, and other IoT offerings; IBM Watson IoT; the IBM Watson Data Platform; and the IBM Machine Learning Service, part of Watson Data Platform on... » read more

Ubiquitous AI


We have witnessed an amazing expansion of compute power over the past four years. Go inside the numbers of the recent 100 billion ARM-based chips milestone and you will see that 50 billion were shipped by our partners from 2013 to 2017, which demonstrates the industry’s insatiable demand for more compute. Even more extraordinary is that we expect our partners to ship the next 100 billion ARM-... » read more
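
To put those shipment numbers in perspective, here is a quick rate calculation (a sketch that treats 2013 to 2017 as a four-year span and takes the 50 billion figure at face value):

```python
# Rough shipping rate implied by the excerpt: 50 billion Arm-based chips
# shipped by partners between 2013 and 2017 (treated here as four years).
chips_shipped = 50e9
years = 4
seconds_per_year = 365.25 * 24 * 3600

per_year = chips_shipped / years
per_second = per_year / seconds_per_year
print(f"~{per_year / 1e9:.1f} billion chips per year, "
      f"~{per_second:.0f} chips every second")
# -> ~12.5 billion chips per year, roughly 400 chips every second
```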

Even good bots fight: The case of Wikipedia (Oxford & Alan Turing Institute)


Source: University of Oxford, published in PLOS ONE. Authors: Milena Tsvetkova, Ruth García-Gavilanes, Luciano Floridi, Taha Yasseri. "The research paper, published in PLOS ONE, concludes that bots are more like humans than you might expect, as they appear to behave differently in culturally distinct online environments. The paper says the findings are a warning to those using artificial intelligence ... » read more

Pattern Classification in a Mixed-Signal Circuit Based on Embedded 180-nm Floating-Gate Memory Cell Arrays


Source: Cornell University Library. F. Merrikh Bayat, X. Guo, M. Klachko, M. Prezioso, K. K. Likharev, D. B. Strukov. Submitted on 6 Oct 2016 (v1), last revised 10 Oct 2016 (this version, v2). "We have designed, fabricated, and successfully tested a prototype mixed-signal, 28x28-binary-input, 10-output, 3-layer neuromorphic network ("MLP perceptron"). It is based on embedded nonvolatile flo... » read more
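
For readers who want a functional picture of the topology quoted above, the following is a minimal software sketch of a 3-layer perceptron with 28x28 binary inputs and 10 outputs. The hidden-layer width (64) and the tanh activation are illustrative assumptions, and the random weights are placeholders; on the actual chip the vector-by-matrix multiplications are performed in analog, with the weights stored as states of floating-gate memory cells.

```python
import numpy as np

rng = np.random.default_rng(0)

# Topology from the abstract: 28x28 binary inputs, 10 outputs, 3 layers
# (input -> hidden -> output). Hidden width and tanh are illustrative
# assumptions; the weights here are random placeholders, whereas the chip
# stores trained weights in floating-gate cells and performs the
# multiply-accumulate operations in the analog domain.
N_IN, N_HID, N_OUT = 28 * 28, 64, 10

W1 = rng.standard_normal((N_HID, N_IN)) * 0.05
W2 = rng.standard_normal((N_OUT, N_HID)) * 0.05

def forward(x_binary):
    """One forward pass for a flattened 28x28 binary image (784 values)."""
    hidden = np.tanh(W1 @ x_binary)   # hidden-layer activations
    return W2 @ hidden                # 10 output scores; argmax = predicted class

x = rng.integers(0, 2, size=N_IN).astype(float)  # a random binary test input
print("predicted class:", int(np.argmax(forward(x))))
```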

What Does An AI Chip Look Like?


Depending upon your point of reference, artificial intelligence will be the next big thing or it will play a major role in all of the next big things. This explains the frenzy of activity in this sector over the past 18 months. Big companies are paying billions of dollars to acquire startup companies, and even more for R&D. In addition, governments around the globe are pouring additional... » read more

What Does AI Really Mean?


Seth Neiman, chairman of eSilicon, founder of Brocade Communications, and a board member and investor in a number of startups, sat down with Semiconductor Engineering to talk about advances in AI, what's changing, and how it ultimately could change our lives. What follows are excerpts of that conversation. SE: How far has AI progressed? Neiman: We’ve been working with AI since the mid 1... » read more

Happy 25th Birthday, HAL!


“Good afternoon, gentlemen. I am a HAL 9000 computer. I became operational at the H. A. L. plant in Urbana, Illinois on the 12th of January, 1992.”—Stanley Kubrick and Arthur C. Clarke, 2001: A Space Odyssey (1968). Nearly a half-century ago, Arthur C. Clarke and Stanley Kubrick introduced us to cinema’s most compelling example of artificial intelligence: the HAL 9000, a heuristicall... » read more

The Multiplier And The Singularity


In 1993, Vernor Vinge, a computer scientist and science fiction writer, first described an event called the Singularity—the point when machine intelligence matches and then surpasses human intelligence. And since then, top scientists, engineers and futurists have been asking just how far away we are from that event. In 2005, Ray Kurzweil published a book, "The Singularity Is Near," in whic... » read more

2017: Manufacturing And Markets


While the industry is busy chatting about the end of Moore's Law and a maturing of the semiconductor industry, the top minds of many companies are having none of it. A slowdown in one area is just an opportunity in another, and that is reflected in the predictions for this year. As in previous years, Semiconductor Engineering will look back on these predictions at the end of the year to see ... » read more
