Systems & Design
SPONSOR BLOG

The Hunt For The Next Application To Drive System-Level Design And Verification

Defining a system is getting harder because it incorporates far more than just the SoCs in the cloud servers, a smart phone or wearable sensors. Managing that complexity may be even more difficult.


In recent years, most of my customer presentations highlighted some type of mobile device – system and system on chip (SoC) — to explain the challenges for system-level design and verification. But I also like to look into other application domains to understand how challenges may develop over time and to identify similarities and differences in challenges between application domains.

Consider the current landscape. Last week SemiWiki reported from the IP-SoC Conference that in the design IP space, only the smart phone and media tablet application domains are expected to offer design IP royalty growth of more than 20% through 2017. That outlook, which came from Gartner, noted that digital set-top boxes, microcontrollers, smart cards and automotive electronics would grow 10% to 20%.

At the recent Gartner Semiconductor Briefing, analyst Bryan Lewis attributed 67% of the expected dollar-growth contribution to the 2013 semiconductor market to smart phones, 28% to SSDs and 17% to media tablets (PCs and other segments are expected to contribute -6% and -36%, respectively). The market overall is expected to grow by 5.6% to $315B for 2013.

Next design opportunities?

Wow! So what is next beyond smart phones, tablets and SSDs? For answers, I decided to attend this week’s ICCAD to brush up on other application domains. I was especially drawn to the keynote address “The Future of Computing through Brain-inspired Architectures,” given by Thomas Sterling of the Center for Research in Extreme Scale Technologies.

Just before the keynote, I scanned Horace Dediu’s post at Asymco called “Seeing What’s Next.” He looked at the adoption of different consumer technologies, from radio to washing machines, microwaves, VCRs, cell phones, smart phones and tablets. Smart phone penetration in the U.S. is on track to reach 90% of its available audience by August 2016, a mere eight years after smart phones reached 10% penetration. The graph in Horace’s post reminded me of my previous blog called “Putting Kurzweil’s Singularity To The Mobile Test.” From that graphical representation it certainly looks like technology invention and adoption are accelerating. However, pointing to the open space on the right, Horace also stated that “there is a lack of visibility or certainty among observers that anything worthy of inclusion on this graph will ever emerge.”
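Horace’s 10%-to-90% framing has a convenient closed form if adoption is modeled as a logistic S-curve. The sketch below is my own illustration of that arithmetic, not Asymco’s actual model; the midpoint year is an arbitrary assumption used only to make the example concrete.

```python
import math

def penetration(t, t_mid, k):
    """Logistic adoption curve: fraction of the available audience at time t (years)."""
    return 1.0 / (1.0 + math.exp(-k * (t - t_mid)))

# For a logistic curve, the time from 10% to 90% penetration is
# 2*ln(9)/k, independent of the midpoint. Choosing k so that the
# 10%-to-90% ramp takes 8 years (as reported for U.S. smart phones):
ramp_years = 8.0
k = 2 * math.log(9) / ramp_years

t_mid = 2012.0                   # illustrative midpoint (50% penetration)
t10 = t_mid - math.log(9) / k    # year the curve crosses 10%
t90 = t_mid + math.log(9) / k    # year the curve crosses 90%
print(round(t90 - t10, 1))       # -> 8.0
```

The useful property is that the steepness parameter k fully determines the ramp time, so an observed 10%-to-90% interval pins down the whole curve up to a time shift.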

Well, the ICCAD keynote would show whether brain-inspired computing would be among them and, of course, what its design requirements would be. Sterling’s keynote offered interesting insights, starting with the supercomputer Tianhe-2. The Chinese-designed system (33.86 petaflops) is half way to exascale performance, and Sterling offered an interesting analysis of how power is actually limiting pure performance growth and how it may ring in the end of Moore’s Law at some point. There were interesting side comments on how Von Neumann did not actually invent the architecture carrying his name, but rather a different one, followed by a review of the changes in computing execution models from Von Neumann to the Vector Model in 1975, the SIMD-array Model in 1983 and the CSP Model in 1991.

To speculate about what is next, Sterling then switched to brain-inspired technologies, representing the basic brain properties of consciousness, thinking and extreme complexity. Starting with historic brain-inspired rudiments like chess computers – including the famous “Turk,” which concealed a very human “computer” under the game board – we were reminded of expert systems (including the hype associated with them, referencing a New York Times article), artificial neural networks, and two brain-inspired projects: CRIS (Cognitive Real-time Interactive Systems) and CCA (Continuum Computer Architecture).

Separating wheat from chaff

While there is obviously lots of system design happening at that scale of engineering, and lots of silicon will be consumed, the keynote left me somewhat unsettled because there was no clear “killer app,” so to speak. However, one of the graphs Sterling referred to was the famous Gartner “Hype Cycle.” The graph below is from the free webinar “Emerging Technologies Hype Cycle for 2013” on the Gartner website.

[Figure: Gartner Emerging Technologies Hype Cycle, 2013]

The hype cycle is a Gartner-branded graphical tool with five phases: an “Innovation Trigger” leads to a “Peak of Inflated Expectations,” which is followed by a “Trough of Disillusionment,” and finally a “Plateau of Productivity” is reached after a long “Slope of Enlightenment.”

Looking at the July 2013 version of the hype cycle, not surprisingly some of the key topics on everybody’s mind – the Internet of Things, Big Data, 3D Printing and Wearable User Interfaces – are pretty much at the peak of inflated expectations. Virtual Reality is currently at the bottom of the trough of disillusionment, while Cloud Computing, for example, is expected to reach the plateau of productivity in two to five years.

So how do smart phones, cloud storage and Tianhe-2-style computing go together? Well, it seems that in day-to-day life, system-level design spans far beyond the individual components. For example, the individual components enabling some of the Internet of Things applications I personally use have to be considered in conjunction. My Jawbone Up wristband tracks my steps and sleep patterns and transfers that information to my smart phone, which acts as a hub and uses the Internet to transfer my data for storage in the cloud. That enables Jawbone’s Big Data computing to tell me that yesterday’s sleep pattern puts me in the top 20% of deep-cycle sleepers.
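The sensor-to-hub-to-cloud path described above can be sketched as three cooperating layers. All names below (SensorReading, PhoneHub, CloudStore) are hypothetical and illustrate only the layering of the data path, not Jawbone’s actual software stack.

```python
from dataclasses import dataclass, field

@dataclass
class SensorReading:
    kind: str      # e.g. "steps" or "deep_sleep_minutes"
    value: float

@dataclass
class PhoneHub:
    """Smart phone acting as hub: buffers wearable readings, then syncs to the cloud."""
    buffer: list = field(default_factory=list)

    def receive(self, reading: SensorReading):
        self.buffer.append(reading)

    def sync(self, cloud):
        cloud.ingest(self.buffer)   # hand the batch to the cloud tier
        self.buffer = []            # hub keeps nothing after a sync

@dataclass
class CloudStore:
    """Cloud tier: stores raw readings and runs simple aggregate analytics."""
    readings: list = field(default_factory=list)

    def ingest(self, batch):
        self.readings.extend(batch)

    def total(self, kind: str) -> float:
        return sum(r.value for r in self.readings if r.kind == kind)

# Wearable produces readings; the hub buffers and forwards them.
wristband = [SensorReading("steps", 4200.0),
             SensorReading("steps", 3100.0),
             SensorReading("deep_sleep_minutes", 95.0)]
hub, cloud = PhoneHub(), CloudStore()
for r in wristband:
    hub.receive(r)
hub.sync(cloud)
print(cloud.total("steps"))  # -> 7300.0
```

The point of the sketch is the system-level one made in the text: no single tier is “the product” – the sensor, the hub and the cloud analytics only deliver value in conjunction.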

It turns out, the “system” and its design here has become much bigger than just the SoCs in the cloud servers, my smart phone and the sensor I am wearing. Managing that complexity in itself may be one of the next challenges in system-level design.


