Deciphering what’s real and what’s hype will help determine where to focus your resources.
By Frank Ferro
CES draws a lot of attention. Everyone wants to be first to see the latest and greatest consumer products. If you don’t mind squeezing through the crowd, you can glimpse the startling picture quality of an OLED TV. Never mind viewing the quality of a 4K Ultra HDTV, at CES you can skip a generation and see what an 85” 8K UHDTV looks like. Talk about resolution! You also can explore a working smart home connected by a host of products enabling the “Internet-of-Things,” see products that can sling video from your phone to other screens, and then see robots clean windows. You can even use your brain waves to control toy helicopters and kitty ears. And the list goes on.
This is all fun, but CES is also a place where you can collect valuable data points on markets, products and companies. Careful observation will help answer the questions below.
I like to call all the hype, discussion and speculation around these questions the 'CES Effect.'
What is real? One of the hits of this year's show was the 4K UHDTVs. There is no question that these TVs are going to find their way into consumers' homes; the only question is when. I remember when HDTVs first appeared at CES in the late '90s at a cost of about $10K. I knew it would be a long time before one showed up in my home. Ten years later, in 2009, I purchased my first HD set for about $600. Cost was not the only factor limiting widespread HD adoption; the lack of available content and infrastructure also held it back.
A very similar discussion is now taking place with regard to UHDTV: Where is the content? Can the infrastructure handle the higher resolution? Higher frame rates are needed to view sporting events, HDMI 2.0 is required, and so on. Given all this, plus the price tag, it will be a few more years before consumers adopt UHDTVs. Technologies like H.265 will certainly help deployment, providing similar or better quality with about a 50% reduction in file size. When my current HD set is on its last legs (hopefully five to six years from now), I will probably have no choice but to purchase a 4K set, because 4K will eventually overtake existing HD technology.
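To put that 50% figure in perspective, here is a back-of-the-envelope calculation. The bit rates below are illustrative assumptions, not measured figures for any particular codec or service:

```python
# Rough file-size comparison for a 2-hour 4K movie.
# Bit rates are illustrative assumptions, not measured figures.

def file_size_gb(bitrate_mbps: float, hours: float) -> float:
    """Size in gigabytes of a stream at the given bit rate."""
    seconds = hours * 3600
    bits = bitrate_mbps * 1_000_000 * seconds
    return bits / 8 / 1e9  # bits -> bytes -> gigabytes

h264_4k_mbps = 32.0              # assumed H.264 bit rate for 4K
h265_4k_mbps = h264_4k_mbps / 2  # the ~50% reduction claimed for H.265

print(f"H.264: {file_size_gb(h264_4k_mbps, 2):.1f} GB")  # 28.8 GB
print(f"H.265: {file_size_gb(h265_4k_mbps, 2):.1f} GB")  # 14.4 GB
```

Halving the bit rate halves both the file size and the bandwidth the delivery infrastructure must carry, which is why the codec matters so much for UHDTV rollout.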
What's not real, on the other hand, is 3D TV. Yes, 3D TVs have been at CES for a few years now, and maybe it is just me, but the user experience seems to be getting worse, not better. Not to 'toot my own horn,' but about a year ago I predicted that we were not ready for 3D because there is no practical consumer use case. Even for movies, my wife and I will not pay extra to see the 3D version, preferring the 2D instead. 3D will remain a novelty for games or special applications, but it will not see the widespread adoption that was expected. Actually, if you want a real '3D' experience, go view the 8K UHDTVs. The depth and clarity of the picture give the impression of three dimensions. Unfortunately, I will have to wait even longer to get one of these. Gesture recognition is another technology that was hyped a year ago but was basically absent this year, for reasons similar to 3D: the lack of a scalable use model for the consumer (also discussed in a Dec. 2011 SLD blog).
Just when CES was starting to feel like a "mobile show," this year the clock was turned back to a more traditional mix of consumer electronics, with only a handful of smart phone announcements. Perhaps companies are holding their announcements for Mobile World Congress in February. Even so, it is clearly a sign that the smart phone market is maturing and there is less jockeying for position.
Providing an interesting dichotomy to the show were a number of processor announcements from Intel, Nvidia, Qualcomm, Samsung and ST-Ericsson: a dichotomy because you can see iPhone cases next to semiconductor booths. At a consumer show, do buyers from big-box stores care about eight CPU cores or 72 GPU cores? Maybe the PC market has trained the consumer to just know that a dual-core processor is better than a single-core one, and a quad-core is better than a dual-core.
In any case, semiconductor companies are 'leaning forward' with very aggressive designs that cover a range of markets. The Tegra 4 from Nvidia, for example, with four ARM Cortex-A15 CPU cores and 72 GPU cores, is targeting the gaming and tablet markets with enough power to support 4K (UHDTV) output. Similarly, the Snapdragon 800 from Qualcomm will support higher-end gaming, augmented reality and 4K content. The Samsung Exynos 5 Octa uses ARM's big.LITTLE architecture, with four Cortex-A15 cores (big) and four Cortex-A7 cores (LITTLE), to save significant power over the previous quad-core version. Intel, on the other hand, is targeting value smart phones with its Lexington platform and is giving a 'heads-up' on Clover Trail+ along with a new 22nm Atom-based design.
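The intuition behind big.LITTLE can be seen with a toy energy model: keep light work on the low-power cores and migrate only demanding bursts to the big cores. The per-core power numbers and threshold below are invented for illustration and are not Cortex-A15/A7 specifications:

```python
# Toy model of big.LITTLE task placement. Power figures are
# invented for illustration -- not real Cortex-A15/A7 numbers.

BIG_POWER_W = 1.5     # assumed power draw of a "big" core
LITTLE_POWER_W = 0.4  # assumed power draw of a "LITTLE" core

def energy_joules(tasks, big_threshold=0.5):
    """Energy to run tasks, each given as a (load, seconds) pair.

    Tasks with load above the threshold run on a big core;
    lighter tasks stay on a LITTLE core.
    """
    total = 0.0
    for load, seconds in tasks:
        power = BIG_POWER_W if load > big_threshold else LITTLE_POWER_W
        total += power * seconds
    return total

# A typical phone workload: mostly light activity, one heavy burst.
workload = [(0.1, 60), (0.9, 5), (0.2, 30)]
big_little = energy_joules(workload)
big_only = sum(BIG_POWER_W * seconds for _, seconds in workload)
print(f"big.LITTLE: {big_little:.1f} J vs big-only: {big_only:.1f} J")
```

Under these assumed numbers, sending only the heavy burst to the big cores cuts energy to roughly a third of running everything on big cores, which is the argument for the heterogeneous design.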
If I can boil all this data down, the 'CES Effect' on the SoC world is the need for more performance, higher complexity and longer usage per charge (lower power). This should not be a big surprise to anyone tracking the SoC market. The consumer's demand for all these high-tech gadgets is unrelenting, and the pace of SoC development is not letting up anytime soon. I could also add lower SoC cost (both development and product cost) and better execution (faster time-to-market) to the list. To keep up this pace, contributions are needed from all parts of the semiconductor ecosystem, including better IP and improved system architecture and analysis tools.
And P.S.: If I see another Dick Tracy watch at CES (which I did) I will scream. Give up already!
—Frank Ferro is director of product marketing at Sonics.