Reflections On 2015

Part 2: Who got it right, who was wrong, and what really happened in between.


It is easy to make predictions, but few people can make them with any degree of accuracy. Most of the time those predictions are forgotten by the end of the year, and nobody tallies up who deserves more credibility next year. Not so with SemiEngineering. We like to hold people's feet to the fire, and while the Pants-On-Fire meter may be appropriate for politicians, we prefer to apply the idea in a more positive light.

In part one we looked at the predictions related to semiconductor manufacturing, packaging, markets, politics and some aspects of design.

In each case, the original prediction is shown in bold, often compressed, with the contributor's reflection following it. Not all contributors chose to comment on their predictions.

One entry was missed from the semiconductor section. Marco Casale-Rossi, product marketing manager within the Design Group of Synopsys, talked about a few issues surrounding semiconductor manufacturing: "EUV will be ready for primetime after the 10 nanometer technology node. 2.5D- and 3D-IC are still taxing while the supply chain is being sorted out, but the economics of emerging technology nodes, and the rise of the Internet of Things (IoT), may definitely boost them."

EUV is slowly advancing; last summer, a leading U.S. semiconductor company ordered 15 ASML EUV systems, worth approximately $3B. Regarding 2.5D, there are a number of chips in development or hitting the market in areas such as networking. 3D-IC remains more elusive.

Tools and flows
A new term became very visible in 2015 — Shift Left. Chi-Ping Hsu, senior vice president and chief strategy officer for EDA at Cadence, defined it as: "Shift Left means that steps, once done later in the design flow, must now start earlier. Software development needs to begin early enough to contemplate hardware changes."

Shift left methodologies are now pervasive in the mobile market, where schedules are critical to achieving eventual profit. In the mobile space, software bring-up during design now includes the full software stack up to the application level, enabled by hardware-software co-verification. Performance and power can be measured under various use cases to yield insight into architectural choices, and hard data can be garnered early enough to tweak and tune the various alternatives.
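To make the idea concrete, here is a minimal, hypothetical C++ sketch of that kind of early bring-up: driver-style code is exercised against a simple register-level model of an invented DMA block, and crude cycle and energy counters give rough per-use-case performance and power numbers long before silicon exists. The model, register map, and cost figures are illustrative assumptions only, not any particular vendor's flow.

```cpp
// Hypothetical "shift left" bring-up sketch: the same driver code that would
// later run on silicon is exercised against a register-level model of a
// made-up DMA block, with crude per-use-case activity counters.
#include <cstdint>
#include <iostream>
#include <map>

struct UseCaseStats {
    uint64_t cycles = 0;     // crude cycle proxy: one per register access
    double   energy_nj = 0;  // crude energy proxy per access (assumed costs)
};

class DmaModel {                       // stand-in for the real RTL or virtual platform
public:
    static constexpr uint32_t SRC = 0x00, DST = 0x04, LEN = 0x08,
                              CTRL = 0x0C, STATUS = 0x10;
    void write(uint32_t addr, uint32_t data, UseCaseStats& s) {
        regs_[addr] = data;
        s.cycles += 1; s.energy_nj += 0.8;          // assumed cost per write
    }
    uint32_t read(uint32_t addr, UseCaseStats& s) {
        s.cycles += 1; s.energy_nj += 0.5;          // assumed cost per read
        return addr == STATUS ? 0x1 : regs_[addr];  // model: transfer always "done"
    }
private:
    std::map<uint32_t, uint32_t> regs_;
};

// Driver routine written early, against the model instead of hardware.
void dma_copy(DmaModel& dma, uint32_t src, uint32_t dst, uint32_t len,
              UseCaseStats& s) {
    dma.write(DmaModel::SRC, src, s);
    dma.write(DmaModel::DST, dst, s);
    dma.write(DmaModel::LEN, len, s);
    dma.write(DmaModel::CTRL, 1, s);                         // kick off transfer
    while ((dma.read(DmaModel::STATUS, s) & 0x1) == 0) { }   // poll for completion
}

int main() {
    DmaModel dma;
    UseCaseStats camera, audio;
    dma_copy(dma, 0x80000000, 0x90000000, 4096, camera);  // "camera frame" use case
    dma_copy(dma, 0x80001000, 0x90001000, 256,  audio);   // "audio buffer" use case
    std::cout << "camera: " << camera.cycles << " cycles, "
              << camera.energy_nj << " nJ\n"
              << "audio:  " << audio.cycles  << " cycles, "
              << audio.energy_nj  << " nJ\n";
}
```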

Other markets also are seeing the benefits and have adopted these practices, moving into low-level elements of software integration that directly access and control hardware features. Competitive pressures will push these practices into an even broader set of designs in 2016, lest companies get left behind.

Adds Anupam Bakshi, CEO of Agnisys: "The much talked about shift left has happened in 2015, so the talk by the big three was spot on. Hardware and software teams are sharing a common specification for the design. Common tools will follow."

Any talk about shift left quickly turns to virtual prototypes. Zibi Zalewski, hardware division general manager for Aldec, says that the usage of virtual platforms is definitely growing, so the prediction was right. "It may be even bigger next year when integration with hardware-based tools becomes more popular, leading to early verification of the whole SoC by software and hardware teams at once."

Another aspect to shift left centers around synthesis and the incorporation of more information into that flow. Hsu said that “vertical co-optimization, an element of the overall shift left trend, has been an important innovation in advanced node enablement. It offers opportunities to see how decisions in one realm affect the other elements of the enablement ecosystem.”

It is apparent that implementation technologies now employ look-ahead strategies from the logic level down to the process design-rule level to create more optimal results, as well as to reduce the number of iterations and the need for user intervention. Place-and-route tools, for instance, now integrate analysis from the signoff realm. All of these innovations are really a form of shift left intended to better optimize and shorten the design cycle.

With parts of the ESL flow beginning to crystallize, there is still a need for a verification strategy at the SoC level. Adnan Hamid, chief executive officer of Breker, said: "The ongoing move toward portable tests is an example of users demanding that verification solutions be portable 'vertically' from block to SoC and 'horizontally' from simulation through emulation and FPGA prototyping to actual silicon in the lab."

Early this year, the Accellera Portable Stimulus Working Group (PSWG) commenced its work and has made very good progress toward a standard for portable tests and stimulus. There were many conference presentations and panels on this topic, most notably at DVCon India, and user awareness of the problems it is addressing has skyrocketed. The group would like to see even more involvement from user companies to ensure that the best possible standard is created.
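As a rough illustration of the portability idea (and emphatically not the Accellera PSS language itself), the hypothetical sketch below captures a test intent once, as a list of abstract actions, and retargets it to two different execution engines. Every name in it is invented for the example.

```cpp
// Illustrative only: one abstract scenario, two back ends. "Vertical" reuse
// means the same scenario can run at block or SoC level; "horizontal" reuse
// means each engine (simulation, emulation, silicon) supplies its own
// realization of each action.
#include <functional>
#include <iostream>
#include <string>
#include <vector>

struct Action { std::string name; };          // abstract test step
using Scenario = std::vector<Action>;         // test intent, engine-agnostic

void run(const Scenario& s,
         const std::function<void(const Action&)>& realize) {
    for (const auto& a : s) realize(a);       // retarget each step
}

int main() {
    Scenario copy_then_encrypt = {{"dma_copy"}, {"aes_encrypt"}, {"check_result"}};

    std::cout << "--- simulation target ---\n";
    run(copy_then_encrypt, [](const Action& a) {
        std::cout << "testbench sequence drives: " << a.name << "\n";
    });

    std::cout << "--- embedded / silicon target ---\n";
    run(copy_then_encrypt, [](const Action& a) {
        std::cout << "generated C test calls driver for: " << a.name << "\n";
    });
}
```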

Emulation has started to play a larger part in verification than in the past. Hsu had said: "Simulation remains the main workhorse for IP and subsystem verification, while emulation has taken center stage for SoC hardware verification, hardware/software integration, and OS bring-up."

Hsu backpedals a little: Although the rise of emulation has been apparent from market growth and the proliferation of available use models, especially full-SoC integration verification, simulation on a workstation is still the primary verification engine.

Some emulator architectures enable acceleration modes with abstract processor models running on a workstation and the remainder of the SoC on the emulation machine. Firmware and drivers can be integrated with the hardware as it is being designed, enabling both functional verification and performance analysis. This process provides the hardware verification testbench while also enabling software bring-up. So many SoCs are now delivered with vast amounts of software that the shift left of software bring-up has become essential for competitive schedules. There are many other use models that make the addition of emulation hardware more and more attractive to a broader set of users in the semiconductor design space.
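The split described above can be sketched, in heavily simplified form, as firmware written against an abstract register interface, with a transactor forwarding each access to the emulated SoC. The stub below only hints at that structure; real co-emulation links are vendor-specific (DPI or SCE-MI transactors, for example), and every name here is hypothetical.

```cpp
// Hypothetical host-side view of acceleration: firmware logic runs on the
// workstation against an abstract bus, and a stub stands in for the
// emulator-side SoC that would normally sit behind a transactor.
#include <cstdint>
#include <iostream>

struct RegisterBus {                       // what the firmware is written against
    virtual void     write(uint32_t addr, uint32_t data) = 0;
    virtual uint32_t read(uint32_t addr) = 0;
    virtual ~RegisterBus() = default;
};

struct EmulatedSoc : RegisterBus {         // stand-in for the emulated design
    void write(uint32_t addr, uint32_t data) override {
        std::cout << "transactor: WR 0x" << std::hex << addr
                  << " <= 0x" << data << std::dec << "\n";
    }
    uint32_t read(uint32_t addr) override {
        std::cout << "transactor: RD 0x" << std::hex << addr << std::dec << "\n";
        return 0x1;                        // pretend the device reports "ready"
    }
};

// Unmodified driver routine: the same code could later target real silicon.
void uart_send(RegisterBus& bus, uint8_t byte) {
    while ((bus.read(0x10000004) & 0x1) == 0) { }   // wait for TX ready
    bus.write(0x10000000, byte);                    // write data register
}

int main() {
    EmulatedSoc soc;
    uart_send(soc, 'A');                   // driver bring-up before tape-out
}
```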

Software simulators are also growing in their use models for verification. Mixed-signal verification is a fast-growing area, with real number models enabling runtimes that open up more thorough testing. Server-farm solutions with multiple licenses per engineer enable multi-tasking across bug finding, fixing, and regression.
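For readers unfamiliar with real number models, the small sketch below shows the basic trick: the analog block is reduced to a discrete-time equation over plain floating-point values, so it advances in lock-step with the digital simulation instead of invoking an analog solver. The RC time constant and time step are arbitrary assumptions chosen for illustration.

```cpp
// Toy real-number model of a first-order RC low-pass filter, updated as a
// simple difference equation: y += (dt/tau) * (x - y). No analog solver needed.
#include <iostream>

int main() {
    const double tau = 1e-6;      // assumed RC time constant: 1 us
    const double dt  = 1e-8;      // digital time step: 10 ns
    double y = 0.0;               // filter output, a plain real value

    for (int n = 0; n < 500; ++n) {
        double x = 1.0;                       // driver steps the input to 1 V
        y += (dt / tau) * (x - y);            // real-number model update
        if (n % 100 == 99)
            std::cout << "t=" << (n + 1) * dt * 1e6 << "us  y=" << y << " V\n";
    }
}
```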

Aldec’s Zalewski adds to this: “I definitely agree with this prediction for 2015. However, with more affordable emulation platforms, the simulation trend is slowly changing, and hopefully we can see the change more clearly in 2016—especially for midsize and big designs.”

At the other end of the scale, Zhihong Liu, executive chairman for ProPlus Design Solutions, expressed concerns about the accuracy of simulation for things such as memory. "Embedded memory has a significant impact on chip performance and power. Designers will struggle with traditional EDA tools, such as SPICE for large memory characterization and verification, that don't give them enough reliability and accuracy."

Designers continue to work with sub-par tools, especially FastSPICE simulators, that are not as accurate as they need to be for the smaller process technologies and giga-scale designs. We've seen modest improvement and some additional focus on accuracy for transistor-level verification, but there is a need for more. In 2015, designers, especially those working on the most advanced memory designs, have begun to adopt GigaSpice simulators to support highly accurate leakage and power verification and signoff.

Before we leave the subject of verification completely, Agnisys' Bakshi reflects on a couple of other areas: "[It] has now become mainstream as more and more verification teams understand where and how to use it, and UVM has become a de facto standard in verification. But some concerns are being raised about the overhead and performance issues."

What of EDA in general? Graham Bell, vice president of marketing at Real Intent, talked about the general EDA market and predicted "a positive year for all sectors of the EDA pie."

The latest EDA Consortium Market Statistics show this was true, with one exception. In the first half of 2015 the industry grew 8% over the first half of 2014, but of all the segments (Services, CAE, PCB & MCM, IC Physical Design and Verification, and Semiconductor IP Products and Tools), only PCB & MCM showed a decline (-6%). So I was almost right on this one.

Design
IP has become an indispensable part of the industry. Hsu said that “verified, standards-based IP has become a ‘must-have’ for system design enablement companies to thrive. Design IP lets the specialists encapsulate the complexity of implementation.”

The growth in design IP and verification IP markets was brisk in 2015. Many new standards were created, and expertise is not widespread for these new capabilities. Taking the risk and time out of the equation through IP purchases was a big driver for IP in 2015. No end is in sight for this trend, and the market is set to just about double by 2020.

Randy Smith, vice president of marketing at Sonics, said that “we cannot simply add more engineers to a team, especially verification engineers.”

Cadence’s Hsu responds: “The macro trends that have had an impact on team sizes for SoC programs include semiconductor industry consolidation, complexity pressures that are lengthening schedules, and the overall smaller number of SoC projects (albeit, with ever increasing complexity). To cope with these trends, companies have consistently made two moves. The first is to centralize and specialize teams related to SoC design processes. For instance, rather than having a place and route team for each project, a centralized, high-performance team resource is a common occurrence. The second organizational trend has been to increase the size of sub-teams with specialized skills such that they can perform on overlapping programs.”

Going forward, we expect to see more emphasis on the application of the latest automation technologies and techniques to get the most out of each SoC program. Maximizing team performance relative to industry norms necessitates the use of the most advanced design, verification, analysis and implementation technologies.

Markets
Casale-Rossi talked about evolving markets. “Once upon a time there were computers — millions of them — followed by phones — billions of them. Today, there are an estimated 1.5 trillion ‘things’ out there.”

Soon after our prediction, Intel CEO Brian Krzanich presented Curie (MCU + FLASH/SRAM + DSP + Bluetooth + 6-Axis MEMS Combo + PMIC + Li Battery) at CES 2015. Curie is clearly meant for IoT applications, and it is worth noting that Intel showcased Curie rather than its latest and greatest 14-nanometer microprocessor. Krzanich was followed by Young Sohn, president and chief strategy officer of Samsung, who presented Artik (CPU + GPU + FLASH/SRAM + Bluetooth/WiFi + 9-Axis MEMS Combo) at IoT World 2015, and by MediaTek, which launched the MT7688 (MIPS24KEc/580MHz CPU + 256MB DDR1/2 RAM + AES128/256 encryption engine). This appears to confirm the IoT is indeed taking off. Just look at the presentations given by literally scores of companies at CES 2015: *everything* is getting connected and smart, even *things* one wouldn't think about.

Another market that garnered a lot of interest toward the end of 2014 was wearables. Casale-Rossi was lukewarm about them saying, “The jury is still out whether or not wearable applications are a must-have or just another nice-to-have, with the exception of health care, where technology brings medicine to a higher level such as in artificial cochlea, pancreas and retina, implantable/injectable bio-sensors and actuators.”

He remains cautious. Smart watch shipments still lag smartphone shipments by roughly two orders of magnitude.

Many people talked about the importance of the automotive industry and the rate at which electronics is consuming increasing amounts of a car’s cost. Casale-Rossi responds: “According to Bosch, the world’s largest automotive components maker, electronics represents 80% of car innovation and 40% of its cost. The car is indeed a computer—actually about 100 computers, and steadily going up. It will get smarter and smarter, with new layers of services and players just around the corner. Just think about Uber, Enjoy or Tesla super-charging stations, with your smart car telling you about the nearest super-charging station and your smartphone telling you when a super-charger connector becomes available.”

Finally, and perhaps one of the most prescient predictions of the year came from Hsu: “There is a new breed of system-design company that has taken direct control over its semiconductor and subsystem destinies. These systems companies are reaping business (pricing, availability), technical (broader scope of optimization), and strategic (IP protection, secrecy) benefits by taking more control over their system-design destiny.”

While he chose not to make a direct comment about this, it probably turned out to be truer than even he imagined, and one of the biggest trends of the year.


