A slew of new technologies and a focus on lower power and different tools and methodologies point to a year of strong growth, change and innovation.
EDA is on a roll. Design starts are up significantly thanks to increased investment in areas such as AI, a plethora of new communications standards, buildout of the Cloud, the race toward autonomous driving and continued advancements in mobile phones. Many designs demand the latest technologies and push the limits of complexity.
Low power is becoming more than just reducing wasted power at the chip level. It is becoming an important environmental consideration, and environmentalists increasingly will turn their attention to the semiconductor industry. Systems also are becoming more distributed, driven by the IoT, requiring new approaches to both design and verification.
From a design standpoint, many of the existing methodologies are running out of steam. “Increasing system complexity will drive a resurgence of classic electronic system-level (ESL) technologies,” says Frank Schirrmeister, senior group director for product management and marketing at Cadence. “It will be interesting to see the changes that allow users to do what emerged in the ’90s as hardware/software co-design. For instance, adding instructions to processors and trading their impact off with hardware is best done with tool assistance. The classic, M.C. Escher-ish dilemma of models needing to be fast, early and accurate is still just as over-constrained as it was in the ’90s, especially as system design grows beyond pure SoCs and across different domains like RF, digital, and mechanical. With new options like faster simulation, as well as parallel execution to maintain accuracy, and the use of emulation and prototyping, it will be interesting to see how this dilemma will play out this time around.”
Software is a growing part of EDA, as well. “We’ve always said that you have to consider the whole system,” says Rupert Baines, CEO of UltraSoC. “That includes hardware-software interactions. It is nice that others are waking up to that. This is causing increasing amounts of re-aggregation. All of the big technology companies are now either designing their own chips or engaging with the semiconductor ecosystem to make what are effectively ASICs for in-house use. As with the software-hardware and EDA integration, this re-aggregation brings a more system-level approach, which is long overdue. On the other hand, it can make the design process more opaque and therefore harder for the ecosystem to add value.”
This change in system focus is also seen by a Keysight think tank. “Realizing the full potential of sensor systems connected to communication systems connected to mechanical systems will require new ways to test at the system level,” according to the group. “Today, there are available tests for radar antennas and a radar transceiver module. However, testing a multi-antenna radar system integrated into a car will require a different testing approach. The same is true for data centers, mission-critical IoT networks, automobiles, and a wide range of new, complex, 5G-enabled applications. In 2020, the electronics industry will emphasize system-level testing as the definitive, final step to assure end-to-end performance, integrity and reliability across the increasingly connected world.”
The industry also is demanding easier paths to silicon. “What happened to the software industry about 20 years ago is now happening to the hardware industry,” says Mo Faisal, president and CEO for Movellus. “You are going to have generic libraries that people can start with, but the moment you want to do something out of the ordinary, you will have to start customizing it. It is really easy to build your own website today. Anyone can do that. But if you want the best, most scalable web app, then you have to have your own team and dev ops team to optimize the whole process. The same thing is happening with silicon. The lower end of the market will be democratized and it will be easily accessible to lots of people. But if you want something that is highly optimized, then you will need a team to do that.”
One of the areas where this is happening is the RISC-V movement. “The adoption of open-source hardware solutions, particularly RISC-V, will accelerate in 2020 going beyond the work already done by many for deeply embedded functions,” says Jo Jones, technology communications manager for Imagination Technologies. “This is because open-source software is already a dominant force in the industry and so this movement is now translating into hardware.”
But the industry is also learning that this path is not as easy as some would have you believe. “Some ecosystems are rallying around open-source architectures like RISC-V,” says Cadence’s Schirrmeister. “2020 will be the year in which users will realize that the associated verification of core processors, and their integration into SoC architectures, is a huge challenge that needs to be considered in the costing considerations for adoption.”
Verification
Many designs today are in the latest fabrication processes, as well as distributed across multiple dies within a package. This is adding to already significant verification challenges. “In 2020 we believe the acceleration of sub-7nm designs will drive the critical need for multi-physics simulations,” says Vic Kulkarni, vice president and chief strategist for ANSYS. “This is because various physical effects are getting coupled and must be analyzed simultaneously, not as separate tasks. For example, dynamic voltage drop now impacts block-level and SoC timing, while localized temperature effects (LTE), or self-heating at the transistor level, can alter the local and regional temperature profile of the chip and thereby impact timing. Process variability then must be accounted for in the modeling of devices, cells and nets to meet the chip-package-system (CPS) goals, and so on, in order to meet the TTR and TTM goals for differentiation and product success. As operating frequencies go beyond 2.3 to 2.5 GHz, on-chip and off-chip electromagnetic effects start impacting the functionality of the entire chip. EM interference goes beyond the immediate signal line neighbors to distant signal lines, which are adjacent to the power grid several microns away from the aggressor in these SoCs targeted for the emerging HPC and 5G applications.”
Even in the digital world, verification engines are having to transform to handle the complexity. “Verification teams need to run more and smarter verification cycles to gain confidence before tape-out,” says Schirrmeister. “For hardware-based verification and software development, emulation and prototyping have been non-optional for a while now. 2020 will be the year in which the efficient combined usage of emulation and prototyping will become non-optional. Emulation still offers the best bring-up time to an executable model and the most efficient, simulation-like debug. In contrast, the execution speed of prototyping well exceeds emulation and makes it the platform of choice for software development and hardware regressions. Closer integration will allow for bugs to be detected most efficiently in the fastest engine, and then for engines to be switched so those bugs can be removed using the best bug-hunting techniques in emulation and simulation.”
The combination of engines has been a goal for some time, and it continues to advance. “Hybrid execution also will become mainstream as FPGA-based techniques are now providing high-speed execution for billion-gate designs. That allows teams to avoid cutting down designs to run in an FPGA,” says Schirrmeister. “In addition, we are seeing greater demand for interfaces that allow connection to virtual platforms for optimal balancing of software-based and hardware-based execution of designs. Hybrids with virtual platforms also will extend further to simulation, complementing the hybrid techniques of RTL co-execution with gate-level description and analog-mixed signal execution already in production use today.”
The industry also is looking for formal techniques to fill more of the holes. “The quality of hardware verification will come under increasing scrutiny, and formal verification – mathematical techniques capable of proving the absence of faults, rather than just finding issues to fix – will become increasingly desirable,” says Imagination’s Jones. “The demand for full guarantees of correctness will push forward the need for high-quality formal verification tools, and with greater investment and invention, increasingly complex systems will come to be verified formally.”
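The distinction Jones draws between finding issues and proving their absence can be made concrete with a toy sketch. Real formal tools use symbolic methods such as SAT/SMT solving and model checking rather than enumeration, but the contrast with spot-check simulation is the same: below, a hypothetical gate-level ripple-carry adder (not from the article) is checked against arithmetic addition for every possible 4-bit input, so a passing run establishes there is no functional bug anywhere in this tiny design, rather than merely sampling a few cases.

```python
# Toy illustration of "proving the absence of faults" vs. simulation:
# exhaustively verify a gate-level adder over its entire input space.

def full_adder(a, b, cin):
    """One-bit full adder built from basic logic gates."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_add(x, y, width=4):
    """Gate-level ripple-carry adder over 'width' bits."""
    carry, result = 0, 0
    for i in range(width):
        bit_sum, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit_sum << i
    return result | (carry << width)  # final carry becomes the top bit

# Exhaustive check: every input pair, not a random sample.
# A simulator might try a handful of (x, y) pairs; this covers all 256.
for x in range(16):
    for y in range(16):
        assert ripple_add(x, y) == x + y, (x, y)
print("proved: adder correct for all 4-bit inputs")
```

Exhaustive enumeration only scales to trivial designs, which is why production formal tools reason symbolically, but the guarantee delivered (correctness for all inputs, not just tested ones) is the property the quote describes.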
Cloud
The migration to the Cloud started a couple of years ago. “2019 saw greater customer usage of EDA in the cloud (via third-party data centers), especially for those tool steps that are highly compute-intensive,” says Joseph Sawicki, executive vice president for IC EDA at Mentor, a Siemens Business. “The cloud enables design teams to call up virtually unlimited capacity on demand. The challenge for EDA companies has been to ensure their tools can scale for cloud configuration, so that the more CPUs you call up, the faster you can complete design and verification runs. EDA software running on the third-party cloud offers a tremendous opportunity to speed the pace of innovation. We expect to see increased customer demand and usage of EDA cloud offerings in 2020 and the coming years.”
There are several areas where this is happening successfully. “Function-specific workloads will increasingly move to the cloud to provide faster turnaround time in addressing complex simulation and analysis challenges,” says Craig Johnson, vice president of cloud business development at Cadence. “Cloud-ready electromagnetic simulation will be commonly deployed using fully managed and highly automated hybrid cloud enablement technology. The combination of nearly push-button cloud access with cloud-ready software will drive mainstream use in 2020.”
EDA does have some specific needs in the Cloud. “For the semiconductor industry, taking design processes and EDA workloads to the cloud is the next new frontier,” says Simon Rance, head of marketing at ClioSoft. “Semiconductor companies realize that developing a high-performance compute (HPC) environment and keeping it up to date is a daunting task. In 2019, cloud players identified this niche and have started providing targeted solutions for the EDA industry. For example, AWS teamed up with semiconductor manufacturers such as MediaTek, TowerJazz, and EDA companies such as ClioSoft and Cadence to provide end-to-end solutions. Google increased its year-over-year capital expenditure by 28%, of which 60% was spent on technical infrastructure. 2020 will see semiconductor startups adopt the public cloud exclusively because of its extreme scalability and elasticity.”
The Cloud and AI
The Cloud itself is evolving. “One of the main markets that leverages AI/ML is the data center,” says Mentor’s Sawicki. “There’s an insatiable need to not only process data faster and more intelligently, but also to move that data into and out of the data center more quickly. 2019 marks the year when we started to see silicon photonics emerge from research labs and move into the commercial market. In 2020, we’ll see these silicon photonic chips start to move into the data center.”
As the Cloud becomes more capable, many applications start to intersect with each other. “Cloud services have matured through the rapid adoption of AI solutions driven by the need for voice and image recognition solutions, as well as near-real-time fraud detection for business transactions,” says Harry Foster, chief scientist for verification at Mentor, a Siemens Business. “But there are many emerging problems beyond the enterprise that will require AI solutions, such as the need to optimize spectrum bandwidth dynamically for efficient 5G operation, analyzing humongous sets of data for immediate risk analysis and response for cyber security, as well as achieving higher levels of autonomy in automotive.”
And AI is increasingly being used with EDA. “New, smarter techniques to allow targeted data collection in the realm and context of the detected bug in emulation will emerge,” says Schirrmeister. “In addition, machine learning will make verification more efficient by learning the best mappings into execution engines utilizing FPGAs and custom processors.”
AI and the environment
The rapid rise of AI is beginning to ring alarm bells for some. “Artificial intelligence may account for as much as 2% of the world’s total energy use,” says Sriram Raghavan, vice president for AI at IBM Research. “Demand for cloud computing and AI won’t go away, so expect to see increased efforts in 2020 that look to make AI tech more sustainable. This includes creating new materials, like ‘transition-metal oxides’ that make more flexible devices, new chip designs with both analog and mixed-signal processing, and new software techniques based on approximate computing, that all aim to support growing AI workloads while reducing their carbon footprint.”
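One flavor of the approximate computing Raghavan mentions is trading a little numeric accuracy for a much cheaper datatype. The sketch below is a hypothetical illustration (not IBM's method): it quantizes 32-bit floating-point values down to 8-bit integers, a common trick in AI inference, and measures the error introduced, showing why a 4x smaller, lower-energy representation can be acceptable.

```python
# Hypothetical sketch of approximate computing via quantization:
# store/compute with cheap 8-bit integers instead of 32-bit floats.

def quantize(values, bits=8):
    """Map floats onto signed integers of the given bit width."""
    qmax = 2 ** (bits - 1) - 1              # e.g. 127 for int8
    scale = max(abs(v) for v in values) / qmax or 1.0
    return [round(v / scale) for v in values], scale

def dequantize(qvalues, scale):
    """Recover approximate float values from the integer codes."""
    return [q * scale for q in qvalues]

weights = [0.82, -0.41, 0.05, -0.99]        # toy model weights
q, scale = quantize(weights)
approx = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, approx))
print(q, round(max_err, 4))                 # small integers, small error
```

The worst-case rounding error is bounded by half the quantization step (scale / 2), which is the accuracy-for-energy bargain approximate computing makes explicit.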
There is an increasing focus on power reduction. “As we move into 2020 it’s clear that every sector of industry, including the semiconductor industry, will have a responsibility to address growing environmental concerns,” says Stephen Crosher, CEO for Moortec. “We should be aware that as our sector underpins the growth in AI, 5G telecommunications, cryptocurrency and high-performance compute applications, it is predicted that by 2030 energy consumption attributable to data centers will make up a staggering 8% of the world’s total usage. Data centers are fast becoming one of the big consumers, alongside lighting, domestic heating/cooling and transportation.”
This will have worldwide impact in several ways. “We will see greater governmental involvement in how carbon emission targets are levied upon different industrial sectors, technology applications, and in particular, data centers,” says Crosher. “As the so-called ‘evolved economies’ respond to the pending climate crisis, we could see a growth in data centers being located in ‘less evolved’ economic regions where emission levels are scrutinized less and incentives for reduced energy consumption are less apparent. So in 2020, narratives from environmentalists like Greta Thunberg, and subsequently the action taken by governments around the world, will see the semiconductor industry respond by helping to tackle our new existential challenge.”
This also will lead to a push toward solutions that utilize energy harvesting. “Initially, applications would focus on extending battery life significantly, say through the life of the device,” says David Su, CEO for Atmosic. “Devices that use energy harvesting will enable consumers to enjoy their favorite smart home devices without worrying about changing batteries as often. However, industrial use of these energy harvesting solutions will be the real game changer. Devices with extended battery life, or without any batteries, will significantly reduce deployment and maintenance costs for IoT fleets, while also reducing the environmental impact of batteries.”
More acquisitions?
It has been a couple of years since Siemens acquired Mentor, and many people have quietly talked about further consolidation within the industry. “Expect more acquisitions in 2020,” says UltraSoC’s Baines. “Maybe some big ones. It certainly makes sense to us.”