Rising concerns about the security of chips used in everything from cars to data centers are driving up the cost and complexity of electronic systems in a variety of ways, some obvious and others less so.
Until very recently, semiconductor security was viewed more as a theoretical threat than a real one. Governments certainly worried about adversaries taking control of secure systems through back doors in hardware, either through third-party IP or unknowns in the global supply chain, but the rest of the chip industry generally paid little heed apart from the ability to boot securely and to authenticate firmware. But as advanced electronics are deployed in cars, robots, drones, medical devices, as well as in a variety of server applications, robust hardware security is becoming a requirement. It no longer can be brushed aside as a ‘nice-to-have’ feature because IC breaches can affect safety, jeopardize critical data, and sideline businesses until the damage is assessed and the threat resolved.
The big question many companies are now asking is how much security is enough. The answer is not always clear, and it’s often incomplete. Adequate security is based on an end-to-end risk assessment, and when it comes to semiconductors the formula is both complex and highly variable. It includes factors that can fluctuate from one vendor to the next in the same market, and frequently from one chip to the next for the same vendor.
Much of this growing concern can be mapped against the rising value of data, which often is coupled with new or expanding market opportunities in automotive, medical, AR/VR, AI/ML, and both on-premise and cloud-based data centers. It is evident in the big data and business analytics market, which Allied Market Research expects to expand 13.5% annually, from $198.08 billion in 2020 to $684.12 billion in 2030. There is more data that needs to be processed, and with Moore’s Law running out of steam, chipmakers and systems companies are innovating around new architectures to optimize performance with less power for different use cases. That makes it much harder to determine what’s an acceptable level of security for each application.
“It varies widely,” said Steve Hanna, distinguished engineer at Infineon. “The attacker generally is not going to spend a million dollars to get a thousand dollars in return. But there are some attackers who are willing to do that, although they tend to be motivated by other goals — government espionage, terrorism, or even people taking revenge on someone. And you can’t always assume no one will bother to attack something, because there can be indirect attacks, too. Why, for example, would someone want to hack a lightbulb in my house? One answer is because it can become a bot in their botnet army, and then they can lease that botnet army to other people. Then you have attacks from a million different points and it brings down a server.”
Many attacks can be prevented, or at least fixed with a system reboot. But there are costs associated with those actions, including some that are not obvious. Actively policing a chip to identify unusual activity requires power, which in turn can reduce battery life in mobile devices such as a phone or smart glasses, or an implantable medical device such as a pacemaker. It also can affect performance in devices, because it requires extra circuitry dedicated to making sure a chip is secure — basically the equivalent of guard-banding, but with little or no hard data to prove how effective it will be.
Building security into an IC also makes the design more complex, which in turn potentially adds other vulnerabilities that may be unique to a particular design. Chip architects and design teams need to understand the implications of every security measure on the movement and capturing of data, as well as the impact of ECOs and other last-minute changes needed to achieve sign-off.
In the past, this was a secondary consideration, because most attacks happened in software, which could be hacked remotely. But as more hardware is connected to the Internet, and to other hardware, chips themselves are now a source of concern. Unlike with software, if an attacker gains access to the hardware, a fix via system reboot may not be possible.
“There is a whole business of building and selling tools to attackers,” said Hanna. “They have tech support and documentation and sales reps, and there’s a whole supply chain of tools and GUIs for mounting your attack. Usually, these operations are run out of places where there’s no extradition treaties.”
Even if a device starts out secure, that doesn’t mean it will remain secure throughout its lifetime. This became evident with vulnerabilities based on speculative execution and branch prediction, two commonly used approaches to improve processor performance prior to the discovery of Meltdown, Spectre, and Foreshadow. Now, with complex designs making their way into automotive, medical and industrial applications, where they are expected to be used for up to 25 years, security needs to be well architected and flexible enough to respond to future security holes and more sophisticated attack vectors.
“If you’re building network equipment, for example, it’s not just about the chip or the software,” said Andreas Kuehlmann, CEO of Tortuga Logic. “Their box is going to be out there for tens of years. What’s the total cost to maintain my product over its lifecycle? That cost reflects the cost that I have, the cost I’m imposing on my customers, and the cost if there’s any incident. The auto industry, in particular, really understands that because they look at records as part of their business. They’ve taken risk management to a level that nobody else has.”
For the automotive industry, and increasingly the medical industry, a breach can be extremely costly in multiple ways, from customer confidence in the brand to liability based upon insufficient security in a piece of hardware that results in injuries. “Security has an indirect impact on safety,” said Kuehlmann. “Safety is an extremely well-understood process, but it also raises some business issues. What’s my liability? What is the cost of a recall? It also affects privacy, particularly with medical records. That has a direct business impact, as well.”
Fig. 1: Securing chips in the design phase. Source: Tortuga Logic
And that’s just the beginning. These are relatively well-understood threat models. Other industry segments are far less sophisticated when it comes to chip security. And as more devices are connected to each other, often crisscrossing silos in various industry segments, the threat level increases for all of them.
Designing for security
Reducing the risk of potential hardware breaches requires a solid understanding of chip architectures, including everything from partitioning and prioritization of data movement and data storage, to a variety of obfuscation techniques and activity monitoring. Getting all of that right is a complex undertaking, and it’s one for which there often is no clear payback. A chip that is difficult to hack may deter attackers, and the best outcome is that nothing unusual happens — which can make companies question why they expended the necessary effort and money needed to secure a device.
That, in turn, tends to instill a false sense of confidence and lead to bad security choices. “A lot of times people use a normal applications processor to run cryptographic or security algorithms,” said Scott Best, director of anti-tamper security technology at Rambus. “So they are running security algorithms on an insecure processor, which is one of the hallmarks of a security failure. A processor is optimized, like any other circuitry. You can optimize it for power, for performance, or for security, and to think you’re going to accidentally get any one of those three benefits without actually focusing on them is recklessly optimistic. None of those things happens by accident.”
Some chipmakers and various organizations and government agencies are beginning to recognize that. “There are a couple of different ways security is creeping into the designs,” said John Hallman, product manager for trust and security at OneSpin, a Siemens Business. “It really is becoming more of a requirements-driven process, and that’s good. It’s moving toward that initial development stage. I wouldn’t say we’re completely there, or that the entire system is thought out in terms of which processors or which components are going to be in your system. But we are starting to establish at least some semblance of threat vectors, and how you can address perceived threats early enough. There is some due diligence in initial protections, which sets you up for the most success you can have at that stage. Then there are points along the way that you can continue to evaluate as you get into the design phase, where in your front-end design you’re doing some type of HDL coding, which you can continue to evaluate. So not only did you meet the requirements set forth and put in those protections, but now you’re starting to introduce some of these known vulnerabilities. There’s a lot being reported today in the hardware space. These are things you can check for and incorporate as part of your verification process.”
For example, MITRE, which is funded by various U.S. government agencies, publishes a list of the most important hardware weaknesses. In Europe, the European Union Agency for Cybersecurity (ENISA) publishes a threat landscape for supply chain attacks, as well.
That’s a starting point. Less visible is the impact of different use models on security. This is especially true for an increasing number of devices that have some circuits that are always on, a power-saving technique that allows devices such as smart speakers or surveillance systems to wake up as needed.
“In the case of an always-on machine, it’s doing background monitoring tasks while trying to consume as little power as possible,” said George Wall, director of product marketing at Cadence. “But it’s also vulnerable to unauthorized code executing on it, or some other type of security attack. So it’s important that when it boots up, the code that it is booting from is good. What are the authentication steps that are required for it to be considered a secure state? What other resources need to be protected from unauthorized code or misbehaving third-party code running on an always-on processor? Those considerations need to be built in upfront because they’re difficult to shoehorn in later.”
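The boot-time authentication Wall describes can be reduced to a small sketch: measure the boot image and verify that measurement against a signed reference before executing it. The simplified Python sketch below uses an HMAC tag as a stand-in for the asymmetric signature a real boot ROM would verify against a fused public key; the key name and firmware blob here are hypothetical.

```python
import hashlib
import hmac

# Hypothetical device key. Real secure boot verifies an RSA/ECDSA signature
# against a public key burned into ROM or fuses, not a shared secret.
DEVICE_KEY = b"example-device-root-key"

def sign_image(image: bytes, key: bytes = DEVICE_KEY) -> bytes:
    """Factory step: tag the firmware image (stand-in for a real signature)."""
    digest = hashlib.sha256(image).digest()
    return hmac.new(key, digest, hashlib.sha256).digest()

def verify_before_boot(image: bytes, tag: bytes, key: bytes = DEVICE_KEY) -> bool:
    """Boot ROM step: recompute the tag and compare in constant time
    before transferring control to the image."""
    digest = hashlib.sha256(image).digest()
    expected = hmac.new(key, digest, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

firmware = b"\x7fELF...hypothetical vendor firmware blob"
tag = sign_image(firmware)
assert verify_before_boot(firmware, tag)                # untampered image is accepted
assert not verify_before_boot(firmware + b"\x00", tag)  # any modification is rejected
```

The same pattern extends to each subsequent boot stage, with every stage verifying the next before handing over control, which is the chain-of-trust structure most secure-boot schemes follow.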
These kinds of best practices need to be standardized for chip design, similar to the emergence of design for test and design for manufacturability as requirements.
“There’s been a push for metrics in standards committees,” said Hallman. “You can look at the number of protections you put in, the coverage numbers you can measure, and the number of vulnerabilities that are out there for pieces of code or certain products. So we’re starting to at least quantify some of the pieces that we’re looking at. We’re not there yet. We still need to determine which data means the most, and are we measuring the right things? That’s going to be researched for a while. But at least we’re starting to use more scientific practices when we try to evaluate what we really value as security.”
Fig. 2: Secure chip architecture. Source: DARPA
Heterogeneous challenges
All of this becomes more difficult as chipmakers embrace more customization and heterogeneity. As device scaling becomes more expensive, and the power, performance and area/cost benefits continue to shrink with each new node, architects have begun to package more components together. That creates a new and different set of challenges involving security. Not all of the components are inherently secure, and it’s not always clear which ones have been designed with security in mind because many of these customized accelerators and IP blocks are developed and/or sold as black boxes.
“If you want to go move loaded data around, you want to process that data, you want that data to be secure, you want it to be obviously handled through the right kind of memory management, and all those kinds of things,” said Peter Greenhalgh, vice president of technology at Arm. “Whatever piece of hardware you design relies on multiple other layers underneath. That volume of hardware raises the bar for when you build something to accelerate data. It’s kind of like building bigger castles to manipulate that data. So you’ve got a CPU, GPU, compute accelerator, and you need to make them bigger, with higher performance, and with more flexibility. If you’re going to try to construct lots of different smaller pieces of IP or smaller components to manipulate data in the most efficient way, that might work in an academic environment. But when you get into a consumer environment or commercial environment, you realize you need Linux, virtualization, security, debugging, performance management, etc. Suddenly, all of these bespoke accelerators, which are brilliant because they can manipulate the data and handle it seamlessly, tend to grow and grow. But you’d be better building three or four different castles that are flexible enough to be able to handle all the different ways that I want to do this in the future.”
Jayson Bethurem, product line manager for Xilinx’s cost-optimized portfolio, pointed to similar concerns. “When you’re bringing lots of signal data into a device, and then back out of the device, customers are asking for more multi-level security,” he said. “We have to be able to encrypt the data coming in and encrypt the data going out. We may be able to reprogram your device and make sure it’s coming from an authenticated source. And finally, we need to protect the IP that’s inside this device through cryptography and IP theft protection, with DPA (differential power analysis) resistance and things like that. All the security features that exist in our high-end devices need to be available in a low-cost FPGA.”
The big challenge here is building in flexibility for optimization without sacrificing security, and doing it quickly enough and without burning up too much power. “You want the hardware assurance that the software is behaving correctly,” said Rambus’ Best. “And if that root of trust is in a sometimes-on system, because it’s a mobile system that wakes up when something needs attention, then it creates a security problem. An adversary can always figure out what wakes up a system. And if the system needs to wake up and behave securely, that’s similar to what your phone does. It needs to get itself into a secure execution place. You have to go find the secure state information that got saved off the low-power memory. It’s just sitting there quiet and idle, so that can get quickly loaded, and you can now have a secure environment without going through a full minute of secure boot.”
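The wake-up path Best describes can be illustrated with a short sketch: seal the secure state with an integrity tag before suspending, then verify that tag on resume before trusting the snapshot, falling back to a full secure boot if verification fails. This is a simplified Python illustration with hypothetical names; in real silicon the key lives inside the hardware root of trust and the snapshot would also be encrypted.

```python
import hashlib
import hmac
import json

SOC_KEY = b"hypothetical-on-die-key"  # in hardware, this never leaves the root of trust
TAG_LEN = hashlib.sha256().digest_size  # 32 bytes

def seal_state(state: dict) -> bytes:
    """Before entering low-power mode: serialize the secure state and append
    an integrity tag so tampering while the chip sleeps is detectable."""
    blob = json.dumps(state, sort_keys=True).encode()
    return blob + hmac.new(SOC_KEY, blob, hashlib.sha256).digest()

def restore_state(sealed: bytes) -> dict:
    """On wake: verify the tag before trusting the snapshot. A mismatch means
    the device must fall back to a full secure boot."""
    blob, tag = sealed[:-TAG_LEN], sealed[-TAG_LEN:]
    if not hmac.compare_digest(hmac.new(SOC_KEY, blob, hashlib.sha256).digest(), tag):
        raise ValueError("secure state tampered; full secure boot required")
    return json.loads(blob)

sealed = seal_state({"session_keys_loaded": True, "debug_locked": True})
assert restore_state(sealed) == {"session_keys_loaded": True, "debug_locked": True}
```

Verifying a small sealed snapshot is what lets the device reach a secure environment quickly, rather than spending a full secure-boot cycle on every wake-up.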
AI systems add a whole different level of complexity because none of them behave the same way. In fact, that’s the whole point. These devices are supposed to optimize themselves to whatever tasks they are designed to handle. That makes spotting aberrations in their behavior more difficult, and when problems do occur, it’s extremely difficult to reverse engineer whether an abnormality was due to an unusual data set or a flaw in the training data, or whether the device is responding to malicious code.
“You can embed a behavior in AI that people are not expecting, but that can be triggered by the person who invented it,” said Mike Borza, security IP architect at Synopsys. “You’re introducing this in the training data, which means you’re adjusting the connectivity and the weights between neurons — how those neurons respond to things in their environment. It’s very difficult to understand what training data is going to do. And that has been one of the challenges. We’re looking now at ways to enhance the observability and controllability of it, and to have these devices provide feedback about how they’re making their decisions so that you can diagnose them when they start misbehaving. It’s very easy in this kind of scenario to embed some behavior that can be triggered by the right set of inputs, or the right sequence of inputs, or the right collection of images, and to produce a behavior the adversary wants that is undesirable behavior for the AI itself.”
Extending security forward and backward
The bigger challenge may be the longevity of a hardware design. Security needs to be end-to-end, spanning the physical supply chain, the design-through-manufacturing chain, and the field throughout the projected lifetime of a particular chip or system. That is difficult to do with components manufactured in different regions of the world, often by different companies, and with some of them black-boxed. And it becomes even more difficult if the chip is expected to work according to spec for a decade or two.
Find Ed Sperling’s entire article “Why It’s So Difficult — And Costly — To Secure Chips” here.
Multiple chips arranged in a planar or stacked configuration with an interposer for communication.
2.5D and 3D forms of integration
A memory architecture in which memory cells are designed vertically instead of using a traditional floating gate.
Transistors where source and drain are added as fins of the gate.
Next-generation wireless technology with higher data transfer rates, low latency, and able to support more devices.
We start with schematics and end with ESL
Important events in the history of logic simulation
Early development associated with logic synthesis
Commonly and not-so-commonly used acronyms.
Sensing and processing to make driving safer.
At newer nodes, more intelligence is required in fill because it can affect timing, signal integrity and require fill for all layers.
A collection of approaches for combining chips into packages, resulting in lower power and lower cost.
An approach to software development focusing on continual delivery and flexibility to changing requirements
How Agile applies to the development of hardware systems
A way of improving the insulation between various components in a semiconductor by creating empty space.
A collection of intelligent electronic environments.
The theoretical speedup when adding processors is always limited by the part of the task that cannot benefit from the improvement.
Semiconductors that measure real-world conditions
Analog integrated circuits are integrated circuits that make a representation of continuous signals in electrical form.
The design and verification of analog components.
A software tool used in software programming that abstracts all the programming steps into a user interface for the developer.
A custom, purpose-built integrated circuit made for a specific task or product.
An IC created and optimized for a market and sold to multiple companies.
Using machines to make decisions based upon stored knowledge and sensory input.
Code that looks for violations of a property
A method of measuring the surface structures down to the angstrom level.
A method of depositing materials and films in exact places on a surface.
ALE is a next-generation etch technology to selectively and precisely remove targeted materials at the atomic scale.
The generation of tests that can be used for functional or manufacturing verification
Issues dealing with the development of automotive electronics.
Electronic systems in the vehicles are networked in different architectures types.
Time sensitive networking puts real time into automotive Ethernet.
Noise in reverse biased junctions
Verification methodology created by Mentor
IC manufacturing processes where interconnects are made.
Devices that chemically store energy.
Transformation of a design described in a high-level of abstraction to RTL
Security based on scans of fingerprints, palms, faces, eyes, DNA or movement.
A reverse force to electromigration.
Also known as Bluetooth 4.0, an extension of the short-range wireless protocol for low energy applications.
Transistor model
On-chip logic to test a design.
Chiplet interconnect specification.
Interface model between testbench and device under test
C, C++ are sometimes used in design of integrated circuits because they offer higher abstraction.
Interconnect standard which provides cache coherency for accelerators and memory expansion peripheral devices connecting to processors.
Automotive bus developed by Bosch
CD-SEM, or critical-dimension scanning electron microscope, is a tool for measuring feature dimensions on a photomask.
Making CDC interfaces predictable
Fault model for faults within cells
Cell-aware test methodology for addressing defect mechanisms specific to FinFETs.
The CPU is an dedicated integrated circuit or IP core that processes logic and math.
A lab that wrks with R&D organizations and fabs involved in the early analytical work for next-generation devices, packages and materials.
Testbench component that verifies results
A process used to develop thin films and polymer coatings.
Design is the process of producing an implementation from a conceptual form
The design, verification, implementation and test of electronics systems into integrated circuits.
Exchange of thermal design information for 3D ICs
A discrete unpackaged die that can be assembled into a package with other chiplets.
Asynchronous communications across boundaries
Dynamic power reduction by gating the clock
Design of clock trees for power reduction
The cloud is a collection of servers that run Internet software you can use on your device or computer.
Fabrication technology
Cobalt is a ferromagnetic metal key to lithium-ion batteries.
Metrics related to about of code executed in functional verification
Verify functionality between registers remains unchanged after a transformation
The plumbing on chip, among chips and between devices, that sends bits of data and manages that data.
Faster form for logic simulation
Complementary FET, a new type of vertical transistor.
Combinations of semiconductor materials.
Interconnect between CPU and accelerators.
The structure that connects a transistor with the first layer of copper interconnects.
A technique for computer vision based on machine learning.
Completion metrics for functional verification
Interference between signals
Crypto processors are specialized processors that execute cryptographic algorithms within hardware.
Companies supplying IP or IP services
A method of conserving power in ICs by powering down segments of a chip when they are not in use.
Data analytics uses AI and ML to find patterns in data to improve processes in EDA and semi manufacturing.
How semiconductors are sorted and tested before and after implementation of the chip in a system.
A data center is a physical building or room that houses multiple servers with CPUs for remote data storage and processing.
Data processing is when raw data has operands applied to it via a computer or server to process data into another useable form. This definition category includes how and where the data is processed.
A standard that comes about because of widespread acceptance or adoption.
The removal of bugs from a design
Deep learning is a subset of artificial intelligence where data representation is based on multiple layers of a matrix.
An observation that as features shrink, so does power consumption.
Actions taken during the physical design stage of IC development to ensure that the design can be accurately manufactured.
Techniques that reduce the difficulty and cost associated with testing an integrated circuit.
Protection for the ornamental design of an item
A physical design process to determine if chip satisfies rules defined by the semiconductor manufacturer
Locating design rules using pattern matching techniques.
Sources of noise in devices
Insertion of test logic for clock-gating
A wide-bandgap synthetic material.
Categorization of digital IP
Allowed an image to be saved digitally
A digital signal processor is a processor optimized to process signals.
A digital representation of a product or system.
A complementary lithography technology.
DNA analysis is based upon unique DNA sequencing.
Using deoxyribonucleic acid to make chips hacker-proof.
A patterning technique using multiple passes of a laser.
Colored and colorless flows for double patterning
Single transistor memory that requires refresh.
Dynamically adjusting voltage and frequency for power reduction
Hardware Verification Language
A slower method for finding smaller defects.
Lithography using a single beam e-beam tool
The difference between the intended and the printed features of an IC layout.
Electromigration (EM) due to power densities
Electronic Design Automation (EDA) is the industry that commercializes the tools, methodologies and flows associated with the fabrication of electronic systems.
Levels of abstraction higher than RTL used for design and verification
Transfer of electrostatic charge.
An eFPGA is an IP core integrated into an ASIC or SoC that offers the flexibility of programmable logic without the cost of FPGAs.
Special purpose hardware used for logic verification
Capturing energy from the environment
Noise caused by the environment
A method for growing or depositing mono crystalline films on a substrate.
Programmable Read Only Memory that was bulk erasable.
Reuse methodology based on the e language
Methods for detecting and correcting errors.
Ethernet is a reliable, open standard for connecting devices by wire.
EUV lithography is a soft X-ray technology.
Finding out what went wrong in semiconductor design and manufacturing.
A way of including more features that normally would be on a printed circuit board inside a package.
Evaluation of a design under the presence of manufacturing defects
The lowest power form of small cells, used for home WiFi networks.
Ferroelectric FET is a new type of memory.
Reprogrammable logic device
The use of metal fill to improve planarity and to manage electrochemical deposition (ECD), etch, lithography, stress effects, and rapid thermal annealing.
A three-dimensional transistor.
non-volatile, erasable memory
Integrated circuits on a flexible substrate
An automotive communications protocol
Noise related to resistance fluctuation
A type of interconnect using solder balls or microbumps.
A transistor type with integrated nFET and pFET.
Formal verification involves a mathematical proof to show that a design adheres to a property
A company that specializes in manufacturing semiconductor devices.
FD-SOI is a semiconductor substrate material with lower current leakage compared than bulk CMOS.
Coverage metric used to indicate progress in verifying functionality
Functional Design and Verification is currently associated with all design and verification functions performed before RTL synthesis.
Functional verification is used to determine if a design, or unit of a design, conforms to its specification.
A statistical method for determining if a test system is production ready by measuring variation during test for repeatability and reproducibility.
GaN is a III-V material with a wide bandgap.
A transistor design with a gate is placed on all four sides of the channel.
Power reduction techniques available at the gate level.
noise related to generation-recombination
A neural network framework that can generate new data.
Germany is known for its automotive industry and industrial machinery.
2D form of carbon in a hexagonal lattice.
An electronic circuit designed to handle graphics and video.
Adding extra circuits or software into a design to ensure that if one part doesn't work the entire system doesn't fail.
Fully designed hardware IP block
Use of special purpose hardware to accelerate verification
Historical solution that used real chips in the simulation process
Optimizing the design by using a single language to describe hardware and software.
Power creates heat and heat affects power
The process of integrating different chips, chiplets, and chip components into packages.
A dense, stacked version of memory with high-speed interfaces that can be used in advanced packaging.
An umbrella term (circa 2015) for advanced packaging in semiconductors.
Synthesis technology that transforms an untimed behavioral description into RTL
Defines a set of functionality and features for HSA hardware
HSAIL Virtual ISA and Programming Model, Compiler Writer, and Object Format (BRIG)
Runtime capabilities for the HSA architecture
Combines use of a public cloud service with a private cloud, such as a company's internal enterprise servers or data centers.
A data center facility owned by the company that offers cloud services through that data center.
What are the types of integrated circuits?
Hardware Description Language
Analog extensions to VHDL
A collection of VHDL 1076.1 packages
Modeling of macro-cells in VHDL
Boundry Scan Test
IEEE ratified version of Verilog
Standard for Verilog Register Transfer Level Synthesis
Extension to 1149.1 for complex device programming
Functional verification language
SystemC
Standard for integration of IP in System-on-Chip
IEEE Standard for Access and Control of Instrumentation Embedded within a Semiconductor Device
IEEE ratified version of SystemVerilog
Universal Verification Methodology
IEEE Standard for Design and Verification of Low-Power Integrated Circuits also known by its Accellera name of Unified Power Format (UPF)
Standard for Test Access Architecture for Three-Dimensional Stacked Integrated Circuits
Verification language based on formal specification of behavior
IEEE 802.1 is the standard and working group for higher layer LAN protocols.
IEEE 802.11 working group manages the standards for wireless local area networks (LANs).
IEEE 802.15 is the working group for Wireless Specialty Networks (WSN), which are used in IoT, wearables and autonomous vehicles.
"RR-TAG" is a technical advisory group supporting IEEE standards groups working on 802.11, 802.12, 802.16, 802.20, 802.21, and 802.22.
Standards for coexistence between wireless standards of unlicensed devices.
Enables broadband wireless access using cognitive radio technology and spectrum sharing in white spaces.
IEEE 802.3-Ethernet working group manages the IEEE 802.3-Ethernet standards.
Standard for Unified Hardware Abstraction and Layer for Energy Proportional Electronic Systems
Power Modeling Standard for Enabling System Level Analysis
Specific requirements and special consideration for the Internet of Things within an Industrial setting.
Wafer costs across nodes
Power optimization techniques for physical implementation
Performing functions directly in the fabric of memory.
Thermal noise within a channel
A set of basic operations a computer must support.
IGBTs are combinations of MOSFETs and bipolar transistors.
Integration of multiple devices onto a single piece of semiconductor
A semiconductor company that designs, manufactures, and sells integrated circuits (ICs).
A design or verification unit that is pre-packed and available for licensing.
Networks that can analyze operating conditions and reconfigure in real time.
Method to ascertain the validity of one or more claims of a patent
Buses, NoCs and other forms of connection between various elements in an integrated circuit.
Also known as the Internet of Everything, or IoE, the Internet of Things is a global application where devices can connect to a host of other devices, each either providing data from sensors, or containing actuators that can control some function. Data can be consolidated and processed on mass in the Cloud.
Fast, low-power inter-die conduits for 2.5D electrical signals.
Finding ideal shapes to use on a photomask.
Injection of critical dopants during the semiconductor manufacturing process.
Separate electronic devices using Internet or other connections to communicate among the devices. Usually sensors or actuators are sending data to a computing hub.
Standard for integration of IP in System-on-Chip
The voltage drop when current flows through a resistor.
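This series-resistance relationship can be sketched in a few lines of Python (the function name and the example rail values are illustrative, not from any real power-delivery model):

```python
# Toy illustration of IR drop: the voltage reaching a logic block is
# the supply voltage minus I*R lost in the power delivery network.
def ir_drop(supply_v, current_a, rail_resistance_ohm):
    """Return (drop, voltage_at_load) for a simple series rail model."""
    drop = current_a * rail_resistance_ohm  # Ohm's law: V = I * R
    return drop, supply_v - drop

# A 2 A load through a 10 milliohm rail loses ~0.02 V,
# so ~0.88 V of a 0.9 V supply reaches the load.
drop, v_load = ir_drop(supply_v=0.9, current_a=2.0, rail_resistance_ohm=0.01)
```

Even tens of millivolts matter at advanced nodes, where nominal supplies are below 1 V.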
Terminology in ISO 26262
Standard related to the safety of electrical and electronic systems within a car
Standard to ensure proper operation of automotive situational awareness systems.
A standard (under development) for automotive cybersecurity.
The energy efficiency of computers doubles roughly every 18 months.
Languages are used to create models
Influential theories that are often referred to as "laws" and discussed in trade publications, research literature, and conference presentations as truisms that eventually reach their limits.
Device and connectivity comparisons between the layout and the schematic
Cells used to match voltages across voltage islands
Measuring the distance to an object with pulsed lasers.
Low-cost automotive bus.
Deviation of a feature edge from ideal shape.
Removal of non-portable or suspicious code
LELE is a form of double patterning
A type of double patterning.
Light used to transfer a pattern from a photomask onto a substrate.
Coefficient related to the difficulty of the lithography process
Correctly sizing logic elements
Restructuring of logic for power reduction
A simulator is a software process used to execute a model of hardware
Methodologies used to reduce power consumption.
Verification of power circuitry
A technical standard for electrical characteristics of a low-power differential, serial communication protocol.
An approach in which machines are trained to favor basic behaviors and outcomes rather than explicitly programmed to do certain tasks. That results in optimization of both hardware and software to achieve a predictable range of results.
Uses magnetic properties to store data
Observation related to the amount of custom and standard content in electronics.
Tracking a wafer through the fab.
Noise sources in manufacturing
Semiconductor materials enable electronic circuits to be constructed.
A semiconductor device capable of retaining state information for a defined period of time.
Use of multiple memory banks for power reduction
Microelectromechanical Systems are a fusion of electrical and mechanical engineering and are typically used for sensors and for advanced microphones and even speakers.
A key tool for LED production.
Artificial materials containing arrays of metal nanostructures or meta-atoms.
Unstable state within a latch
Observation that a network's value is proportional to the square of its number of users.
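The quadratic scaling can be shown with a toy Python sketch (the proportionality constant `k` is a free parameter, not part of the original observation):

```python
# Toy sketch of Metcalfe's law: a network's value grows with the
# square of its number of users.
def metcalfe_value(users, k=1.0):
    return k * users ** 2

# Doubling the user base quadruples the estimated value.
assert metcalfe_value(200) / metcalfe_value(100) == 4.0
```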
Describes the process to create a product
Metrology is the science of measuring and characterizing tiny structures and materials.
A type of processor that traditionally was a scaled-down, all-in-one embedded processor, memory and I/O for use in very specific operations.
The integrated circuit that first put a central processing unit on one chip of silicon.
The integration of analog and digital.
Models are abstractions of devices
A midrange packaging option that offers lower density than fan-outs.
A way of stacking transistors inside a single chip instead of a package.
Observation related to the growth of semiconductors by Gordon Moore.
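The exponential form of the observation can be sketched in Python (the two-year doubling period is the commonly quoted figure; the function name is illustrative):

```python
# Moore's observation: transistor counts double roughly every two years.
def transistor_count(initial, years, doubling_period=2.0):
    return initial * 2 ** (years / doubling_period)

# Ten years at a two-year doubling period gives 2**5 = 32x growth.
assert transistor_count(1_000_000, 10) == 32_000_000
```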
A mote is a micro-sensor.
An advanced form of e-beam lithography
An early approach to bundling multiple functions into a single package.
Increasing numbers of corners complicate analysis. Concurrent analysis holds promise.
Using a tester to test multiple dies at the same time.
Use of multi-threshold voltage devices
When a signal is received via different paths and dispersed over time.
A way to image IC designs at 20nm and below.
A durable and conductive material of two-dimensional inorganic compounds in thin atomic layers.
A hot embossing process type of lithography.
A type of field-effect transistor that uses wider and thicker wires than a lateral nanowire.
Optimizing power by operating circuits near or below the transistor threshold voltage.
Moving compute closer to memory to reduce access costs.
NBTI is a shift in threshold voltage with applied stress.
An in-chip network, often in a SoC, that connects IP blocks and components and routes data packets among them.
A method of collecting data from the physical world that mimics the human brain.
A compute architecture modeled on the human brain.
Nodes in semiconductor manufacturing indicate the features that node production line can create on an integrated circuit, such as interconnect pitch, transistor density, transistor type, and other new technology.
Random fluctuations in voltage or current on a signal.
Programmable Read Only Memory (PROM) and One-Time-Programmable (OTP) Memory can be written to once.
OSI model describes the main data handoffs in a network.
Verification methodology created from URM and AVM
Disabling datapath computation when not enabled
Method used to find defects on a wafer.
A way to improve wafer printability by modifying mask patterns.
The company that buys raw goods, including electronics and chips, to make a product.
Companies that perform IC packaging and testing, often referred to as OSATs.
The ability of a lithography scanner to align and print various layers accurately on top of each other.
How semiconductors get assembled and packaged.
A high-speed signal encoding technique.
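Assuming the entry refers to four-level pulse-amplitude modulation (PAM-4), a common scheme in high-speed SerDes links, the level mapping can be sketched in Python (the Gray-coded mapping shown is typical but not the only option):

```python
# PAM-4 sketch: each symbol carries two bits as one of four amplitude
# levels, doubling throughput per symbol versus two-level signaling.
GRAY_LEVELS = {(0, 0): -3, (0, 1): -1, (1, 1): 1, (1, 0): 3}

def pam4_encode(bits):
    """Map an even-length bit list to a list of PAM-4 levels."""
    assert len(bits) % 2 == 0
    return [GRAY_LEVELS[(bits[i], bits[i + 1])]
            for i in range(0, len(bits), 2)]

assert pam4_encode([0, 0, 1, 0, 1, 1]) == [-3, 3, 1]
```

Gray coding keeps adjacent levels one bit apart, so a single-level decision error corrupts only one bit.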
Outlier detection for a single measurement, a requirement for automotive electronics.
A patent is an intellectual property right granted to an inventor
High-speed serial expansion bus for connecting devices and sending data between them.
A thin membrane that prevents a photomask from being contaminated.
Memory that stores information in the amorphous and crystalline phases.
A template of what will be printed on a wafer.
Light-sensitive material used to form a pattern on the substrate.
Design and implementation of a chip that takes physical placement, routing and artifacts of those into consideration.
Physically connects devices and is the conduit that encodes, decodes bits of data.
PVD is a deposition method that involves high-temperature vacuum evaporation and sputtering.
Making sure a design layout works as intended.
A set of unique features that can be built into a chip but not cloned.
A small cell that is slightly higher in power than a femtocell.
Lowering capacitive loads on logic
An algorithm used in ATPG.
A verification standard defined by Accellera; PSS is used to model verification intent in semiconductor design.
Components of power consumption
Power domain shutdown and startup
Definitions of terms related to power
Moving power around a device.
How power consumption is estimated.
Reducing power by turning off parts of a design
Special flop or latch used to retain the state of the cell when its main power supply is shut off.
Addition of isolation cells around power islands
Power reduction at the architectural level
Ensuring power control circuitry is fully verified
An integrated circuit that manages the power in an electronic device or module, including any device that has a battery that gets recharged.
A power semiconductor used to control and convert electric power.
A power IC is used as a switch or rectifier in high voltage power applications.
Noise transmitted through the power delivery network
Controlling power for power shutoff
Techniques that analyze and optimize power in a design
Test considerations for low-power circuitry
Fundamental tradeoffs made in semiconductor design for power, performance and area.
The design, verification, assembly and test of printed circuit boards
Data centers and IT infrastructure for data storage and computing that a company owns or subscribes to for use only by that company.
Power optimization techniques at the process level.
Variability in the semiconductor manufacturing process
A measurement of the amount of time processor core(s) are actively in use.
An integrated circuit or part of an IC that does logic and math processing.
Verification language based on formal specification of behavior
Data storage and computing done in a data center, through a service offered by a cloud service provider, and accessed on the public Internet.
A different way of processing data using qubits.
RF SOI is the RF version of silicon-on-insulator (SOI) technology.
Random trapping of charge carriers
The process of rapidly heating wafers.
Critical metals used in electronics.
Read Only Memory (ROM) can be read from but cannot be written to.
An artificial neural network that finds patterns in data using other data stored in memory.
Copper metal interconnects that electrically connect one part of a package to another.
Design verification that helps ensure the robustness of a design and reduce susceptibility to premature or catastrophic electrical failures.
Materials used to manufacture ReRAMs
Memory utilizing resistive hysteresis
Synonymous with photomask.
A proposed test data standard aimed at reducing the burden for test engineers and test operations.
An open-source ISA used in designing integrated circuits at lower cost.
Trusted environment for secure functions.
An abstraction for defining the digital portions of a design
Optimization of power consumption at the Register Transfer Level
A series of requirements that must be met before moving past the RTL phase
Verification methodology based on Vera
Algorithm used to solve problems
Additional logic that connects registers into a shift register or scan chain for increased test efficiency.
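The shift behavior can be modeled with a toy Python sketch (a real scan chain is built from muxed flip-flops; the function and variable names here are illustrative):

```python
# Minimal scan-chain model: flops are connected serially so test
# patterns can be shifted in and captured state shifted out, one
# bit per clock.
def scan_shift(chain, scan_in_bits):
    """Shift bits into the chain; return (new_chain, bits shifted out)."""
    chain = list(chain)
    out = []
    for bit in scan_in_bits:
        out.append(chain[-1])        # last flop drives scan-out
        chain = [bit] + chain[:-1]   # each flop captures its neighbor
    return chain, out

# Shifting [1, 0, 1] into a 3-flop chain holding [0, 1, 1]
# loads the new pattern while the old state appears at scan-out.
state, shifted_out = scan_shift([0, 1, 1], [1, 0, 1])
```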
Mechanism for storing stimulus in testbench
Testbench support for SystemC
A form of double patterning.
Subjects related to the manufacture of semiconductors
Methods and technologies for keeping data safe.
Combining input from multiple sensor types.
An IC that conditions an analog sensor signal and converts to it digital before sending to a microcontroller.
Sensors are a bridge between the analog world we live in and the underlying communications infrastructure.
A transmission system that sends signals over a high-speed connection from a transceiver on one chip to a receiver on another. The transceiver converts parallel data into a serial stream that is re-translated into parallel data on the receiving end.
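The parallel-to-serial round trip can be sketched in Python (clock recovery and line encoding, which a real SerDes also handles, are omitted; names are illustrative):

```python
# Toy SerDes: flatten parallel words into a bit stream (LSB first)
# and reassemble them on the receiving end.
def serialize(words, width=8):
    return [(w >> i) & 1 for w in words for i in range(width)]

def deserialize(bits, width=8):
    return [sum(b << i for i, b in enumerate(bits[j:j + width]))
            for j in range(0, len(bits), width)]

# Round trip: the receiver recovers the original parallel words.
data = [0xA5, 0x3C]
assert deserialize(serialize(data)) == data
```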
In semiconductor development flow, tasks once performed sequentially must now be done concurrently.
Sweeping a test condition parameter through a range and obtaining a plot of the results.
When channel lengths are the same order of magnitude as depletion-layer widths of the source and drain, they cause a number of issues that affect design.
Quantization noise
A class of attacks on a device and its contents by analyzing information using different access methods.
Undetected errors in data output from an integrated circuit.
A wide-bandgap technology used for FETs and MOSFETs for power transistors.
The integration of photonic devices into silicon
A simulator exercises a model of hardware.
Special purpose hardware used to accelerate the simulation process.
Disturbance in ground voltage
Single transistor DRAM
Wireless cells that fill in the voids in wireless infrastructure.
Synthesizable IP block
Verification methodology utilizing embedded processors
Defines an architecture description useful for software design
Circuit simulator first developed in the 1970s.
A type of neural network that attempts to more closely model the brain.
A type of MRAM with separate paths for write and read.
A secure method of transmitting data wirelessly.
A patent that has been deemed necessary to implement a standard.
The most commonly used data format for semiconductor test information.
Standards are important in any industry.
SRAM is a volatile memory that does not require refresh
Constraints on the input to guide random generation process
Random variations that cause defects on chips during EUV lithography.
An advanced type of MRAM
Use of Substrate Biasing
Coupling through the substrate.
Network switches route data packet traffic inside the network.
Type of DRAM with faster transfer
A method for bundling multiple ICs to work together as a single chip.
A system on chip (SoC) is the integration of functions necessary to implement an electronic system onto a single substrate and contains at least one processor
A class library built on top of the C++ language used for modeling hardware
Analog and mixed-signal extensions to SystemC
Industry standard design and verification language
Google-designed ASIC processing unit for machine learning that works with the TensorFlow ecosystem.
Software used to functionally verify a design
Noise related to heat
Through-Silicon Vias are a technology to connect various die in a stacked die configuration.
Basic building block for both analog and digital integrated circuits.
Minimizing switching times
A multi-patterning technique that will be required at 10nm and below.
A type of transistor under development that could replace finFETs in future process technologies.
Standard for safety analysis and evaluation of autonomous vehicles.
The Unified Coverage Interoperability Standard (UCIS) provides an application programming interface (API) that enables the sharing of coverage data across software simulators, hardware accelerators, symbolic simulations, formal tools or custom verification tools.
Accellera Unified Power Format (UPF)
Die-to-die interconnect specification.
Verification methodology
SystemVerilog version of eRM
A user interface is the conduit a human uses to communicate with an electronic device.
Patent to protect an invention
Hardware Verification Language
A pre-packaged set of code used for verification.
A standardized way to verify integrated circuit designs.
A document that defines what functional verification is going to be performed
Hardware Description Language in use since 1984
Procedural access to Verilog objects
Analog extensions to Verilog
Hardware Description Language
An abstract model of a hardware system enabling early software execution.
Verification methodology built by Synopsys
Using voice/speech for device command and control.
Memory that loses storage abilities when power is removed.
Use of multiple voltages for power reduction
The basic architecture for most computing today, based on the principle that data needs to move back and forth between a processor and memory.
Verifying and testing the dies on the wafer after the manufacturing.
The science of finding defects on a silicon wafer.
A brand name for a group of wireless networking protocols and technologies.
3D memory interface standard
Creating interconnects between IC and package using a thin wire.
Wired communication, which passes data through wires between devices, is still considered the most stable form of communication.
A way of moving data without wires.
IC interconnect architecture
Propagation of unknown (X) values through a design, which causes verification problems.
A data-driven system for monitoring and improving IC yield and reliability.
A vulnerability in a product’s hardware or software discovered by researchers or attackers that the producing company does not know about and therefore does not have a fix for yet.