As logic devices became more complex, it took increasing amounts of time and effort to manually create and validate tests, it became too hard to determine test coverage, and the tests took too long to run. This manual approach is referred to as functional test. So the industry moved to a design-for-test (DFT) approach, in which the design is modified to make it easier to test. The approach that came to dominate IC test is called structural, or “scan,” test because it involves scanning test patterns into internal circuits within the device under test (DUT). The design’s flip-flops are modified to allow them to function as stimulus and observation points, or “scan cells,” during test, while performing their intended functional role during normal operation.
The modified flip-flops, or scan cells, allow the overall design to be viewed as many small segments of combinational logic that can be more easily tested. For a design with a million flip-flops, introducing scan cells is like adding a million control and observation points. Segmenting the logic in this manner is what makes it feasible to automatically generate test patterns that exercise the logic between the flops. The test software doesn’t need to understand the function of the logic; it just tries to exercise the logic segments observed by each scan cell. Because scan test modifies flip-flops that are already in the design so they can also act as scan cells, the impact of the test circuitry is relatively small, typically adding only about 1% to 5% to the total gate count.
The scan cells are linked together into “scan chains” that operate like large shift registers when the circuit is put into test mode. The scan chains are used by external automatic test equipment (ATE) to deliver test pattern data from its memory into the device. After a test pattern is loaded, the design is placed back into functional mode and the test response is captured in one or more clock cycles. The design is then put back into test mode and the captured response is shifted out, while the next test pattern is simultaneously shifted into the scan cells. The ATE then compares the captured test response with the expected response data stored in its memory. Any mismatches are likely defects and are logged for further evaluation.
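To make the shift/capture sequence concrete, here is a minimal Python sketch of the idea, not any vendor’s tool flow: a single scan chain is modeled as a shift register, a pattern is shifted in, one functional capture cycle is applied, and the response is shifted out while the next pattern shifts in. The toy “combinational logic” and all names are illustrative assumptions.

```python
# Minimal, illustrative model of one scan chain in shift/capture mode.
# The "combinational logic" and all names here are hypothetical.

def capture(chain_state):
    """Toy functional capture: each cell captures the XOR of its
    neighbors' values (stands in for the real logic cloud)."""
    n = len(chain_state)
    return [chain_state[(i - 1) % n] ^ chain_state[(i + 1) % n] for i in range(n)]

def shift_cycle(chain, scan_in_bit):
    """One shift clock: scan-in enters the first cell, the last cell's
    value appears on scan-out."""
    scan_out_bit = chain[-1]
    return [scan_in_bit] + chain[:-1], scan_out_bit

def apply_pattern(chain, next_pattern):
    """Shift out the previous response while shifting in next_pattern,
    then apply one functional capture cycle."""
    response = []
    for bit in next_pattern:           # test mode: scan_enable = 1
        chain, out_bit = shift_cycle(chain, bit)
        response.append(out_bit)
    chain = capture(chain)             # functional mode: scan_enable = 0
    return chain, response

# Example: a 4-cell chain, two patterns back to back.
chain = [0, 0, 0, 0]
chain, _ = apply_pattern(chain, [1, 0, 1, 1])     # load first pattern, capture
chain, resp = apply_pattern(chain, [0, 1, 1, 0])  # unload response, load next
print("captured response of pattern 1:", resp)
```

On the ATE, the shifted-out response would be compared bit-for-bit against the expected response stored in tester memory.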
Not all designs connect every register into a scan chain; this is called partial scan.
To enable automatic test pattern generation (ATPG) software to create the test patterns, fault models are defined that predict the expected behavior (response) of the IC when defects are present. The ATPG tool then uses the fault models to determine the patterns required to detect those faults at all points in the circuit (or almost all; coverage of 95% or more is typical). A number of different fault models are in common use.
Stuck-At Test
The most basic and common is the “stuck-at” fault model, which checks each node location in the design for both stuck-at-1 and stuck-at-0 logic behavior. For example, if a defect shorts an input pin of a NAND gate to ground (logic value 0), the stuck-at-0 test for that node would catch it. The stuck-at model can also detect other defect types, such as bridges between two nets or nodes. The stuck-at model is classified as a static model because it is a slow-speed test and is not dependent on gate timing (rise and fall times and propagation delay).
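As a rough illustration of the model (a toy example, not an ATPG algorithm), the Python sketch below injects a stuck-at-0 fault on one input of a 2-input NAND gate and finds the input pattern that exposes it by comparing good and faulty responses. All names are hypothetical.

```python
# Toy stuck-at fault check on a 2-input NAND gate (illustrative only).

def nand(a, b):
    return 0 if (a and b) else 1

def nand_with_fault(a, b, stuck_pin=None, stuck_value=None):
    """Evaluate the NAND gate with an optional stuck-at fault on one input."""
    if stuck_pin == "a":
        a = stuck_value
    elif stuck_pin == "b":
        b = stuck_value
    return nand(a, b)

# Exhaustively compare good vs. faulty behavior for input 'a' stuck-at-0.
for a in (0, 1):
    for b in (0, 1):
        good = nand(a, b)
        faulty = nand_with_fault(a, b, stuck_pin="a", stuck_value=0)
        if good != faulty:
            print(f"pattern a={a}, b={b} detects a stuck-at-0 "
                  f"(good={good}, faulty={faulty})")
```

Only the pattern a=1, b=1 distinguishes the faulty gate from the good one, which is exactly the kind of pattern an ATPG tool would generate for that fault.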
At-Speed Test
A second common type of fault model is the “transition” or “at-speed” fault model. It is a dynamic fault model, i.e., it detects problems with timing. It is similar to the stuck-at model in that there are two faults for every node location in the design, classified as slow-to-rise and slow-to-fall faults. The transition fault model uses a test pattern that creates a transition stimulus to change the logic value either from 0 to 1 or from 1 to 0. The time allowed for the transition is specified, so if the transition does not happen, or happens outside the allotted time, a timing defect is presumed.
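For intuition only, this Python sketch models a slow-to-rise fault: a two-vector pattern launches a 0-to-1 transition and the result is captured after one clock period, so a transition that arrives late is flagged. The clock period and delay values are illustrative assumptions.

```python
# Toy slow-to-rise check: launch a 0->1 transition and capture after one
# clock period. Delay values and names are hypothetical.

CLOCK_PERIOD_NS = 2.0

def captured_value(start, end, path_delay_ns):
    """Value seen at the capture flop: the new value arrives only if the
    transition completes within the clock period."""
    return end if path_delay_ns <= CLOCK_PERIOD_NS else start

good_delay = 1.6     # healthy path
faulty_delay = 2.4   # same path with a small extra delay (e.g., resistive via)

launch, expected = 0, 1   # the 0->1 transition pattern

print("good device captures:", captured_value(launch, expected, good_delay))    # 1
print("slow device captures:", captured_value(launch, expected, faulty_delay))  # 0 -> slow-to-rise fault
```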
Path Delay Test
The “path delay” model is also dynamic and performs at-speed tests on targeted timing-critical paths. While stuck-at and transition fault models usually address all the nodes in the design, the path delay model only tests the exact paths specified by the engineer, who runs static timing analysis to determine which paths are the most critical. These paths are specified to the ATPG tool for creating the path delay test patterns. The theory is that if the most timing-critical paths pass the tests, then all the other paths, which have greater slack, should have no timing problems. In a way, path delay testing is a form of process check (e.g., showing timing errors if a process variable strays too far), in addition to a test for manufacturing defects on individual devices.
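The selection step can be pictured with a small Python sketch: given hypothetical static timing analysis results, the paths with the least slack are the ones handed to path-delay ATPG. Path names, slack values, and the threshold are illustrative.

```python
# Illustrative selection of timing-critical paths from (hypothetical) STA
# results; the least-slack paths become path-delay ATPG targets.

sta_paths = [
    {"name": "core/alu/p12", "slack_ns": 0.05},
    {"name": "core/mul/p3",  "slack_ns": 0.40},
    {"name": "io/uart/p7",   "slack_ns": 1.20},
    {"name": "core/alu/p19", "slack_ns": 0.08},
]

THRESHOLD_NS = 0.10  # keep only paths with slack below this (assumed) limit

critical = sorted((p for p in sta_paths if p["slack_ns"] < THRESHOLD_NS),
                  key=lambda p: p["slack_ns"])

for p in critical:
    print(f"target for path-delay test: {p['name']} (slack {p['slack_ns']} ns)")
```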
IDDQ Test
The IDDQ test relies on measuring the supply current (IDD) in the quiescent state (when the circuit is not switching and inputs are held at static values). Test patterns are used to place the DUT in a variety of selected states. By performing current measurements in each of these static states, the presence of defects that draw excess current can be detected. The value of IDDQ testing is that many types of faults can be detected with very few patterns. The drawback is the additional test time needed to perform the current measurements.
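A minimal sketch of the screening step, with made-up current values and limits (a real flow uses ATE current measurements and production limits), shows how a state that draws excess quiescent current stands out:

```python
# Toy IDDQ screen: measure quiescent supply current in several static
# states and flag any state whose current exceeds a limit.
# All values here are hypothetical.

IDDQ_LIMIT_UA = 5.0   # assumed pass/fail limit in microamps

measurements_ua = {   # quiescent current for each pattern-selected state
    "state_0": 0.8,
    "state_1": 1.1,
    "state_2": 42.0,  # a defect drawing excess current would look like this
    "state_3": 0.9,
}

for state, iddq in measurements_ua.items():
    verdict = "FAIL" if iddq > IDDQ_LIMIT_UA else "pass"
    print(f"{state}: {iddq:5.1f} uA  {verdict}")
```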
Toggle Test
Toggle fault testing ensures that a node can be driven to both a logical 0 and a logical 1 value, and indicates the extent of your control over circuit nodes. Because the toggle fault model is faster and requires less overhead to run than stuck-at fault testing, you can experiment with different circuit configurations and get a quick indication of how much control you have over your circuit nodes. Because the toggle fault model only excites fault sites and does not propagate the responses to capture points, it cannot be used for defect detection. This fault model is sometimes used for burn-in testing to cause high activity in the circuit.
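Because toggle coverage is simply “has each node been seen at both 0 and 1,” it can be sketched in a few lines of Python. The node names and observed values below are illustrative, not from any particular simulator.

```python
# Toy toggle-coverage computation: a node counts as toggled once it has
# been observed at both 0 and 1 across the simulated patterns.

simulated_values = {          # node -> set of values seen across all patterns
    "n1": {0, 1},
    "n2": {1},                # never driven to 0
    "n3": {0, 1},
    "n4": {0},                # never driven to 1
}

toggled = [n for n, vals in simulated_values.items() if vals == {0, 1}]
coverage = 100.0 * len(toggled) / len(simulated_values)
print(f"toggle coverage: {coverage:.1f}% "
      f"({len(toggled)}/{len(simulated_values)} nodes)")
```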
N-Detect and Embedded Multiple Detect (EMD)
The basic idea of n-detect (or multi-detect) is to target each fault multiple times. The way a fault is targeted is varied randomly, as is the fill (the bits that don’t matter for the fault being targeted) in the pattern set. This approach starts with a standard stuck-at or transition pattern set targeting each potential defect in the design. The pattern set is analyzed to see which potential defects are addressed by more than one pattern in the total set. Then additional (different) patterns are generated to specifically target the defects that are detected fewer times than the user-specified minimum threshold. The combined information from all the resulting patterns increases the potential for detecting a bridge defect that might otherwise escape.
Embedded multiple detect (EMD) is a method of improving the multiple-detection properties of a pattern set without increasing the number of patterns in that set. EMD uses the otherwise unspecified (fill, or don’t care) bits of an ATPG pattern to target nodes that have not reached their n-detect target. Standard multiple detect (n-detect) costs additional patterns but also achieves a higher multiple-detection rate than EMD. How much difference there is between EMD and multiple-detect defect detection depends on the particular design’s pattern set and the level of test compression used. EMD is becoming more common because it does not increase the size of the test set and can still produce additional detections.
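The bookkeeping behind n-detect can be illustrated with a short Python sketch: count how many patterns in an existing set detect each fault, then list the faults that still fall short of the chosen target. The fault names, pattern names, and target value are assumptions for illustration.

```python
# Illustrative n-detect bookkeeping over a (hypothetical) fault-simulated
# pattern set.

N_DETECT_TARGET = 3   # assumed user-specified minimum detection count

detections = {        # pattern -> set of faults it detects
    "pat_0": {"f_a", "f_b", "f_c"},
    "pat_1": {"f_a", "f_c"},
    "pat_2": {"f_a", "f_d"},
}

counts = {}
for faults in detections.values():
    for f in faults:
        counts[f] = counts.get(f, 0) + 1

under_detected = {f: c for f, c in counts.items() if c < N_DETECT_TARGET}
print("faults needing additional detections:", under_detected)

# n-detect would generate new patterns for these faults; EMD would instead
# try to cover them using the don't-care bits of the existing patterns.
```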
Deterministic Bridging
The deterministic bridging test utilizes a combination of layout extraction tools and ATPG. Based on a set of geometric rules, the extraction tool creates a list of net pairs that have the potential of bridging. This list is then fault simulated using existing stuck-at and transition patterns to determine which bridge defects can be detected. The net pairs that are not covered by the initial patterns are identified, and then used by the ATPG tool to generate a specific set of test patterns to completely validate that the remaining nets are not bridged.
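The flow can be summarized in a short Python sketch: start from extracted candidate net pairs, remove the pairs already covered when the existing patterns are fault simulated, and hand the remainder to ATPG for dedicated bridge patterns. The net names and covered set are hypothetical.

```python
# Illustrative deterministic-bridging bookkeeping (all data hypothetical).

candidate_pairs = [("netA", "netB"), ("netC", "netD"), ("netE", "netF")]

# Pairs found to be detected when the existing stuck-at/transition
# pattern set is fault simulated against the bridge fault list.
covered_by_existing_patterns = {("netA", "netB")}

remaining = [p for p in candidate_pairs if p not in covered_by_existing_patterns]
print("bridge pairs needing dedicated ATPG patterns:", remaining)
```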
Small-Delay Defects
At design nodes of 180nm and larger, the majority of manufacturing defects are caused by random particles that create bridges or opens. There are very few timing-related defects at these larger design nodes, since manufacturing process variations cause relatively small parametric changes that would affect the design timing. However, at design nodes of 90nm and smaller, the same manufacturing process variations can cause on-chip parametric variations greater than 50%. This creates a situation where timing-related failures are a significant percentage of overall test failures.
One might expect that transition test patterns would find all of the timing defects in the design. This is true most of the time, but some of the smallest delay defects can evade the basic transition test pattern. For example, when a path through vias, gates, and interconnects has a minor resistive open or other parametric issue that causes a delay, the cumulative defect behavior may only be manifested along long paths. A common scenario is where the same via type is used multiple times in the same path, and the vias are formed as resistive vias. To detect this kind of defect, a small delay defect (SDD) test can be performed. Circuit timing and physical layout information is used to guide the test generator to detect faults through the longest paths, improving the ability to detect small delay defects. This ATPG method is often referred to as timing-aware ATPG and is growing in usage for designs that have tight timing margins and high quality requirements. The resulting patterns have a much higher probability of catching small-delay defects when they are present.
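The core idea of timing-aware path selection can be sketched in Python: among the paths that pass through a given fault site, prefer the longest one, so that a small added delay is most likely to violate the clock period. The clock period, path delays, and defect size below are illustrative assumptions.

```python
# Illustrative timing-aware path choice for small-delay defect detection.
# All delays and names are hypothetical.

CLOCK_PERIOD_NS = 2.0
SMALL_DEFECT_DELAY_NS = 0.3   # e.g., a resistive via adding a small delay

paths_through_fault = [
    {"name": "short_path", "nominal_delay_ns": 0.9},
    {"name": "long_path",  "nominal_delay_ns": 1.8},
]

# Launch the transition through the longest path that reaches the fault site.
best = max(paths_through_fault, key=lambda p: p["nominal_delay_ns"])
slack_left = CLOCK_PERIOD_NS - (best["nominal_delay_ns"] + SMALL_DEFECT_DELAY_NS)

print(f"test through {best['name']}: "
      f"{'detects' if slack_left < 0 else 'misses'} "
      f"a {SMALL_DEFECT_DELAY_NS} ns defect")
```

Through the short path the extra 0.3 ns would still meet timing and escape detection; through the long path it pushes the total delay past the clock period and is caught.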
Page contents originally provided by Mentor Graphics Corp.
Recommended reading:
System-on-Chip Test Architectures: Nanometer Design for Testability (Systems on Silicon)