Converting analog to digital creates approximation errors due to quantization noise. Here’s how to minimize the impact.
Analog-to-digital converters (ADCs) are critical components in high-speed, high-resolution applications where an analog or RF signal must be processed, stored, or transported in digital form. ADC performance requirements vary by application and include resolution, dynamic range, linearity, power consumption, speed, bandwidth, SNDR (signal-to-noise and distortion ratio), and ENOB (effective number of bits).
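As a rough illustration of how two of these metrics relate, the widely used conversion ENOB = (SNDR − 1.76 dB) / 6.02 can be written as a small helper. The snippet below is a sketch in plain Python; the function names are illustrative, not drawn from any particular tool.

```python
def enob_from_sndr(sndr_db: float) -> float:
    """Effective number of bits implied by a measured SNDR (in dB),
    using the standard relation ENOB = (SNDR - 1.76) / 6.02."""
    return (sndr_db - 1.76) / 6.02

def ideal_sndr_from_bits(n_bits: int) -> float:
    """Ideal full-scale sine-wave SQNR (in dB) for an N-bit quantizer:
    SQNR = 6.02 * N + 1.76 dB (quantization noise only)."""
    return 6.02 * n_bits + 1.76

# Example: a converter measuring 62 dB SNDR behaves like a ~10-bit ideal ADC,
# while an ideal 12-bit ADC tops out near 74 dB.
print(enob_from_sndr(62.0))        # ≈ 10.0 bits
print(ideal_sndr_from_bits(12))    # ≈ 74.0 dB
```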
Designers face a wide range of challenges in verifying ADC functionality against performance specifications. Converting a continuous analog signal into a discrete digital code yields unavoidable approximation errors due to quantization noise, which designers can minimize by choosing the most appropriate architecture for their application. In nanometer technology nodes, ADC performance is also significantly impacted by device noise, post-layout parasitics, process variability, and device mismatch.
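To see the quantization-noise floor described above in isolation, the sketch below (plain Python with NumPy, using an ideal quantizer and illustrative names rather than any specific ADC architecture) quantizes a near-full-scale sine wave with an ideal N-bit quantizer and compares the measured SQNR against the theoretical 6.02N + 1.76 dB limit.

```python
import numpy as np

def ideal_quantize(signal: np.ndarray, n_bits: int, full_scale: float = 1.0) -> np.ndarray:
    """Ideal mid-tread N-bit quantizer over [-full_scale, +full_scale)."""
    lsb = 2.0 * full_scale / (2 ** n_bits)
    codes = np.round(signal / lsb)
    codes = np.clip(codes, -(2 ** (n_bits - 1)), 2 ** (n_bits - 1) - 1)
    return codes * lsb

n_bits = 10
n_samples = 2 ** 14
t = np.arange(n_samples)
# 127 cycles over 2^14 samples gives coherent sampling, and the prime cycle
# count spreads samples across many quantizer levels. The amplitude sits
# slightly below full scale so the top code is never clipped.
x = 0.99 * np.sin(2.0 * np.pi * 127.0 * t / n_samples)

err = ideal_quantize(x, n_bits) - x
sqnr_db = 10.0 * np.log10(np.mean(x ** 2) / np.mean(err ** 2))

print(f"Simulated SQNR for an ideal {n_bits}-bit quantizer: {sqnr_db:.1f} dB")
print(f"Theoretical 6.02*N + 1.76 limit:                   {6.02 * n_bits + 1.76:.1f} dB")
```

For a 10-bit quantizer the simulated figure lands within a fraction of a dB of the 61.96 dB ideal; real converters fall short of this bound once device noise, mismatch, and distortion are added on top of quantization error.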