True 3D-IC Problems


Placing logic on logic may sound like a small step, but several problems must be overcome to make it a reality. True 3D involves wafers stacked on top of each other in a highly integrated manner. This is very different from 2.5D integration, where logic is placed side-by-side, connected by an interposer. And there are some intermediate solutions today where significant memory is stacked on l... » read more

Designing For In-Circuit Monitors


In every application space the semiconductor ecosystem touches, in-circuit monitors and sensors are playing an increasing role in silicon lifecycle management and in concepts around reliability and resiliency, both during design and in the field. The combination of true system-level design, in/on-chip monitors, and improved data analysis is expected to drastically improve reliability... » read more
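To make the idea concrete, here is a minimal sketch (in Python, with invented sensor names, baselines, and thresholds, not drawn from any particular monitoring product) of how in-field readings from on-chip monitors might be compared against test-time baselines to flag drift:

```python
# Hypothetical sketch: flag drift in on-chip monitor readings relative to
# baselines captured at manufacturing test. All names and values are invented.

BASELINES = {"ring_osc_delay_ps": 100.0, "vmin_margin_mv": 50.0}
DRIFT_LIMIT_PCT = 10.0  # raise an alert if a reading drifts more than 10%

def check_drift(field_readings: dict) -> list:
    """Return alerts for monitors whose in-field readings drift past the limit."""
    alerts = []
    for sensor, baseline in BASELINES.items():
        reading = field_readings.get(sensor)
        if reading is None:
            continue
        drift_pct = abs(reading - baseline) / baseline * 100.0
        if drift_pct > DRIFT_LIMIT_PCT:
            alerts.append(f"{sensor}: {drift_pct:.1f}% drift from baseline")
    return alerts

print(check_drift({"ring_osc_delay_ps": 113.0, "vmin_margin_mv": 51.0}))
# ['ring_osc_delay_ps: 13.0% drift from baseline']
```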

New Standards Push Co-Packaged Optics


Co-packaged optics (CPOs) promise five times the bandwidth of pluggable connections, but the new architecture requires multiple changes to accommodate different applications. The Optical Internetworking Forum (OIF) recently published standards for co-packaged optics, which are the photonic industry’s hope for handling today’s faster Ethernet interfaces, as well as increasing speeds and p... » read more

Meeting The Major Challenges Of Modern Memory Design


Memory lies at the heart of every electronics application, and demand is growing all the time. Users want ever greater capacity, throughput, and reliability. At the same time, time to market (TTM) goals and competitive pressures mandate that memories be developed on ever-shorter project schedules. These requirements put enormous pressure on designers of discrete memory chips, memory dies in 2.5... » read more

Blog Review: April 26


Codasip's Tora Fridholm introduces the NimbleAI project, an effort to design a neuromorphic sensing and processing 3D integrated chip that implements an always-on sensing stage, highly specialized event-driven processing kernels, and neural networks to perform visual inference of selected stimuli using the bare minimum amount of energy. Synopsys' Anjaneya Thakar discusses computational lithog... » read more

Silicon Lifecycle Management Advances With Unified Analytics


On a typical day, a product engineer has completed the requisite wafer sort testing in manufacturing, and the next step is to assemble the resulting good die into their respective packages. While running a series of parametric tests during final test, the engineer encounters yield issues, and the process of finding the source of those issues begins. Luckily, with access to a good d... » read more
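As a rough illustration of what unified analytics can add here, a minimal Python sketch (die IDs, parameters, and results are invented for illustration) joins final-test outcomes back to wafer-sort measurements by die identity, so a failing packaged part can be traced to its behavior on the wafer:

```python
# Hypothetical sketch: trace final-test failures back to wafer-sort data by die ID.
# All die IDs, measurements, and results below are invented.
import pandas as pd

wafer_sort = pd.DataFrame({
    "die_id": ["W1_X3Y7", "W1_X4Y7", "W1_X5Y2"],
    "idd_ua": [120.0, 180.0, 125.0],     # leakage measured at wafer sort
})
final_test = pd.DataFrame({
    "die_id":  ["W1_X3Y7", "W1_X4Y7", "W1_X5Y2"],
    "ft_pass": [True, False, True],      # parametric pass/fail at final test
})

merged = final_test.merge(wafer_sort, on="die_id")
failures = merged[~merged["ft_pass"]]
print(failures[["die_id", "idd_ua"]])   # the failing part also showed high sort leakage
```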

IC Security Issues Grow, Solutions Lag


Experts at the Table: Semiconductor Engineering sat down to talk about the growing chip security threat and what's being done to mitigate it, with Mike Borza, Synopsys scientist; John Hallman, product manager for trust and security at Siemens EDA; Pete Hardee, group director for product management at Cadence; Paul Karazuba, vice president of marketing at Expedera; and Dave Kelf, CEO of Breker V... » read more

Week In Review: Design, Low Power


Cadence rolled out a slew of new products at this week’s CDNLive Silicon Valley, including a new generative AI-powered tool for analog, mixed-signal, RF, and photonics design; an extended collaboration with TSMC and Microsoft to advance a giga-scale physical verification system in the cloud; and a multi-year partnership with the San Francisco 49ers football organization, focused on sust... » read more

Smarter Ways To Manufacture Chips


OSATs and wafer fabs are beginning to invest in Industry 4.0 solutions to improve efficiency and reduce operating costs, but it's a complicated process that involves setting up frameworks to evaluate different options and goals. Semiconductor manufacturing facilities have relied on dedicated automation teams for decades. These teams track and schedule chip production, respond to equi... » read more
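For a sense of the kind of logic such scheduling involves, here is a minimal Python sketch of a classic critical-ratio dispatch rule for picking the next lot to run (lot names, due times, and processing times are invented; real fab dispatchers weigh many more factors):

```python
# Hypothetical sketch: choose the next lot with a critical-ratio dispatch rule.
# Lot names, due times, and processing times are invented for illustration.
from dataclasses import dataclass

@dataclass
class Lot:
    name: str
    hours_to_due: float           # time remaining until the lot is due
    remaining_proc_hours: float   # processing time the lot still needs

def critical_ratio(lot: Lot) -> float:
    # A ratio below 1.0 means the lot cannot finish on time at the current pace.
    return lot.hours_to_due / lot.remaining_proc_hours

queue = [Lot("LOT_A", 40, 30), Lot("LOT_B", 20, 25), Lot("LOT_C", 60, 20)]
next_lot = min(queue, key=critical_ratio)  # most urgent (lowest ratio) first
print(next_lot.name)  # LOT_B
```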

100G Ethernet At The Edge


The amount of data is growing, and so is the need to process it closer to the source. The edge is a middle ground between the cloud and the endpoint, close enough to where data is generated to reduce the time it takes to process that data, yet still powerful enough to analyze that data quickly and send it wherever it is needed. But making all of this work requires faster conduits for that data i... » read more
