Open-Source Verification


Ask different people what open-source verification means and you will get a host of different answers. They range from verification of open-source hardware, to open-source verification infrastructure, to open-source stream generators or reference models, to open-source simulators and formal verification engines. Verification is about reducing risk. "Verification is... » read more

The Race To Much More Advanced Packaging


Momentum is building for copper hybrid bonding, a technology that could pave the way toward next-generation 2.5D and 3D packages. Foundries, equipment vendors, R&D organizations and others are developing copper hybrid bonding, which is a process that stacks and bonds dies using copper-to-copper interconnects in advanced packages. Still in R&D, hybrid bonding for packaging provides mo... » read more

Smaller Nodes, Much Bigger Problems


João Geada, chief technologist at Ansys, sat down with Semiconductor Engineering to talk about device scaling, advanced packaging, increasing complexity and the growing role of AI. What follows are excerpts of that conversation. SE: We've been pushing along Moore's Law for roughly a half-century. What sorts of problems are you seeing now that you didn't see a couple nodes ago? Geada: The... » read more

Integrating FPGA: Comparison Of Chiplets Vs. eFPGA


FPGA is popular in systems for its flexibility and adaptability. Increasingly, it is being used in high-volume applications. As volumes grow, system designers can consider integrating the FPGA into an SoC to reduce cost, reduce power, and/or improve performance. There are two options for integrating FPGA into an SoC: FPGA chiplets, which replace the power-hungry SERDES/PHYs wit... » read more

Rethinking Architectures Based On Power


The newest chips being developed for everything from the cloud to the edge of the network look nothing like designs of even a year or two ago. They are architected for speed, from the throughput of high-speed buses and external interconnects to the customized accelerators and arrays of redundant MACs. But many of these designs have barely scratched the surface when it comes to saving power, which will becom... » read more

Maximizing Value Post-Moore’s Law


When Moore's Law was in full swing, almost every market segment considered moving to the next available node as a primary way to maximize value. But today, each major market segment is looking at different strategies that are more closely aligned with its individual needs. This diversity will end up causing both pain and opportunities in the supply chain. Chip developers must do more with a ... » read more

Enabling Cost-Effective, High-Performance Die-to-Die Connectivity


System advances in accelerated computing platforms such as CPUs, GPUs, and FPGAs, heterogeneous systems on chip (SoCs) for AI acceleration, and high-speed networking/interconnects have all pushed chip integration to unprecedented levels. This requires more complex designs, higher levels of integration, larger die sizes, and adoption of the most advanced geometries as quickly as possible. Facing th... » read more

Power Impact At The Physical Layer Causes Downstream Effects


Data movement is rapidly emerging as one of the top design challenges, and it is being complicated by new chip architectures and physical effects caused by increasing density at advanced nodes and in multi-chip systems. Until the introduction of the latest revs of high-bandwidth memory, as well as GDDR6, memory was considered the next big bottleneck. But other compute bottlenecks have been e... » read more

Next Challenge: Known Good Systems


The leading edge of design is heading toward multi-die/multi-chiplet architectures, and an increasing number of mainstream designs likely will follow as processing moves closer to the edge. This doesn't mean every chipmaker will be designing leading-edge chips, of course. But more devices will have at least some leading-edge logic or will be connected over some advanced interconnect scheme t... » read more

What’s After PAM-4?


[This is part 2 of a 2-part series. Part 1 can be found here.] The future of high-speed physical signaling is uncertain. While PAM-4 remains one of the key standards today, there is widespread debate about whether PAM-8 will succeed it. This has an impact on everything from where the next bottlenecks are likely to emerge and the best approaches to solving them, to how chips, systems and p... » read more
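
To put rough numbers on the PAM-4 vs. PAM-8 trade-off the teaser refers to, here is a minimal Python sketch (not from the article; the pam_summary helper is purely illustrative) assuming an ideal link with a fixed total voltage swing: each doubling of the level count adds one bit per symbol, but the N-1 eyes share the same swing, so each eye shrinks and the voltage margin penalty grows.

import math

def pam_summary(levels: int) -> dict:
    """Bits per symbol and eye height for PAM-N, relative to NRZ (PAM-2)."""
    bits_per_symbol = math.log2(levels)           # PAM-4 -> 2 bits, PAM-8 -> 3 bits
    relative_eye = 1.0 / (levels - 1)             # N-1 eyes split one fixed swing
    penalty_db = 20 * math.log10(levels - 1)      # voltage margin lost vs. NRZ
    return {"levels": levels,
            "bits_per_symbol": bits_per_symbol,
            "relative_eye_height": relative_eye,
            "penalty_db_vs_nrz": penalty_db}

for n in (2, 4, 8):
    print(pam_summary(n))

Under those idealized assumptions, PAM-4 gives up roughly 9.5 dB of eye amplitude versus NRZ and PAM-8 roughly 16.9 dB, which is one way to frame the debate over whether PAM-8 will succeed PAM-4.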
