Defining Edge Memory Requirements


Defining edge computing memory requirements is a growing problem for chipmakers vying for a piece of this market, because it varies by platform, by application, and even by use case. Edge computing plays a role in artificial intelligence, automotive, IoT, data centers, as well as wearables, and each has significantly different memory requirements. So it's important to have memory requirement... » read more

Blog Review: June 20


Mentor's Randy Allen digs into OpenACC, a collection of directives and routines to help a compiler uncover and schedule parallelism, plus an examination of the GCC implementation's performance. Cadence's Paul McLellan takes a look at the shifting opinions on FD-SOI vs. finFET as Dan Hutcheson of VLSI Research finds most see the two as complementary technologies in his latest survey. Synop... » read more

Can Machine Learning Chips Help Develop Better Tools With Machine Learning?


As we continue to be bombarded with AI- and machine learning-themed presentations at industry conferences, an ex-colleague told me that he is sick of seeing an outline of the human head with a processor in place of the brain. If you are a chip architect trying to build one of these data-centric architecture chips for machine learning or AI (as opposed to the compute-centric chips, which you pro... » read more

Chip Dis-Integration


Just because something can be done does not always mean that it should be done. One segment of the semiconductor industry is learning the hard way that continued chip integration has a significant downside. At the same time, another group has just started to see the benefits of consolidating functionality onto a single substrate. Companies that have been following Moore's Law and hav... » read more

Near-Threshold Issues Deepen


Complex issues stemming from near-threshold computing, where the operating voltage and threshold voltage are very close together, are becoming more common at each new node. In fact, there are reports that the top five mobile chip companies, all with chips at 10/7nm, have had performance failures traced back to process variation and timing issues. Once a rather esoteric design technique, near... » read more

Improving Test Coverage And Eliminating Test Escapes Using Analog Defect Analysis


While analog and mixed-signal components are the leading source of test escapes that result in field failures, the lack of tools to analyze test coverage during design has made it difficult for designers to address the issue. In this white paper, we explore the methodology for performing analog fault simulation of test coverage based on defect-oriented testing. In addition, we look at h... » read more

Blog Review: June 13


Synopsys' Taylor Armerding looks at what the flaws in OpenPGP and S/MIME encryption mean for the IoT and warns that the problems of patching such devices could lead to an increasing chance of security failures. Cadence's Paul McLellan takes a peek at Imec's roadmap to see what the path to 3nm looks like, how nanosheets fit in, and why design and system technology co-optimization is necessar... » read more

Blog Review: June 6


In a video, Cadence's Marc Greenberg discusses the advantages and trade-offs of HBM2 and GDDR6, two advanced memory interfaces targeted to the high-performance computing market. Synopsys' Ravindra Aneja takes a look at what's needed for AI-focused hardware designs and how formal can help with the necessary data path verification. In a video, Mentor's Colin Walls explains the challenges of... » read more

FPGAs Becoming More SoC-Like


FPGAs are blinged-out rockstars compared to their former selves. No longer just a collection of look-up tables (LUTs) and registers, FPGAs have moved well beyond that, becoming architectures for system exploration and vehicles for proving a design architecture for future ASICs. This family of devices now includes everything from basic programmable logic all the way up to complex SoC devices.... » read more

The Week In Review: Design


Tools
Real Intent launched Verix SimFix, an intent-driven verification solution designed to eliminate X-pessimism in gate-level simulation (GLS) of digital designs. SimFix uses mathematical methods to identify conditions under which pessimism can occur, and to determine the correct value when those conditions occur. It then generates files to use in simulation that detect and correct pessimis... » read more
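As background for the teaser above, the X-pessimism problem can be illustrated with a minimal Python sketch of four-state mux evaluation. This is not Real Intent's method; `mux_pessimistic` and `mux_accurate` are hypothetical names for illustration only.

```python
# X-pessimism sketch: a 2:1 mux whose select line carries an unknown ('x').
# Classic gate-level simulation propagates X whenever the select is X,
# even though real hardware would produce a known value when both data
# inputs agree. Tools like the one described above aim to detect and
# correct exactly these cases.

def mux_pessimistic(a, b, sel):
    """Typical GLS behavior: any X on the select yields X."""
    if sel == 'x':
        return 'x'
    return a if sel else b

def mux_accurate(a, b, sel):
    """X-accurate behavior: X on the select is harmless if a == b."""
    if sel == 'x':
        return a if a == b else 'x'
    return a if sel else b

# Pessimism occurs precisely when the data inputs agree but sel is unknown:
print(mux_pessimistic(1, 1, 'x'))  # 'x' -- simulation is pessimistic
print(mux_accurate(1, 1, 'x'))     # 1   -- the value real hardware produces
```

When the data inputs differ, both models correctly return X, since the hardware outcome genuinely depends on the unknown select.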
