The Darker Side Of Consolidation


Another wave of consolidation is underway in the semiconductor industry, setting the stage for some high-stakes competitive battles over market turf and sowing confusion across the supply chain about continued support throughout a product's projected lifetime. The consolidation comes as chipmakers already are grappling with rising complexity, the loss of a roadmap for future designs as Moore... » read more

Market And Tech Inflections Ahead


Aart de Geus, chairman and co-CEO of Synopsys, sat down with Semiconductor Engineering to talk about the path to autonomous vehicles, industry disaggregation and re-aggregation, security issues, and who's going to pay for chips at advanced nodes. SE: All of a sudden we have a bunch of new markets opening up for electronics. We have assisted and autonomous driving, AI and machine learning, v... » read more

Tuesday At DAC 2018


The morning starts with the Accellera Breakfast. Accellera has made significant progress this year, and we can expect to hear about the approval of the Portable Stimulus 1.0 specification later in the conference, the initial release of SystemC CCI, and a proposal for the creation of an IP Security Assurance Working Group, which will discuss standards development to address s... » read more

What’s Next In R&D?


Luc Van den hove, president and chief executive of Imec, sat down with Semiconductor Engineering to discuss R&D challenges and what’s next in the arena. The Belgian R&D organization is working on AI, DNA storage, EUV, semiconductors and other technologies. What follows are excerpts of that conversation. SE: Moore’s Law is slowing down. And it is becoming more expensive to move fr... » read more

Can AI Alter The Burgeoning Design Cost Trend?


Everyone in the semiconductor design arena has experienced, or at least observed, the impact of increasing costs for complex SoC silicon. Semico’s recently released report, "Silicon and Software Design Cost Analysis," reveals that the cost associated with a first-time design effort for a high-end, advanced-performance multicore SoC using 7nm process technology can top $195M for both the silic... » read more

Big Trouble At 3nm


As chipmakers begin to ramp up 10nm/7nm technologies in the market, vendors are also gearing up for the development of a next-generation transistor type at 3nm. Some have announced specific plans at 3nm, but the transition to this node is expected to be a long and bumpy one, filled with a slew of technical and cost challenges. For example, the design cost for a 3nm chip could exceed an eye-p... » read more

Defining Edge Memory Requirements


Defining edge computing memory requirements is a growing problem for chipmakers vying for a piece of this market, because those requirements vary by platform, by application, and even by use case. Edge computing plays a role in artificial intelligence, automotive, IoT, data centers, and wearables, and each has significantly different memory requirements. So it's important to have memory requirement... » read more

System Bits: June 19


ML algorithm 3D scan comparison up to 1,000 times faster

To address the issue of medical image registration, which typically takes two hours or more to meticulously align each of potentially a million pixels in the combined scans, MIT researchers have created a machine-learning algorithm they say can register brain scans and other 3D images more than 1,000 times more quickly using novel learning... » read more

Can Machine Learning Chips Help Develop Better Tools With Machine Learning?


As we continue to be bombarded with AI- and machine learning-themed presentations at industry conferences, an ex-colleague told me that he is sick of seeing an outline of the human head with a processor in place of the brain. If you are a chip architect trying to build one of these data-centric architecture chips for machine learning or AI (as opposed to the compute-centric chips, which you pro... » read more

System Bits: June 12


Writing complex ML/DL analytics algorithms

Rice University researchers in the DARPA-funded Pliny Project believe they have the answer for every stressed-out systems programmer who has struggled to implement complex objects and workflows on ‘big data’ platforms like Spark and thought: “Isn’t there a better way?” Their answer: Yes, with PlinyCompute, which the team describes as “a sys... » read more
