Modeling Analytics for Computational Storage


This paper discusses the expected performance benefits of offloading some important basic database operations, namely Scan, Filter, and Project, to computational storage. We evaluate the performance estimate model using the TPC-DS workload and two database engines running on Hadoop clusters: Spark SQL and Presto. This paper is organized as follows: after covering previous computational sto... » read more
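As a rough illustration of the operations in question, the minimal PySpark sketch below (not taken from the paper; the table path and filter threshold are hypothetical, and the column names follow the usual TPC-DS store_sales schema) shows the kind of query whose Scan, Filter, and Project steps are natural candidates for pushdown to computational storage:

from pyspark.sql import SparkSession

# Minimal sketch: a query whose Scan, Filter, and Project steps could be
# executed inside a computational storage device rather than on the host.
spark = SparkSession.builder.appName("pushdown-sketch").getOrCreate()

# Scan: read a TPC-DS fact table from storage (path is hypothetical).
store_sales = spark.read.parquet("/data/tpcds/store_sales")

# Filter and Project: if run in the storage device, only qualifying rows and
# the two selected columns need to cross the storage interconnect.
result = (store_sales
          .filter(store_sales.ss_quantity > 10)    # Filter
          .select("ss_item_sk", "ss_net_paid"))    # Project

result.show(5)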

Addressing Pain Points In Chip Design


Semiconductor Engineering sat down to discuss the impact of multi-physics and new market applications on chip design with John Lee, general manager and vice president of ANSYS' Semiconductor Business Unit; Simon Burke, distinguished engineer at Xilinx; Duane Boning, professor of electrical engineering and computer science at MIT; and Thomas Harms, director of EDA/IP Alliance at Infineon. What follo... » read more

Changes In Data Storage and Usage


Doug Elder, vice president and general manager of OptimalPlus, talks about what’s changing in the storage and collection of data, including using data lakes and data engineering to break down silos and get data into a consistent format, and why it’s essential to define data up front based upon how quickly it needs to be accessed, as well as who actually owns the data. » read more

Solving The Memory Bottleneck


Chipmakers are scrambling to solve the bottleneck between processor and memory, and they are turning out new designs based on different architectures at a rate no one would have anticipated even several months ago. At issue is how to boost performance in systems, particularly those at the edge, where huge amounts of data need to be processed locally or regionally. The traditional approach ha... » read more

Big Shifts In Big Data


The big data market is in a state of upheaval as companies begin shifting their data strategies from "nothing" or "everything" in the cloud to a strategic mix, squeezing out middle-market players and changing what gets shared, how that data is used, and how best to secure it. This has broad implications for the whole semiconductor supply chain, because in many cases it paves the way for ... » read more

Searching For A System Abstraction


Without abstraction, advances in semiconductor design would have stalled decades ago and circuits would remain about the same size as analog blocks. No new abstraction that has found widespread adoption has emerged since the 1990s. The slack was taken up by IP and reuse, but IP blocks are becoming larger and more complex. Verification by isolation is no longer a viable strategy at the system... » read more

Shifting Performance Bottlenecks Driving Change In Chip And System Architectures


The rise of personal computing in the 1980s, along with graphical user interfaces (GUIs) and applications ranging from office apps to databases, drove demand for faster chips capable of removing processing bottlenecks and delivering a more responsive end-user experience. Indeed, the semiconductor industry has come a long way since IBM launched its PC back in 1981. ... » read more

One On One: John Lee


John Lee, general manager and vice president of Ansys, and the former CEO of data analytics firm Gear Design Solutions, which Ansys acquired in September, sat down with Semiconductor Engineering to talk about how big data techniques can be used in semiconductor and system design. What follows are excerpts of that conversation. SE: What's your goal now that Gear has been acquired by [getent... » read more

Integration Or Segregation


The Electronics Butterfly Effect story observed that the electronics industry has gone non-linear, no longer supported by the incremental density and cost improvements that Moore’s Law promised with each new node. Those incremental changes, over several decades, have meant that design and architecture have followed a predictable path, with very few new ideas coming in ... » read more

Cloud 2.0


Corporate data centers are reluctant adopters of new technology. There is too much at stake to make quick changes, which accounts for the failure over the past decade of a number of semiconductor startups with better ideas for more efficient processors, not to mention rapid consolidation in other areas. But as the amount of data increases, and the cost of processing that data decreases at a slower rate... » read more
