Faster SerDes For More Efficient Data Centers


The evolving data center presents an imposing set of challenges for system architects as Dennard Scaling fades and Moore’s Law wanes. These include an exponential increase in data, shifting architectural bottlenecks and a never-ending demand for higher performance within the same power and thermal envelopes. The Internet of Things (IoT), Big Data analytics, in-memory computing and machine ...

From The Data Center To The Mobile Edge


At the heart of the Internet of Things is the complex interplay between the needs for both low power and high performance (LPHP), a perplexing challenge rooted in the de facto bifurcation of the IoT itself. For example, lower-power mobile devices, systems and lite endpoints make up the vast majority of forward-facing consumer infrastructure, while high-performance servers at the back end are ta...

Architecting Memory For Next-Gen Data Centers


The industry’s insatiable appetite for increased bandwidth and ever-higher transfer rates is driven by a burgeoning Internet of Things (IoT), which has ushered in a new era of pervasive connectivity and generated a tsunami of data. In this context, data centers are currently evaluating a wide range of new memory initiatives. All seek to optimize efficiency by reducing data transport, thus sign...

Shifting Performance Bottlenecks Driving Change In Chip And System Architectures


The rise of personal computing in the 1980s — along with graphical user interfaces (GUIs) and applications ranging from office apps to databases — drove the demand for faster chips capable of removing processing bottlenecks and delivering a more responsive end-user experience. Indeed, the semiconductor industry has come a long way since IBM launched its PC in 1981. ...

Addressing Modern Bottlenecks With Smart Data Acceleration


Over the past 30 years, the relentless progression of Moore’s Law has driven dramatic improvements in transistor counts and, ultimately, in processor performance. CPU performance was often the primary factor in overall system performance, leading to the assumption that a better CPU meant a better system. But as processors have become more powerful, other subsystems have not k...
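The teaser’s core point, that a faster CPU alone no longer guarantees a faster system once other subsystems lag, is essentially Amdahl’s law. A minimal sketch, with an assumed 60% CPU-bound fraction chosen purely for illustration:

```python
# Amdahl's law: overall speedup when only a fraction p of the workload
# (the CPU-bound part) is accelerated by a factor s.
def amdahl_speedup(p: float, s: float) -> float:
    return 1.0 / ((1.0 - p) + p / s)

# Assumption: 60% of runtime is CPU-bound. A 10x faster CPU then
# speeds up the system as a whole by far less than 10x.
print(round(amdahl_speedup(0.6, 10), 2))  # 2.17
```

Even an infinitely fast CPU caps the overall speedup at 1/(1 - p), or 2.5x under this assumption, which is why the remaining subsystems become the bottleneck.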

Which Memory Type Should You Use?


I continue to be besieged by statements that misuse memory “latency” and “bandwidth.” As I mentioned in my last blog, latency is defined as how long the CPU needs to wait before the first data is available, while bandwidth is how fast additional data can be “streamed” after the first data point has arrived. Bandwidth becomes a bigger factor in performance when data is stor...
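The distinction comes down to one line of arithmetic: total transfer time is latency plus size divided by bandwidth. A minimal sketch with assumed figures (100 ns latency, 10 GB/s bandwidth; these numbers are illustrative, not from the post):

```python
def transfer_time(size_bytes, latency_s, bandwidth_bps):
    # Wait latency_s for the first data, then stream the rest at bandwidth_bps.
    return latency_s + size_bytes / bandwidth_bps

LATENCY, BANDWIDTH = 100e-9, 10e9  # assumed: 100 ns, 10 GB/s

small = transfer_time(64, LATENCY, BANDWIDTH)          # one cache line
large = transfer_time(10_000_000, LATENCY, BANDWIDTH)  # a 10 MB stream
print(f"64 B read:  {small:.3e} s  (mostly latency)")
print(f"10 MB read: {large:.3e} s  (mostly bandwidth)")
```

For the 64-byte read, latency accounts for over 90% of the time; for the 10 MB stream it is negligible, which is why bandwidth dominates large, sequential accesses.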

The Memory And Storage Hierarchy


The memory and storage hierarchy is a useful way of thinking about computer systems, and the dizzying array of memory options available to the system designer. Many different parameters characterize the memory solution. Among them are latency (how long the CPU needs to wait before the first data is available) and bandwidth (how fast additional data can be “streamed” after the first data poi...
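Those two parameters make the hierarchy’s trade-offs concrete. A sketch comparing a small read across levels, using rough, assumed order-of-magnitude figures (illustrative only, not vendor data):

```python
# Assumed, order-of-magnitude characteristics per hierarchy level.
hierarchy = {
    "L1 cache": {"latency_s": 1e-9,   "bandwidth_Bps": 1e12},
    "DRAM":     {"latency_s": 100e-9, "bandwidth_Bps": 50e9},
    "NVMe SSD": {"latency_s": 100e-6, "bandwidth_Bps": 3e9},
    "HDD":      {"latency_s": 10e-3,  "bandwidth_Bps": 200e6},
}

def time_to_read(size_bytes, level):
    # Latency to first data, plus streaming time for the payload.
    p = hierarchy[level]
    return p["latency_s"] + size_bytes / p["bandwidth_Bps"]

for level in hierarchy:
    print(f"{level:8s} 4 KiB read: {time_to_read(4096, level):.2e} s")
```

Each step down the hierarchy costs roughly two to three orders of magnitude in access time for a small read, which is the trade the designer makes for capacity and cost.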

Don’t Let The Headlines Trick You


This is the time of year when reports summarizing first-quarter server market sales are issued. As a way of grabbing attention, many headlines will note that first-quarter results fall below those of the fourth quarter, bringing to mind all sorts of doomsday scenarios. Don’t be fooled. In many industries, sales exhibit a large seasonal compo...
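The fair way to read such reports is to compare a quarter with the same quarter a year earlier, not with the seasonally inflated quarter that preceded it. A sketch with hypothetical sales figures (invented purely for illustration):

```python
# Hypothetical quarterly sales with a strong Q4 seasonal peak.
sales = {"2015Q1": 100, "2015Q4": 130, "2016Q1": 108}

qoq = sales["2016Q1"] / sales["2015Q4"] - 1  # vs. the seasonal Q4 peak
yoy = sales["2016Q1"] / sales["2015Q1"] - 1  # vs. the same quarter last year

print(f"Q1 vs prior Q4: {qoq:+.1%}")  # looks like a steep decline
print(f"Q1 vs prior Q1: {yoy:+.1%}")  # seasonally fair comparison shows growth
```

The same quarter can look like a double-digit collapse or healthy growth depending on the baseline, which is exactly the headline trick the post warns about.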

Traffic Jam?


This week, the first week that school was out of session for the summer, I noticed that my commute to work was much shorter than it had been, dropping from about 25 minutes to 15. It’s always hard for me to believe that something as simple as fewer drivers on the road during summer vacation is enough to cause such wild swings in commute times. I took advantage of the additional ...
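The wild swing is less mysterious than it feels: queueing delay is sharply nonlinear in utilization. A sketch using the textbook M/M/1 formula, mean time in system W = 1/(mu - lam), with assumed arrival and service rates (not measured data):

```python
# M/M/1 queue: mean time in system W = 1 / (mu - lam), where mu is the
# service rate and lam the arrival rate. Delay blows up as utilization
# approaches 1, so a small drop in traffic can slash waiting time.
def time_in_system(lam, mu):
    assert lam < mu, "queue is unstable at or above full utilization"
    return 1.0 / (mu - lam)

MU = 100.0                            # assumed: cars/minute the road can serve
print(time_in_system(95.0, MU))       # 95% utilization
print(time_in_system(80.0, MU))       # 80% utilization: ~4x less delay
```

A roughly 15% drop in traffic cuts the modeled delay by about a factor of four, the same kind of disproportionate improvement as the 25-to-15-minute commute.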

How Is Your HBM Memory?


The seemingly countless web-connected applications used every day (social media, streaming video, games, etc.) are not only driving the need to store a tremendous amount of data, but also the need to access this data with as little delay as possible. Add to this list the growing number of connected devices (IoT), and you can see why changes in the data center are needed, in parti...
