Hyperscale And Artificial Intelligence Are Reshaping Value Chains

Data centers and high-performance computing are undergoing tectonic shifts, and the transformation is just beginning.


Observing electronic ecosystems and value chains change over time is fascinating. For instance, the design chain for mobile devices fundamentally changed over the past two decades with waves of disaggregation and aggregation. Today, the area of computing and data centers is amid tectonic shifts and transformation, with the combination of hyperscale, networking, artificial intelligence (AI), and machine learning (ML) fundamentally re-shuffling value creation.

Back in 2002, Grant Martin and I wrote “A Design Chain for Embedded Systems” for IEEE Computer. We described the embedded SoC provider-integrator design chain and argued that “what used to be a vertically integrated process within each product company has become significantly fragmented. Platform-based design can accelerate the flow in this chain.” The associated graphic looked like this:


2002: A disaggregated design chain for embedded systems

For instance, until the early 2000s, the design chain for embedded mobile systems was dominated by platform-based designs like the TI OMAP platform. The semiconductor vendors would provide all drivers and then work with operating system vendors to port OSes like Palm OS, Symbian, and Microsoft WinCE and PocketPC 2002 to their silicon, and then jointly provide the result to device and equipment manufacturers.

Ten years later, a keynote by Nimish Modi at CDNLive Israel 2011 featured a slide emphasizing how the once fragmented design chain was aggregating into vertically integrated design approaches again:


2011: Aggregation vs. disaggregation in the mobile market

Today we arguably face a mixed industry, with some companies taking control of more of the full hardware/software stack. As Alan Kay once said, “People who are really serious about software should make their own hardware.”

So, what about the compute market? Spoiler alert: The changes have the potential to dwarf what happened in mobile, both in terms of depth and scope.

As to depth, hyperscale operators accounted for a third of all spending on data center hardware and software in 2019. “Hyper” literally means over or excessive, and the word “scale” refers to scope, size or extent. At this point, Synergy Research Group counts 24 global companies as “hyperscalers.” Data Center Frontier defines two tiers: mega hyperscale operators, and SaaS/platform companies and cloudlets. The former include the likes of Google, Microsoft, Amazon Web Services (AWS), Facebook, Apple and Alibaba; the latter tier includes cloud providers like Oracle, Baidu and China Telecom, along with SaaS providers like Salesforce, SAP, Workday, PayPal and Dropbox, and platform companies like Uber and Lyft.

In terms of scope, hyperscale is tying together the traditional markets of computing, networking, storage and sensing, and it is augmented by AI and ML, adding new computational software aspects at a scale never seen before. The traditional market segments for electronics are consumer, compute, wireless, wired, automotive and industrial, with aerospace/defense and health sometimes split out separately from industrial. Now consider the path from the end devices all the way to the data center as pictured below.


2020+: Network from 5G end points through the edge and network to hyperscale data centers

Hyperscale, AI/ML, and 5G are all connected! The hyperscale ecosystem includes the traditional compute and storage, and the networks both inside the data center—leading to the NVIDIA/Mellanox combination—and the networks connecting to the end devices—with the edge now sometimes defined as everything within a latency of less than 20ms. Where AI/ML compute can be performed depends heavily on the latency that is required, which explains the large number of projects going on in this area of edge compute. And the scope widens even further when considering that the massive scale of data comes from devices across all eight vertical domains mentioned above: consumer game data, real-time health information, mobile app usage data, automotive and aerospace data to optimize maintenance of cars and planes, you name it.
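To make the latency point concrete, here is a minimal sketch in Python of how a placement decision for an AI/ML inference workload might weigh a latency budget against compute tiers. The tier names and the round-trip latency figures are illustrative assumptions, not data from this post; only the roughly 20ms definition of the edge comes from the text above.

```python
# Illustrative sketch only: picking where an AI/ML inference task could run
# based on its end-to-end latency budget. The tiers and the round-trip
# latency figures are assumed, order-of-magnitude numbers for illustration.

from dataclasses import dataclass
from typing import Optional


@dataclass
class ComputeTier:
    name: str
    typical_round_trip_ms: float  # assumed, order-of-magnitude figure


# Ordered from closest to the device to farthest away.
TIERS = [
    ComputeTier("on-device", 1.0),
    ComputeTier("edge (<20 ms)", 15.0),
    ComputeTier("regional data center", 40.0),
    ComputeTier("hyperscale data center", 100.0),
]


def place_workload(latency_budget_ms: float) -> Optional[ComputeTier]:
    """Return the farthest tier that still meets the latency budget."""
    feasible = [t for t in TIERS if t.typical_round_trip_ms <= latency_budget_ms]
    return feasible[-1] if feasible else None


if __name__ == "__main__":
    for budget in (5, 20, 50, 200):
        tier = place_workload(budget)
        where = tier.name if tier else "budget too tight even for on-device compute"
        print(f"{budget:>4} ms budget -> {where}")
```

The point of the sketch is simply that tighter latency budgets push compute toward the device and the edge, while looser budgets allow it to move into regional or hyperscale data centers.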

My inaugural Frankly Speaking blog, “Ripple Effects Through Value Chains,” focused on mobile almost a decade ago. The transformation in the compute/hyperscale value chain is just starting. Good examples are Cisco entering the chip market, supplying Microsoft and Facebook (see Bloomberg), and Facebook planning to develop its own AI chips (see Forbes). And as a timely example this week, the Fujitsu Fugaku supercomputer, featuring 415 PetaFlops, became the new #1 on the TOP500 list of supercomputers. It is powered by Arm, which is a first, and it has been emulated extensively. The value chain has been re-shuffled all the way to the processor ecosystem!

In closing, Fujitsu’s Fugaku reminds me of my first exposure to high-performance computing, going back to my last years of university. Professor Dr. Hans Weinerth, together with Professor Dr. Otto Manck, taught a class dealing with microelectronics’ economic impact. I would later join Dr. Weinerth’s company SICAN Microelectronics, which featured a CONVEX and a Cray supercomputer for high-performance computing research. When asked what supercomputers would be useful for in the context of microelectronics, Dr. Weinerth’s eyes would light up, and he would bring up topics like “world simulation” and “global weather prediction” to better prepare for disasters. It has been a while, but now we are doing “world simulation” on supercomputers like Fugaku! Go, hyperscale!


