
Von Neumann Architecture

The basic architecture for most computing today, based on the principle that data needs to move back and forth between a processor and memory.

Description

The von Neumann architecture is the basis of almost all computing done today. Developed roughly 80 years ago, it assumes that every computation pulls data from memory, processes it, and then sends the result back to memory. This constant shuttling of data has created what is known as the von Neumann bottleneck, which exacts a penalty in throughput, cost, and power.
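The round trip at the heart of this model can be seen in a toy machine. The sketch below is a minimal, hypothetical von Neumann machine in Python; the four-instruction set (LOAD, ADD, STORE, HALT) and the memory layout are invented for illustration. The point is that instructions and data share one memory, so every step of a computation is another trip across the processor-memory boundary.

```python
# A minimal, hypothetical von Neumann machine. One memory array holds both
# the program and the data, so every step is a round trip through memory.
# The instruction set (LOAD, ADD, STORE, HALT) is invented for illustration.

LOAD, ADD, STORE, HALT = range(4)

# Addresses 0-6 hold the program; addresses 13-15 hold the data.
memory = [
    LOAD, 13,      # acc = memory[13]
    ADD, 14,       # acc = acc + memory[14]
    STORE, 15,     # memory[15] = acc
    HALT,
] + [0] * 6 + [20, 22, 0]  # padding, then operands 20 and 22, then a result slot

def run(memory):
    pc = 0    # program counter
    acc = 0   # accumulator register
    while True:
        op = memory[pc]            # fetch: the instruction itself comes from memory
        if op == HALT:
            return
        addr = memory[pc + 1]      # fetch the operand address from memory
        if op == LOAD:
            acc = memory[addr]     # pull data from memory into the processor
        elif op == ADD:
            acc += memory[addr]    # process it (another memory read)
        elif op == STORE:
            memory[addr] = acc     # send the result back to memory
        pc += 2

run(memory)
print(memory[15])  # 42: the result ends up back in memory
```

Every instruction here touches memory at least twice (fetch plus operand access), which is exactly the traffic the bottleneck describes.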

One architectural response involves rethinking how much data really needs to be shipped back to main memory, and how much can be stored locally. Rather than sending everything through to memory, multiple caches and proxy caches can shortcut the flow of data from accelerator chips to different devices. While this is technically still a von Neumann approach, it is a much more fine-grained version of it. The big difference is that the design starts with the data and follows how it moves, rather than relying on a centralized chip architecture to handle everything. In effect, it shifts the burden onto the architecture, which is defined by the software, rather than onto the speed or process geometry of any single chip.
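As a rough illustration of that data-first idea, the sketch below puts a small software cache in front of a backing memory and counts how many accesses actually cross over to it. The CachedMemory class, its four-line capacity, FIFO eviction, and the access pattern are all assumptions invented here, not a model of any particular design.

```python
# A toy cache in front of a backing memory, counting how often an access
# actually crosses the processor-memory boundary. The class name, the
# four-line capacity, FIFO eviction, and the access pattern are all
# illustrative assumptions, not a model of any particular design.

class CachedMemory:
    def __init__(self, backing, capacity=4):
        self.backing = backing
        self.lines = {}           # addr -> value, the local copies
        self.capacity = capacity
        self.memory_reads = 0     # costly trips across the bottleneck
        self.cache_hits = 0       # accesses served locally

    def read(self, addr):
        if addr in self.lines:
            self.cache_hits += 1              # served from local storage
            return self.lines[addr]
        self.memory_reads += 1                # round trip to main memory
        value = self.backing[addr]
        if len(self.lines) >= self.capacity:
            self.lines.pop(next(iter(self.lines)))  # evict the oldest line
        self.lines[addr] = value
        return value

ram = list(range(100))
mem = CachedMemory(ram)

# A loop with good locality: the same three addresses are reused ten times.
total = sum(mem.read(addr) for _ in range(10) for addr in (0, 1, 2))
print(total, mem.memory_reads, mem.cache_hits)  # 30 total, 3 memory reads, 27 hits
```

In hardware, those counters correspond to bus traffic and energy per bit moved, which is why keeping data local pays off in throughput and power.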

Multimedia

PCIe 5.0 Drill-Down
Inferencing At The Edge
Edge Inferencing Challenges
Using ASICs For AI Inferencing