PCI Express 5.0 Takes Center Stage For Data Centers


The demands on servers at the heart of data centers continue their inexorable rise. Responding to these demands, new platforms keep coming that deliver greater computing performance, have more memory, and use faster interconnects. Arriving at the end of this year and in early 2022 are new server platforms that will take performance to a new level. These new platforms will transition to DDR5 DIMMs fo... » read more

New Uses For AI


AI is being embedded into an increasing number of technologies that are commonly found inside most chips, and initial results show dramatic improvements in both power and performance. Unlike high-profile AI implementations, such as self-driving cars or natural language processing, much of this work flies well under the radar for most people. It generally takes the path of least disruption, b... » read more

Privacy Protection A Must For Driver Monitoring


Driver monitoring systems are so tied into a vehicle's architecture that soon the driver will not be able to opt out because the vehicle will only operate if the driver is detected and monitored. This is raising privacy concerns about whether enough security is in place for the data to remain private. At the very least, laws and regulations in every geography where the vehicle will operate a... » read more

Secure TSN Ethernet With MACsec Is Now Possible


For end-to-end security of data, it must be secured both at rest (processed or stored in a device) and in motion (communicated between connected devices). For data at rest, a hardware root of trust anchored in silicon provides the foundation upon which all data security is built. Applications, the OS, and boot code all depend on the root of trust as the source of confidentiality, integri... » read more

Computing Where Data Resides


Computational storage is starting to gain traction as system architects come to grips with the rising performance, energy, and latency impacts of moving large amounts of data between processors and hierarchical memory and storage. According to IDC, the global datasphere will grow from 45 zettabytes in 2019 to 175 zettabytes by 2025. But that data is essentially useless unless it is analyzed or some amou... » read more

Week In Review: Design, Low Power


Tools & IP Codasip unveiled three commercially licensed add-ons to the Western Digital SweRV Core EH1, aiming to allow it to be designed into a wider range of applications. The SweRV Core EH1 is a 32-bit, dual-issue, RISC-V ISA core with a 9-stage pipeline, open-sourced through CHIPS Alliance. The add-ons offer a floating-point unit (FPU) that supports the RISC-V single precision [F] and d... » read more

Week In Review: Auto, Security, Pervasive Computing


Automotive/Mobility General Motors is working on the next version of its Ultium battery chemistry and announced a joint development agreement with Singapore-based SolidEnergy Systems, a lithium metal battery startup founded by a graduate of MIT. The companies plan to open a prototype production line in Woburn, Massachusetts, by 2023. GM is attempting to lower the cost of its proprietary battery tech... » read more

Domain-Specific Memory


Domain-specific computing may be all the rage, but it is avoiding the real problem. The bigger concern is the memories that throttle processor performance, consume more power, and take up the most chip area. Memories need to break free from the rigid structures preferred by existing software. When algorithms and memory are designed together, improvements in performance are significant and pr... » read more

Tradeoffs To Improve Performance, Lower Power


Generic chips are no longer acceptable in competitive markets, and that is becoming ever more apparent as designs grow increasingly heterogeneous and targeted to specific workloads and applications. From the edge to the cloud, in everything from vehicles and smartphones to commercial and industrial machinery, the focus increasingly is on maximizing performance using the least amount of energy. This ... » read more

HBM2E Raises The Bar For AI/ML Training


The largest AI/ML neural network training models now exceed an enormous 100 billion parameters. With growth over the last decade on a 10X annual pace, we’re headed to trillion-parameter models in the not-too-distant future. Given the tremendous value that can be derived from AI/ML (it is mission-critical to five of the six top market cap companies in the world), there has been ... » read more
