AI Races To The Edge


AI is becoming increasingly sophisticated and pervasive at the edge, pushing into new application areas and even taking on some of the algorithm training that has been done almost exclusively in large data centers using massive sets of data. There are several key changes behind this shift. The first involves new chip architectures that are focused on processing, moving, and storing data more... » read more

Vision Transformers Change The AI Acceleration Rules


Transformers were first introduced by the team at Google Brain in 2017 in their paper, "Attention Is All You Need". Since their introduction, transformers have inspired a flurry of investment and research that has produced some of the most impactful model architectures and AI products to date, including ChatGPT, an acronym for Chat Generative Pre-trained Transformer. Transformers a... » read more
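
The attention mechanism named in that paper title is easy to sketch. The following is a minimal, illustrative NumPy implementation of scaled dot-product attention; it is not drawn from the article, and the function and variable names are placeholders.

```python
# Minimal scaled dot-product attention, the core operation described in
# "Attention Is All You Need". Illustrative only; names are placeholders.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K: (seq_len, d_k); V: (seq_len, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # pairwise token similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # softmax over each row
    return weights @ V                                    # weighted mix of value vectors

# Example: 4 tokens with 8-dimensional embeddings
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)        # (4, 8)
```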

Data Collection For Edge AI / Tiny ML With Sensors


Reality AI software from Renesas provides solution suites and tools for R&D engineers who build products and internal solutions using sensors. Working with accelerometers, vibration, sound, electrical (current/voltage/capacitance), radar, RF, proprietary sensors, and other types of sensor data, Reality AI software identifies signatures of events and conditions, correlates changes in signat... » read more
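
As a generic illustration only (this is not the Reality AI software or its API), the sketch below shows the kind of frequency-domain "signature" such a tool might compute from a raw accelerometer trace; all names and parameters are hypothetical.

```python
# Hedged, generic sketch of a frequency-domain vibration "signature".
# Not the Renesas/Reality AI toolchain; purely illustrative.
import numpy as np

def vibration_signature(samples, n_bands=8):
    """Return average spectral power in n_bands equal-width frequency bands."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2     # power spectrum of the trace
    bands = np.array_split(spectrum, n_bands)        # coarse, equal-width bands
    return np.array([band.mean() for band in bands])

# Example: a 100 Hz vibration sampled at 1 kHz for one second
t = np.arange(0, 1.0, 1 / 1000)
trace = np.sin(2 * np.pi * 100 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
print(vibration_signature(trace))   # power concentrates in the band containing 100 Hz
```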

A Packet-Based Architecture For Edge AI Inference


Despite significant improvements in throughput, edge AI accelerators (Neural Processing Units, or NPUs) are still often underutilized. Inefficient management of weights and activations means fewer of the available cores are used for multiply-accumulate (MAC) operations. Edge AI applications frequently need to run on small, low-power devices, limiting the area and power allocated for memory and comp... » read more
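
As a rough, hypothetical illustration of what underutilization means in practice (the figures below are invented and do not describe any particular NPU), utilization can be estimated as achieved MAC throughput divided by the array's peak:

```python
# Hypothetical back-of-the-envelope NPU utilization estimate.
# Utilization = achieved MAC/s divided by peak MAC/s; all figures are made up
# for illustration and do not describe any specific accelerator.

def mac_utilization(layer_macs, layer_runtime_s, num_mac_units, clock_hz):
    peak_macs_per_s = num_mac_units * clock_hz           # 1 MAC per unit per cycle
    achieved_macs_per_s = layer_macs / layer_runtime_s
    return achieved_macs_per_s / peak_macs_per_s

# Example: a 50 GMAC layer that takes 80 ms on a 1024-MAC array at 1 GHz
util = mac_utilization(layer_macs=50e9, layer_runtime_s=0.08,
                       num_mac_units=1024, clock_hz=1e9)
print(f"Utilization: {util:.1%}")   # ~61% of peak; stalls on weight/activation
                                    # movement account for the rest
```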

Low Density Of LPDDR4x DRAM — The Best Choice For Edge AI


Edge AI processes data as close as possible to the physical system. The advantage is that processing the data does not require a network connection. Computation happens near the edge of the network, where the data is generated, instead of in a centralized data-processing center. One of the biggest benefits of edge AI is the ability to deliver real-time results for time-sensi... » read more

Review of Tools & Techniques for DL Edge Inference


A new technical paper titled "Efficient Acceleration of Deep Learning Inference on Resource-Constrained Edge Devices: A Review" was published in the "Proceedings of the IEEE" by researchers at the University of Missouri and Texas Tech University. Abstract: Successful integration of deep neural networks (DNNs) or deep learning (DL) has resulted in breakthroughs in many areas. However, deploying thes... » read more

Edge-AI Hardware for Extended Reality


A new technical paper titled "Memory-Oriented Design-Space Exploration of Edge-AI Hardware for XR Applications" was published by researchers at the Indian Institute of Technology Delhi and Reality Labs Research, Meta. Abstract: "Low-Power Edge-AI capabilities are essential for on-device extended reality (XR) applications to support the vision of Metaverse. In this work, we investigate two representative XR w... » read more

MIPI In Next Generation Of AI IoT Devices At The Edge


The history of data processing begins in the 1960s with centralized on-site mainframes that later evolved into distributed client-server systems. At the beginning of this century, centralized cloud computing became attractive and gained momentum, becoming one of the most popular computing models today. In recent years, however, we have seen an increase in the demand for processing... » read more

Fast and Flexible FPGA-based NoC Hybrid Emulation


Researchers from RWTH Aachen University and Otto-von-Guericke-Universität Magdeburg have published a new technical paper titled "EmuNoC: Hybrid Emulation for Fast and Flexible Network-on-Chip Prototyping on FPGAs." Abstract: "Networks-on-Chips (NoCs) recently became widely used, from multi-core CPUs to edge-AI accelerators. Emulation on FPGAs promises to accelerate their RTL modeling co... » read more

AI At The Edge: Optimizing AI Algorithms Without Sacrificing Accuracy


The ultimate measure of success for AI will be how much it increases productivity in our daily lives. However, the industry faces huge challenges in evaluating progress. The vast field of AI applications is in constant churn: finding the right algorithm, optimizing that algorithm, and finding the right tools. In addition, complex hardware engineering is rapidly being updated with many different s... » read more
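
One concrete instance of the optimize-without-losing-accuracy trade-off is post-training weight quantization. The sketch below is not from the article; it is a minimal, generic example that quantizes a weight tensor to 8-bit integers and reports the relative reconstruction error.

```python
# Illustrative sketch (not from the article): symmetric 8-bit post-training
# quantization of a weight tensor, one common way to shrink a model for the
# edge while monitoring how much numerical accuracy is lost.
import numpy as np

def quantize_int8(weights):
    scale = np.abs(weights).max() / 127.0                     # map max magnitude to 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(scale=0.05, size=(256, 256)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
rel_err = np.linalg.norm(w - w_hat) / np.linalg.norm(w)       # int8 uses 4x less memory than float32
print(f"Relative reconstruction error after int8 quantization: {rel_err:.4f}")
```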
