The Implications Of AI Everywhere: From Data Center To Edge


Generative AI has upped the ante on the transformative force of AI, with profound implications across all aspects of our everyday lives. Over the past year, we have seen AI capabilities placed firmly in the hands of consumers. The recent news and product announcements emerging from MWC 2024 highlighted what we can expect from the next wave of generative AI applications. AI will be eve... » read more

AI Tradeoffs At The Edge


AI is impacting almost every application area imaginable, but increasingly it is moving from the data center to the edge, where larger amounts of data need to be processed much more quickly than in the past. This has set off a scramble for massive improvements in performance much closer to the source of data, but with a familiar set of caveats — it must use very little power, be affordable... » read more

Addressing The New Wave Of The IoT With Edge AI


Today, the IoT is experiencing exponential growth. Every second, 127 new devices are connected, and forecasts call for 43 billion IoT devices by 2027. As this market grows and evolves, so does the demand for more sophisticated, powerful, energy-efficient and accurate system solutions that can enrich the way we live. Among the many crucial technologies enabling ... » read more

Preparing For An AI-Driven Future In Chips


Experts at the Table: Semiconductor Engineering sat down to discuss the impact of AI on semiconductor architectures, tools, and security, with Michael Kurniawan, business strategy manager at Accenture; Kaushal Vora, senior director and head of business acceleration and ecosystem at Renesas Electronics; Paul Karazuba, vice president of marketing at Expedera; and Chowdary Yanamadala, technology s... » read more

AI Races To The Edge


AI is becoming increasingly sophisticated and pervasive at the edge, pushing into new application areas and even taking on some of the algorithm training that has been done almost exclusively in large data centers using massive sets of data. There are several key changes behind this shift. The first involves new chip architectures that are focused on processing, moving, and storing data more... » read more

Vision Transformers Change The AI Acceleration Rules


Transformers were first introduced by the team at Google Brain in 2017 in their paper, "Attention is All You Need". Since their introduction, transformers have inspired a flurry of investment and research that has produced some of the most impactful model architectures and AI products to date, including ChatGPT, whose name stands for Chat Generative Pre-trained Transformer. Transformers a... » read more
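
As a rough illustration of the attention mechanism at the heart of transformers and vision transformers, here is a minimal NumPy sketch of single-head scaled dot-product attention. The shapes and names are illustrative only and are not taken from the article or any specific model.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal single-head attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                # (tokens, tokens) similarity matrix
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability for softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                             # weighted sum of value vectors

# Illustrative shapes: 16 tokens (e.g., image patches), 64-dim embeddings.
rng = np.random.default_rng(0)
tokens = rng.standard_normal((16, 64))
out = scaled_dot_product_attention(tokens, tokens, tokens)  # self-attention
print(out.shape)  # (16, 64)
```

For a vision transformer, the "tokens" are typically embeddings of fixed-size image patches rather than words, which is what changes the memory-access and acceleration profile relative to CNNs.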

Data Collection For Edge AI / Tiny ML With Sensors


Reality AI software from Renesas provides solution suites and tools for R&D engineers who build products and internal solutions using sensors. Working with accelerometers, vibration, sound, electrical (current/voltage/capacitance), radar, RF, proprietary sensors, and other types of sensor data, Reality AI software identifies signatures of events and conditions, correlates changes in signat... » read more
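
The excerpt describes finding signatures of events in raw sensor streams. The snippet below is a generic, hypothetical sketch of that idea (it is not Reality AI's API): it computes a frequency-domain signature from an accelerometer window and flags windows whose spectral shape deviates from a baseline.

```python
import numpy as np

def spectral_signature(window):
    """Normalized magnitude spectrum of one sensor window (e.g., one accelerometer axis)."""
    spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    return spectrum / (spectrum.sum() + 1e-12)     # compare spectral shape, not amplitude

def deviates_from_baseline(window, baseline_sig, threshold=0.2):
    """Flag a window whose spectral shape drifts away from the baseline signature."""
    distance = np.abs(spectral_signature(window) - baseline_sig).sum()  # simple L1 distance
    return distance > threshold

# Hypothetical 1 kHz accelerometer stream: a healthy 50 Hz vibration vs.
# a window with an extra 180 Hz component (e.g., an emerging mechanical fault).
fs = 1000
t = np.arange(0, 1, 1 / fs)
healthy = np.sin(2 * np.pi * 50 * t)
faulty = healthy + 0.8 * np.sin(2 * np.pi * 180 * t)

baseline_sig = spectral_signature(healthy)
print(deviates_from_baseline(faulty, baseline_sig))   # True  -> signature of an event
print(deviates_from_baseline(healthy, baseline_sig))  # False -> matches baseline
```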

A Packet-Based Architecture For Edge AI Inference


Despite significant improvements in throughput, edge AI accelerators (Neural Processing Units, or NPUs) are still often underutilized. Inefficient management of weights and activations leads to fewer available cores utilized for multiply-accumulate (MAC) operations. Edge AI applications frequently need to run on small, low-power devices, limiting the area and power allocated for memory and comp... » read more
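
To make the underutilization point concrete, here is a back-of-the-envelope sketch of effective MAC utilization for an edge NPU. All figures are made up for illustration; the article does not provide these numbers.

```python
# Back-of-the-envelope NPU utilization estimate (all figures are illustrative).

def mac_utilization(model_macs, num_mac_units, clock_hz, measured_latency_s):
    """Fraction of the NPU's peak MAC throughput actually used for one inference."""
    peak_macs = num_mac_units * clock_hz * measured_latency_s  # MACs the array could have done
    return model_macs / peak_macs

# Hypothetical edge NPU: 1,024 MAC units at 800 MHz.
# Hypothetical model: ~300 MMACs per inference, measured at 2.5 ms latency.
util = mac_utilization(model_macs=300e6,
                       num_mac_units=1024,
                       clock_hz=800e6,
                       measured_latency_s=2.5e-3)
print(f"Effective MAC utilization: {util:.1%}")  # ~14.6% -> most cycles stalled on data movement
```

Numbers like these are why architectures focus on how weights and activations are staged and moved, not just on adding more MAC units.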

Low Density Of LPDDR4x DRAM — The Best Choice For Edge AI


Edge AI processes data as close as possible to the physical system that produces it. The advantage is that processing does not depend on a network connection. Computation happens near the edge of the network, where the data is generated, rather than in a centralized data-processing center. One of the biggest benefits of edge AI is the ability to deliver real-time results for time-sensi... » read more
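
To illustrate why the DRAM choice matters for real-time edge inference, here is a hedged feasibility sketch that checks whether a single LPDDR4x channel can cover a model's per-frame memory traffic. The model size, activation traffic, frame rate, and channel configuration are assumptions for illustration, not figures from the article.

```python
# Rough feasibility check for edge AI memory bandwidth (illustrative numbers only).

def required_bandwidth_gbps(weight_bytes, activation_bytes, fps):
    """Approximate DRAM traffic if weights + spilled activations are streamed each frame."""
    return (weight_bytes + activation_bytes) * fps / 1e9

# Assumptions: 8 MB of int8 weights, 4 MB of activations spilled to DRAM per frame,
# 30 inferences per second, one x16 LPDDR4x channel at 4266 MT/s.
needed = required_bandwidth_gbps(weight_bytes=8e6, activation_bytes=4e6, fps=30)
lpddr4x_peak = 4266e6 * 2 / 1e9   # 16-bit channel -> 2 bytes per transfer, ~8.5 GB/s peak

print(f"Needed:  {needed:.2f} GB/s")        # ~0.36 GB/s
print(f"LPDDR4x: {lpddr4x_peak:.2f} GB/s")  # ~8.53 GB/s peak for one x16 channel
print("Fits within one channel:", needed < 0.5 * lpddr4x_peak)  # leave headroom for real efficiency
```

Under assumptions like these, a single low-density LPDDR4x device leaves ample bandwidth headroom, which is part of the cost and power argument for it at the edge.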

Review of Tools & Techniques for DL Edge Inference


A new technical paper titled "Efficient Acceleration of Deep Learning Inference on Resource-Constrained Edge Devices: A Review" was published in "Proceedings of the IEEE" by researchers at University of Missouri and Texas Tech University. Abstract: Successful integration of deep neural networks (DNNs) or deep learning (DL) has resulted in breakthroughs in many areas. However, deploying thes... » read more
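
One family of techniques that reviews of resource-constrained DL inference commonly cover is post-training quantization. The snippet below is a minimal, framework-free sketch of symmetric per-tensor int8 weight quantization, included only as an illustration; it is not the paper's method.

```python
import numpy as np

def quantize_int8(weights):
    """Symmetric per-tensor int8 quantization: w ~= scale * q, with q in [-127, 127]."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

# Illustrative layer weights: 4x memory reduction vs. float32 at a small rounding error.
rng = np.random.default_rng(0)
w = rng.standard_normal((256, 256)).astype(np.float32)
q, scale = quantize_int8(w)
err = np.abs(w - dequantize(q, scale)).mean()
print(f"Mean abs quantization error: {err:.4f}")
```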
