Driving AI, ML To New Levels On MCUs


One of the most dramatic technology shifts of late has been the implementation of artificial intelligence and machine learning on small edge devices, which are forming the backbone of the Internet of Things. At first, this happened through sheer engineering willpower and innovation. But as the drive toward a world of a trillion connected devices accelerates, we must find wa...

Neural Network Performance Modeling Software


nnMAX Inference IP is nearing design completion. The nnMAX 1K tile will be available this summer for design integration in SoCs, and it can be arrayed to provide whatever inference throughput is desired. The InferX X1 chip will tape out in late Q3 this year using a 2x2 array of nnMAX tiles, for 4K MACs, with 8MB of SRAM. The nnMAX Compiler is being developed in parallel, and the first release is available now...

Rushing To The Edge


Virtually every major tech company has an "edge" marketing presentation these days, and some even have products they are calling edge devices. But the reality is that no one today is quite sure how to define the edge or what it will become, and any attempt to pigeonhole it is premature. What is becoming clear is that the edge is not simply an extension of the Internet of Things. It is the resu...

Week In Review: Design, Low Power


Flex Logix debuted its new InferX X1 edge inference co-processor, which incorporates the interconnect technology from its eFPGAs along with its inference-optimized nnMAX clusters. The chip focuses on high throughput with a single DRAM and is optimized for the small batch sizes typical of edge applications, where there is usually only one camera or sensor. InferX X1 will be available as chip...

Machine Learning on Arm Cortex-M Microcontrollers


Machine learning (ML) algorithms are moving to the IoT edge due to considerations such as latency, power consumption, cost, network bandwidth, reliability, privacy, and security. Hence, there is increasing interest in developing neural network (NN) solutions that can be deployed on low-power edge devices such as Arm Cortex-M microcontroller systems. CMSIS-NN is an open-source library of...

Preparing For War On The Edge


War clouds are gathering over the edge of the network. The rush by the reigning giants of data—IBM, Amazon, Facebook, Alibaba, Baidu, Microsoft and Apple—to control the cloud by building mammoth hyperscale data centers is being met with uncertainty at the edge of the network. In fact, the mere emergence of the edge could mean that all bets are off when it comes to data dominance. It...

AI: Where’s The Money?


A one-time technology outcast, artificial intelligence (AI) has come a long way. Now there's a groundswell of interest and investment in products and technologies that deliver high-performance visual recognition, matching or besting human skills. Speech and audio recognition are likewise becoming more common, and we're even starting to see more specialized applications such as finding optimized...

Pushing AI Into The Mainstream


Artificial intelligence is emerging as the driving force behind many advancements in technology, even though the industry has barely scratched the surface of what may be possible. But how deeply AI penetrates different market segments and technologies, and how quickly it pushes into the mainstream, depend on a variety of issues that still must be resolved. In addition to a plethora of techni...

Cutting The Cord: How Edge Intelligence Is Enabling The IoT To Go Where Cloud Can’t


In a world where data's time to value or irrelevance may be measured in milliseconds, the latency introduced in transferring data to the cloud threatens to undermine many of the Internet of Things' most compelling use cases. Think of data as the fuel that powers our new decision-making engines: fail to get the fuel to an engine fast enough and it splutters and dies. Meanwhil...

AI Chip Architectures Race To The Edge


As machine-learning apps start showing up in endpoint devices and along the network edge of the IoT, the accelerators that make AI possible may look more like FPGA and SoC modules than the current data-center-bound chips from Intel or Nvidia. Artificial intelligence and machine learning need powerful chips both for building models from large data sets (training) and for computing answers from new inputs (inference). Most AI chips—both tr...
