11 Ways To Reduce AI Energy Consumption


As the machine-learning industry evolves, the focus has expanded from merely solving the problem to solving the problem better. "Better" has often meant accuracy or speed, but as data-center energy budgets explode and machine learning moves to the edge, energy consumption has taken its place alongside accuracy and speed as a critical issue. There are a number of approaches to neural netw... » read more
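The article's full list of techniques sits behind the link, but quantization is one widely used way to cut energy per inference, since narrower datatypes reduce both compute and memory traffic. The sketch below is a minimal, illustrative NumPy example of symmetric int8 post-training quantization; the function names are ours, not drawn from the article.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric post-training quantization of fp32 weights to int8.

    Storing and moving 8-bit integers instead of 32-bit floats cuts
    memory traffic roughly 4x, which is where much of the energy
    saving comes from on inference hardware.
    """
    scale = np.abs(weights).max() / 127.0                      # map the largest weight to +/-127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate fp32 tensor to check accuracy loss."""
    return q.astype(np.float32) * scale

if __name__ == "__main__":
    w = np.random.randn(256, 256).astype(np.float32)           # toy layer weights
    q, scale = quantize_int8(w)
    err = np.abs(w - dequantize(q, scale)).max()
    print(f"fp32 bytes: {w.nbytes}, int8 bytes: {q.nbytes}, max abs error: {err:.5f}")
```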

AI Inference Acceleration


Geoff Tate, CEO of Flex Logix, talks about considerations in choosing an AI inference accelerator, how that fits in with other processing elements on a chip, what tradeoffs are involved in reducing latency, and which considerations matter most. » read more

New Ways To Optimize Machine Learning


As more designers employ machine learning (ML) in their systems, they’re moving from simply getting the application to work to optimizing the power and performance of their implementations. Some techniques are available today. Others will take time to percolate through the design flow and tools before they become readily available to mainstream designers. Any new technology follows a basic... » read more
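One optimization technique usable today is magnitude pruning, which zeroes small weights so that sparsity-aware hardware or compilers can skip the corresponding operations. The NumPy sketch below is illustrative only, with hypothetical names, and is not a technique taken verbatim from the article.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float = 0.5) -> np.ndarray:
    """Zero out the smallest-magnitude weights until `sparsity` fraction is zero.

    Sparse weights let an inference engine skip multiply-accumulates entirely,
    trading a small accuracy loss for lower power and higher throughput.
    """
    threshold = np.quantile(np.abs(weights), sparsity)   # cutoff below which weights are dropped
    mask = np.abs(weights) >= threshold
    return weights * mask

if __name__ == "__main__":
    w = np.random.randn(128, 128).astype(np.float32)
    pruned = magnitude_prune(w, sparsity=0.75)
    print(f"nonzero fraction after pruning: {np.count_nonzero(pruned) / pruned.size:.2f}")
```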

AI’s Impact On Power And Performance


AI/ML is creeping into everything these days. There are AI chips, and there are chips that include elements of AI, particularly for inferencing. The big question is how they will affect performance and power, and the answer isn't obvious. There are two main phases of AI: training and inferencing. Almost all training is done in the cloud using extremely large data sets. In fact, ... » read more
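As a rough illustration of why the two phases stress hardware so differently, here is a minimal NumPy sketch (ours, not from the article): training makes many passes over a large data set, while inference is a single cheap forward pass per query.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training phase: repeated passes over a large data set, compute- and energy-heavy.
X = rng.normal(size=(10_000, 8))                  # toy training set
true_w = rng.normal(size=8)
y = X @ true_w + 0.01 * rng.normal(size=10_000)

w = np.zeros(8)
lr = 0.1
for epoch in range(100):                          # repeated gradient updates
    grad = 2.0 / len(X) * X.T @ (X @ w - y)
    w -= lr * grad

# Inference phase: one forward pass per input, far cheaper per query.
x_new = rng.normal(size=8)
prediction = x_new @ w
print(f"prediction: {prediction:.3f}")
```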

Edge Inferencing Challenges


Geoff Tate, CEO of Flex Logix, talks about balancing different variables to improve performance and reduce power at the lowest cost possible in order to do inferencing in edge devices. https://youtu.be/1BTxwew--5U » read more