Accelerating Endpoint Inferencing


Chipmakers are getting ready to debut inference chips for endpoint devices, even though the rest of the machine-learning ecosystem has yet to be established. Whatever infrastructure does exist today is mostly in the cloud, on edge-computing gateways, or in company-specific data centers, and most companies continue to rely on it. For example, Tesla has its own data center. So do most major carmakers...

Processing Moves To The Edge


Edge computing is evolving from a relatively obscure concept into an increasingly complex component of a distributed computing architecture, in which processing is being shifted toward end devices and satellite data facilities and away from the cloud. Edge computing has gained attention in two main areas. One is the industrial IoT, where it serves as a do-it-yourself...