Machine-learning inference started out as a data-center activity, but tremendous effort is being put into inference at the edge.
At this point, the “edge” is not a well-defined concept, and future inference capabilities will reside not only at the two extremes, the data center and the data-gathering device, but at multiple points in between.
“Inference isn't a function that has to reside…