So essentially, "neural network" training programs care more about being in the right ballpark than about having the exact right number. For example, 11,000 is fine; it doesn't matter as much that the value should actually be 11,213.62.
BFLOAT16 keeps float32's full exponent range (so it can still represent very large and very small numbers) in half the memory by dropping most of the precision bits — 7 mantissa bits instead of 23. Compared to regular FP16 it covers a much wider range of magnitudes. Cool to see this technology develop more and more.
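To make the "ballpark" point concrete, here's a quick Python sketch (not any library's official implementation) that emulates bfloat16 rounding: it keeps a float32's sign bit, all 8 exponent bits, and only the top 7 mantissa bits, rounding the rest away.

```python
import struct

def to_bfloat16(x: float) -> float:
    """Round x to the nearest bfloat16 value by keeping only the top
    16 bits of its float32 representation (1 sign + 8 exponent + 7 mantissa)."""
    bits = struct.unpack("<I", struct.pack("<f", x))[0]
    # Round-to-nearest-even on the low 16 bits being discarded.
    bits += 0x7FFF + ((bits >> 16) & 1)
    bits &= 0xFFFF0000
    return struct.unpack("<f", struct.pack("<I", bits))[0]

print(to_bfloat16(11213.62))  # → 11200.0 — "right ballpark," not exact
```

Near 11,000 the representable bfloat16 values are 64 apart, so 11,213.62 collapses to 11,200 — exactly the kind of rounding the training math tolerates.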