[TheNextPlatform] Intel Prepares to Graft Google's BFLOAT16 onto Processors - Overclock.net - An Overclocking Community

post #1 of 5 (permalink) Old 07-21-2019, 04:49 PM - Thread Starter
WannaBeOCer

Source: https://www.nextplatform.com/2019/07...to-processors/

Quote:
According to Dubey, IEEE’s FP16 format reduces the dynamic range too much in an effort to keep more bits for precision, but again, that’s not the tradeoff you want for deep learning computations. What often happens is that with FP16, the model doesn’t converge, so you end up needing to tune the hyperparameters – things like the learning rate, batch size, and weight decay.

Thus was born bfloat16, affectionately known as 16-bit “brain” floating point. Developed originally by Google and implemented in its third generation Tensor Processing Unit (TPU), bfloat16 has attracted some important backers. In particular, Intel is implementing bfloat16 instructions in its upcoming Cooper Lake Xeon processors, as well as on its initial Nervana Neural Network Processor for training, the NNP-T 1000.
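The dynamic-range problem the article describes is easy to demonstrate with NumPy's IEEE half-precision type (the specific values below are just illustrative, not from the article):

```python
import numpy as np

# IEEE FP16 spends only 5 bits on the exponent, so its largest finite
# value is 65504 and its smallest subnormal is about 6e-8. Magnitudes
# outside that window are lost entirely:
print(np.float16(70000.0))  # overflows to inf
print(np.float16(1e-8))     # underflows to 0.0

# bfloat16 keeps FP32's 8 exponent bits, so both magnitudes remain
# representable -- just with a coarser fraction.
```

This is the convergence issue Dubey alludes to: once a gradient hits inf or flushes to zero, no amount of precision in the remaining bits helps.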

post #2 of 5 (permalink) Old 07-21-2019, 08:36 PM
Crazy9000
So essentially, what the "neural network" learning programs are more interested in is being in the right ballpark of number size, not necessarily having the exact right number. For example, 11,000 is fine; they don't care as much that the number should actually be 11,213.62.

BFLOAT16 lets them handle larger numbers without needing as much memory as the more precise 32-bit floating point. Cool to see this technology develop more and more.
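That ballpark behavior can be shown directly. Here is a minimal Python sketch (not any particular library's implementation) that produces a bfloat16 value by keeping only the top 16 bits of an FP32 bit pattern:

```python
import struct

def fp32_to_bf16(x: float) -> float:
    """Truncate a value to bfloat16 by keeping the top 16 bits of its
    FP32 encoding (1 sign bit, 8 exponent bits, 7 fraction bits)."""
    bits = struct.unpack('<I', struct.pack('<f', x))[0]
    return struct.unpack('<f', struct.pack('<I', bits & 0xFFFF0000))[0]

# The "exact" number collapses to the nearest-below representable value:
print(fp32_to_bf16(11213.62))  # 11200.0
```

With only 7 stored fraction bits, neighboring bfloat16 values near 11,000 are 64 apart, so 11,213.62 lands on 11,200 -- the right ballpark, not the exact number.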

post #3 of 5 (permalink) Old 07-21-2019, 10:33 PM
huzzug
Quote: Originally Posted by Crazy9000
So essentially what the "neural network" learning programs are more interested in is being in the right ballpark of number size, and not necessarily having the exact right number. So for example, 11,000 is fine, they don't care as much that the number should actually be 11,213.62

BFLOAT16 lets them handle larger numbers without needing as much memory as the overly-precise floating point. Cool to see this technology start to develop more and more.
Wouldn't the AI algo's learning curve help cover the remaining 213 to still get 99% accurate results at a fraction of the cost?

 
post #4 of 5 (permalink) Old 07-21-2019, 10:44 PM
Crazy9000
Quote: Originally Posted by huzzug
Wouldn't the AI algos learning curve help cover the remaining 213 to still get a 99% accurate results at fraction of the cost?
Exactly.

post #5 of 5 (permalink) Old 07-23-2019, 06:50 AM
EniGma1987
Quote: Originally Posted by huzzug
Wouldn't the AI algos learning curve help cover the remaining 213 to still get a 99% accurate results at fraction of the cost?



FP16 has 5 exponent bits and 10 fraction bits; BF16 has 8 exponent bits and only 7 fraction bits. This gives BFloat16 a significantly larger dynamic range, so the initial training can commence much faster. There are multiple training stages, and you would use 16-bit FP calculations for the first stage since you don't necessarily want a ton of precision yet. That stage gets the model performing relatively consistently and in the ballpark of what you expect the result to be. After the initial training you move to higher precision to fine-tune things.

The nice thing about BFloat16 is that it follows the same format as regular FP32, only with a bunch of the fraction bits truncated off. So converting an FP32 to a BF16 and vice versa is extremely easy, because the exponent bits match up, whereas a normal FP16 has a different format.

So the AI isn't really covering up the extra 213 in your example; it isn't even trying to deal with that yet. It's simply a way to get to "11,000" at a much faster rate before switching to the second training stage to find the last 213 and dial it in.
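The "same format, truncated fraction" point can be sketched in a few lines of Python (illustrative bit twiddling only; real hardware conversions also handle rounding modes and NaN payloads):

```python
import struct

def fp32_fields(x: float):
    """Split an FP32 bit pattern into (sign, 8-bit exponent, 23-bit fraction)."""
    bits = struct.unpack('<I', struct.pack('<f', x))[0]
    return bits >> 31, (bits >> 23) & 0xFF, bits & 0x7FFFFF

sign, exp, frac = fp32_fields(3.14)

# bfloat16 reuses the sign and exponent fields unchanged and keeps only
# the top 7 of the 23 fraction bits -- i.e. the top half of the FP32 word:
bf16 = (sign << 15) | (exp << 7) | (frac >> 16)

# Converting back is just shifting those 16 bits into the high half of an FP32:
restored = struct.unpack('<f', struct.pack('<I', bf16 << 16))[0]
print(restored)  # 3.125 -- same sign and exponent, coarser fraction
```

An IEEE FP16 conversion can't take this shortcut: its 5-bit exponent uses a different bias, so every conversion has to re-bias the exponent and handle overflow and subnormals, which is exactly why BF16<->FP32 is the cheaper pair.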
