Originally Posted by Jarhead
No, you're not taking me literally enough. When I say Blender I really do mean Blender. LTT's numbers showed the 2080TI losing to a Vega64 in specifically Blender, and outright getting embarrassed by one of these. And it plays games. If that appeals to you (it actually does to me) you might wanna start saving just in case AMD decides to do another run of these. But my priority this year is a Ryzen 3000 base-rig. The GPU/display is secondary.
LTT most likely didn't use a CUDA-accelerated build of Blender that supports the RTX cards.
Radeon VII cards are just Vega 20 chips that failed ECC validation. Vega 20 was designed for deep learning and HPC, and its support was built from the ground up on Linux using the ROCm stack and drivers. There are a ton of tasks where Vega 20 surpasses Nvidia, but Blender isn't one of them.
Edit: I want to add that the Radeon VII is a great value for both single-precision training and double-precision scientific research, which is what the architecture was designed for with the Instinct cards.
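To make the single- vs double-precision point concrete: Radeon VII runs FP64 at 1/4 of its FP32 rate, while most consumer GeForce cards are cut down to around 1/32, which is why it punches above its price in double-precision work. A minimal CPU-side sketch with NumPy (the `gflops` helper is my own, just for illustration) shows how you'd measure the achieved throughput gap between the two precisions on any given hardware:

```python
import time
import numpy as np

def gflops(dtype, n=512, reps=10):
    """Time an n x n matmul and return achieved GFLOP/s for the given dtype."""
    a = np.random.rand(n, n).astype(dtype)
    b = np.random.rand(n, n).astype(dtype)
    a @ b  # warm-up so lazy initialization doesn't skew the first timing
    t0 = time.perf_counter()
    for _ in range(reps):
        a @ b
    dt = (time.perf_counter() - t0) / reps
    # A matmul of two n x n matrices costs about 2*n^3 floating-point ops
    return 2 * n**3 / dt / 1e9

fp32 = gflops(np.float32)
fp64 = gflops(np.float64)
print(f"FP32: {fp32:.1f} GFLOP/s, FP64: {fp64:.1f} GFLOP/s")
```

The same idea applies on the GPU; the FP64/FP32 ratio you measure is what the 1:4 vs 1:32 marketing numbers are describing.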
Core i7 6700K 4.8GHz @ 1.4v
Maximus VIII Formula
Radeon RX VEGA 64 @ 1750/1105MHz with 1175mV
Samsung 850 Evo 2TB Raid 0
EVGA SuperNova 1200w P2
EK Supremacy Full Copper Clean
EK-FC Radeon Vega
XSPC D5 Photon v2
Black Ice Gen 2 GTX360 x2
Thermaltake Core X5 Tempered Glass Edition
Windows 10 Pro
Cherry MX Board 6.0
Definitive Technology Incline
Last edited by WannaBeOCer; 02-10-2019 at 12:57 AM.