Overclock.net › Forums › Graphics Cards › Graphics Cards - General › I'm not convinced the "Let's offload everything on the Graphics Card" culture is exactly benign.

I'm not convinced the "Let's offload everything on the Graphics Card" culture is exactly benign.

post #1 of 3
Thread Starter 
I'm sure anyone who has followed graphics card technology over the past 10-20 years, even superficially, will have noticed the obvious tendency to offload as much as possible onto the graphics card. It started with textures, moved on to geometry and shaders (each process improving and extending with every iteration), and nowadays shaders are whole beasts. They're called "shaders", but in reality they are whole programs responsible for almost any visual effect you see on screen.
Hell, NVIDIA plans to put a whole general-purpose processor on the graphics card, extending the importance of shaders even further.
Now, I'm not going to claim this culture is necessarily bad; after all, I'm not an expert, and it has produced results.
But it's hard not to notice the obvious: 1. it benefits graphics card manufacturers enormously, and 2. Intel is hardly bad at the rest of the system, so I'm not convinced that keeping it out of the loop is technologically wise. Besides, most people blame Intel for poor graphics performance, but lately that's just an APU, without its own heatsink, without its own memory, and so on. Now, I'm not claiming an Intel monopoly would be good, so there's a benefit right there. But the fact of the matter is, I see a general push towards graphics cards "playing whole computer" while their parent companies aren't as good at shrinking the transistor as Intel (mainly because they don't have their own foundries).
In case anyone doubts how extreme the graphics card's contribution to live interactive rendering has become, one only has to look at the extreme divergence between graphics and general-purpose benchmarks. For example, fast system RAM is almost useless in gaming, since most of the needed assets are pre-loaded into VRAM before rendering even begins; in contrast, applications that deal with massive amounts of memory, e.g. offline video editing, benefit enormously from it.
As a result we get phenomena like gaming consoles using ONLY video RAM as system RAM, and on most PCs a divide between subsystems: half of the computer (the graphics subsystem or the general-purpose subsystem) sits almost useless on one occasion, and the reverse happens on another.
Maybe it's time to have only one "subsystem" and not waste resources like that.
Edited by fateswarm - 7/2/13 at 6:43pm
PC
(9 items)
CPU: 4790K 4.6G 1.24v/1.74v, 5.1G 1.35v validation
Motherboard: GA-Z97X-Gaming 7
Graphics: Tri-X R9 290 1100/1350 +0.012v
RAM: G.Skill 2400 c10
Cooling: Noctua NH-D15
Monitor: 24EA53 IPS 76Hz OC
Power: EVGA 1000 G2
Case: Phanteks Enthoo Pro
Audio: SoundMagic E10
post #2 of 3
Quote:
Maybe it's time to have only one "subsystem" and not waste resources like that.

The only companies it benefits are full-SoC makers and AMD, with its HSA-enabled APUs.

GPUs have much more floating-point performance and far more parallelism than CPUs.

That's why things like ray tracing and fluid simulation use GPUs.
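To illustrate the point about parallelism (this sketch is mine, not from the thread): workloads like ray tracing and fluid simulation are data-parallel, meaning each output element depends only on its own inputs, so a GPU can assign one of its thousands of threads to each element. The classic SAXPY kernel, written serially in Python here, shows the shape of such a workload; on a GPU every iteration of the loop would run as its own thread.

```python
def saxpy(a, x, y):
    """Compute y[i] = a * x[i] + y[i] for every i.

    No element depends on any other, so on a GPU each index i
    could be computed by an independent thread simultaneously.
    """
    return [a * xi + yi for xi, yi in zip(x, y)]

result = saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0])
# result == [12.0, 24.0, 36.0]
```

A branchy, serial workload (e.g. parsing, or game AI with lots of dependent decisions) has no such structure, which is why the CPU keeps those jobs.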
Edited by AlphaC - 7/2/13 at 7:15pm
PGA 1331
(13 items)
CPU: AMD Zen SR7 octocore (Ryzen 7 1700)
Motherboard: Overclockable AM4 motherboard X370
Graphics: To be determined, AMD Vega?
RAM: 2x8GB DDR4 low-profile or heatsink-less
Hard Drive: Samsung 950 Pro / 960 Evo / 960 Pro 256GB or 51...
Hard Drive: Samsung 850 Evo 1TB SSD Storage
Cooling: Black or black+white Twin tower air cooler or s...
Cooling: EK Vardar F2-140 140mm, Phanteks PH-F140SP 140m...
Cooling: Fractal Design Dynamic GP14 (included with case)
OS: Win 10 Pro 64 bit
Monitor: 4K monitor with Freesync
Power: EVGA Supernova G3/P2 750W or 850W
Case: Fractal Design Define R5 Blackout edition
post #3 of 3
In theory, if GPU tech becomes more efficient than CPU tech at performing common tasks, that's the road that will be taken not only by GPU manufacturers but by CPU manufacturers as well (as witnessed by AMD's APUs?).
Technology progression seems to be a bit like a form of energy: like all types of energy, it flows along the path of least resistance (in this case, calculation efficiency).

That being said, due to the differences in the calculations involved, I don't think the CPU will "die" in favor of a GPU architecture; rather, the two will continue to work together to find the most efficient form of calculation for different tasks.

I think most people (average users/consumers) share your view in a way.
As a whole, people don't like change; they like to stick with what they know works, so the standardized common computer as we see it today will stay the same for quite some time to come.

Just imagine the socket on the motherboards of the future housing a GPU chip, with your CPU sitting in a PCIe slot.

I guess it would in turn be called an O-CPU (Off-Central Processing Unit).
Lol

EDIT
I guess I have more to say. You mentioned consoles using only VRAM instead of system RAM, which is a little off-topic to your main argument (which seems to center on wasting resources), since such a system is fully optimized for one thing:
gaming.
If a resource isn't needed, it is excluded so as not to waste it.


Gaming on a computer is vastly different, but you also have to remember that the CPU constantly has to feed the GPU data. There are very few programs that use GPU calculations without being heavy on the CPU as well (if I'm not mistaken), such as folding, Bitcoin mining, Litecoin mining, etc.
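To make the "CPU feeds the GPU" point concrete (a toy sketch of my own, with all names invented): in a typical game, the CPU rebuilds a list of draw commands every single frame, culling invisible objects and deciding what the GPU should render, before the GPU can do anything at all. That per-frame CPU work is why a game is never purely GPU-bound.

```python
def build_frame_commands(objects, camera_x):
    """Hypothetical per-frame CPU work: cull objects far from the
    camera, then encode a draw command for each visible one.
    Only after this list exists can the GPU start rendering."""
    visible = [o for o in objects if abs(o["x"] - camera_x) < 100]
    return [("draw", o["mesh"]) for o in visible]

commands = build_frame_commands(
    [{"x": 10, "mesh": "tree"}, {"x": 500, "mesh": "rock"}],
    camera_x=0)
# commands == [("draw", "tree")] -- the distant rock is culled on the CPU
```

If the CPU can't finish this kind of work fast enough each frame, the GPU simply waits, no matter how powerful it is.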
Edited by AlaskaFox - 7/2/13 at 7:37pm