
[TGD] People “probably won’t” need discrete graphics cards anymore – Intel

post #1 of 7
Thread Starter 
Quote:
Shanghai (China) – The days of discrete graphics cards are coming to an end, according to an Intel representative we talked to at the Shanghai Intel Developer Forum. Ron Fosner, an Intel Graphics and Gaming Technologist and former video game programmer, told TG Daily that multi-core CPUs will put an end to multi-GPU madness and that people “probably won’t need” discrete cards in the future.

Fosner made these comments while demonstrating Intel’s ‘Smoke’ demo at the Technology Showcase at IDF. We posted a short version of the demo a few days ago and you can watch that video here. Today we went back to get some more detailed answers.

The demo is actually incomplete: it was supposed to simulate fire fighters putting out a raging inferno. “We didn’t put in the fire fighters yet,” said Fosner. So instead of triumphant fire fighters, Intel chose to rain fiery destruction onto a house made up of thousands of individual pieces, all simulated on a four-core Nehalem processor running two threads per core.

Nothing is pre-baked in this demo and the meteors don’t land in the same spot on each run. Lucky for us, Fosner managed to smash a meteor through the front porch of the shack. He explained that the demo then procedurally simulated fire as a particle emitter system with bounding boxes. When the particles hit a tree branch bounding box, the branch then becomes an emitter itself.

The demo also simulates animals such as deer and rabbits running around on the ground, along with dozens of birds flying overhead. Threads are devoted to creature AI, and the animals try to run away from the flames, sometimes unsuccessfully. Multi-core systems could lead to more realistic crowd animations and better weather and environment effects, according to Fosner.

Fosner told us that multi-core CPUs are more than capable of rendering complex scenes that used to be reserved for top-end graphics cards. He argued that Intel processors offered “more bang for the buck” and that it was more economical to go from single to multiple core processors versus popping multiple graphics cards into a machine. “The fact of the matter is that you’re going to have one graphics card, you may have a dual graphics card, but you’re not going to have a four graphics card or eight graphics card system,” said Fosner.

Another advantage to CPU graphics and physics programming is that people won’t need to continually keep up with the latest programming techniques of all the newest cards – this means futzing around with shader models and DirectX programming will be a thing of the past. Fosner said that “everybody” knows how to program for a CPU and that this new way of programming will “get rid of” a path of graphics obsolescence.

When asked if discrete graphics cards will be needed in the future, Fosner answered, “Probably not”. He explained that computers didn’t have discrete graphics in the ’80s and that CPUs are becoming powerful enough to take over that role.
Source and Video Interview
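
For what it's worth, the propagation scheme Fosner describes in the article (every burning object is a particle emitter, and a particle landing inside another object's bounding box turns that object into an emitter too) boils down to a fairly small loop. Here's a rough single-threaded sketch of the idea. It's illustrative only, not Intel's actual Smoke code, and all the names (FireSim, SceneObject, and so on) are made up; the real demo would spread the particle and AI work across Nehalem's eight hardware threads.

Code:
// Illustrative only, not Intel's Smoke source. Fire spreads by treating every
// burning object as a particle emitter; a particle that drifts into another
// object's bounding box ignites it, so that object starts emitting too.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <random>
#include <vector>

struct Vec3 { float x, y, z; };

struct Aabb {
    Vec3 min, max;
    bool contains(const Vec3& p) const {
        return p.x >= min.x && p.x <= max.x &&
               p.y >= min.y && p.y <= max.y &&
               p.z >= min.z && p.z <= max.z;
    }
};

struct SceneObject {
    const char* name;
    Aabb        bounds;
    bool        burning = false;   // burning objects act as emitters
};

struct Particle { Vec3 pos, vel; float life; };

class FireSim {
public:
    explicit FireSim(std::vector<SceneObject> objects)
        : objects_(std::move(objects)), rng_(42) {}

    void ignite(std::size_t i) { objects_[i].burning = true; }

    void step(float dt) {
        emit();                                   // burning objects spawn embers
        for (Particle& p : particles_) {
            p.pos.x += p.vel.x * dt;              // advect the particle
            p.pos.y += p.vel.y * dt;
            p.pos.z += p.vel.z * dt;
            p.life  -= dt;
            for (SceneObject& obj : objects_)     // did it land in something flammable?
                if (!obj.burning && obj.bounds.contains(p.pos)) {
                    obj.burning = true;
                    std::printf("%s caught fire\n", obj.name);
                }
        }
        particles_.erase(std::remove_if(particles_.begin(), particles_.end(),
                             [](const Particle& p) { return p.life <= 0.f; }),
                         particles_.end());       // drop burnt-out particles
    }

private:
    void emit() {
        std::uniform_real_distribution<float> jitter(-1.f, 1.f);
        for (const SceneObject& obj : objects_)
            if (obj.burning)
                particles_.push_back({
                    { (obj.bounds.min.x + obj.bounds.max.x) * 0.5f,   // top centre of the object
                       obj.bounds.max.y,
                      (obj.bounds.min.z + obj.bounds.max.z) * 0.5f },
                    { jitter(rng_), 2.f, jitter(rng_) },              // embers drift mostly upward
                    2.f });                                           // seconds of life
    }

    std::vector<SceneObject> objects_;
    std::vector<Particle>    particles_;
    std::mt19937             rng_;
};

int main() {
    FireSim sim({ { "porch",  { {0.f, 0.f, 0.f}, {2.f, 1.f, 2.f} } },
                  { "branch", { {0.f, 2.f, 0.f}, {1.f, 5.f, 1.f} } } });
    sim.ignite(0);                               // the meteor strike lights the porch
    for (int i = 0; i < 200; ++i) sim.step(0.05f);
}

Scale the object count up to the thousands of debris pieces in the demo and it's clear where the extra cores would go: each hardware thread can advect its own slice of the particle list.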
ElRigTheRig (13 items)
CPU: Q6600 G0 @ 2.4 GHz | Motherboard: Asus Maximus Formula | Graphics: Asus EN8800GT 512MB | RAM: 4GB G.Skill DDR2-1000
Hard Drive: 1.858TB (3x JBOD) | Optical Drive: HP 16x DVD Burner | OS: Vista Ultimate x64 SP2 | Monitor: 24" FPD2485 Gateway LCD
Keyboard: Razer Lycosa | Power: Ultra X3 1000w | Case: Antec P182 | Mouse: Logitech G5
Mouse Pad: Razer Xact Mat
post #2 of 7
omg NICE! Hope the Thinkpad I'ma buy will be good enough. Rep+
post #3 of 7
Hey, hate to be a bubble buster, but it was already posted. Different source, but same quote from Shanghai.

http://www.overclock.net/hardware-ne...ome-thing.html
Quad Rig (15 items)
CPU: i5 3570K | Motherboard: Gigabyte GA-Z77x-UD5H | Graphics: Sapphire 5850 | RAM: 16 GB G.skill 1600
Hard Drive: Crucial M4 256gb | Optical Drive: 2x Samsung 18x DVD-RW | Cooling: Custom Watercooling | OS: Windows 7 Home
Monitor: 27" Asus 1080p | Keyboard: Logitech G710+ | Power: Corsair 750TX | Case: CoolerMaster Stacker 810
Mouse: Logitech G500s | Mouse Pad: Steelseries | Audio: Klipsch Promedia 5.1
post #4 of 7
"Computers didn't have discrete cards in the '80s" is part of that dude's explanation... what a fail comment

I'm sure that Nvidia isn't going anywhere for a long, long time.... regardless of CPU power...
post #5 of 7
Ahh, more of Intel poking Nvidia. Such friendly adversaries.
    
CPU: Q9000 | Motherboard: Uh, iunno | Graphics: SLI 280m | RAM: 6GB DDR3
Hard Drive: 500gb Raid0 | OS: Windows 7 64bit | Monitor: 17" 1920x1200 | Keyboard: Logitech MK700
Case: Alienware M17X body | Mouse: Logitech G9 | Mouse Pad: Xtrac Ripper
    
post #6 of 7
People won't need discrete GPUs once quantum computing happens... in about 20 years maybe... or when the human brain is the computer (~30 years, although the first brain-powered devices will appear this holiday season (courtesy of OCZ's brain-powered game controller / mouse deal)).
post #7 of 7
Quote:
Originally Posted by R3ap3R View Post
People won't need discrete GPUs once quantum computing happens... in about 20 years maybe... or when the human brain is the computer (~30 years, although the first brain-powered devices will appear this holiday season (courtesy of OCZ's brain-powered game controller / mouse deal)).
And why not? It seems the only application for quantum computing so far will be in encryption. If there is a use for quantum computers when it comes to gaming (AI springs to mind*), please share it and enlighten us; if not, please don't pull an Inquirer.


* = Perhaps they could use the power of a quantum chip to figure out what moves you are going to make, select the most probable ones, and then react accordingly? It should be pretty fast to calculate, since a qubit can be a 1 and a 0 at the same time.
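
For reference, the "a 1 and a 0 at the same time" bit is shorthand for superposition: a single qubit's state is conventionally written as |ψ⟩ = α|0⟩ + β|1⟩ with |α|² + |β|² = 1, and measuring it collapses it to 0 or 1 with probabilities |α|² and |β|². So it's less "both values at once" and more "a weighted mix until you measure it".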