
[TH]Physics Drivers Outrage - Page 5

post #41 of 77
Quote:
Originally Posted by ENTERPRISE View Post
Actually, GPU physics is a little overrated. Granted, a GPU can do many more calculations per second than a CPU. HOWEVER, a CPU of today is more than capable of handling complex physics calculations without much of a performance hit. Games today simply do not have the amount of physics to warrant GPU calculation, as game physics engines are just not yet that advanced.

I personally think the whole idea of physics processing on a GPU is overrated. However, that's just me.
Physics processing is able to take advantage of massively parallel hardware. CPUs are good at sequential tasks (try writing an OS that will run on a GPU :x), but physics is not one of those tasks. It is true that CPUs are very capable of running the complex calculations; they just aren't good at running the complex, independent calculations for thousands of particles simultaneously.

Physics on the GPU is in no way overrated; we just haven't seen many impressive implementations yet.

GPUs in the past have been restricted to rasterization, but there are many, many other problems for which the massively parallel, floating-point machines known as GPUs are better suited. Folding, collision detection, and physics are a few of the areas in which people (and the hardware manufacturers) are beginning to realize that a GPU has a lot more to offer than a CPU.
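
To make the parallelism point concrete, here is a minimal CUDA sketch (not from the thread, and not how PhysX itself is written) of the kind of workload being described: one thread integrates one particle, and no thread ever has to wait on another. The particle count, struct layout and kernel name are all made up for the example.

Code:
// Minimal sketch of a data-parallel particle update: every particle's step is
// independent, so one GPU thread can integrate one particle with no
// coordination between threads. All names and sizes here are illustrative.
#include <cuda_runtime.h>
#include <cstdio>

struct Particle { float3 pos, vel; };

__global__ void integrate(Particle* p, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    // Plain Euler step under gravity -- no dependence on any other particle.
    p[i].vel.y -= 9.81f * dt;
    p[i].pos.x += p[i].vel.x * dt;
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;
}

int main()
{
    const int n = 100000;                      // tens of thousands of particles
    Particle* d_p;
    cudaMalloc(&d_p, n * sizeof(Particle));
    cudaMemset(d_p, 0, n * sizeof(Particle));  // everything starts at rest at the origin

    dim3 block(256), grid((n + block.x - 1) / block.x);
    for (int step = 0; step < 60; ++step)      // one simulated second at 60 Hz
        integrate<<<grid, block>>>(d_p, n, 1.0f / 60.0f);
    cudaDeviceSynchronize();

    printf("integrated %d particles for 60 steps\n", n);
    cudaFree(d_p);
    return 0;
}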
Edited by rabidgnome229 - 6/26/08 at 3:53am
post #42 of 77
Thread Starter 
Quote:
Originally Posted by ENTERPRISE View Post
Actually, GPU physics is a little overrated. Granted, a GPU can do many more calculations per second than a CPU. HOWEVER, a CPU of today is more than capable of handling complex physics calculations without much of a performance hit. Games today simply do not have the amount of physics to warrant GPU calculation, as game physics engines are just not yet that advanced.

I personally think the whole idea of physics processing on a GPU is overrated. However, that's just me.
Yeah, I think that is just you. Have you ever tried playing a game that was designed from the bottom up for physics hardware and then tried playing it in CPU/software mode? It's a slideshow, each and every single time. Don't believe me? Go play in the Crysis editor, stack up 500 barrels, and see what happens when they start to tumble. Go download Endorphin from NaturalMotion and start playing around with some of the more complex physics scenarios. Go check out some of the CPU vs. GPU comparisons in physics demonstrations. Jesus Christ, how can you not have seen the enormous leaps in performance when we switch over to GPU physics processing? This is about the most uneducated post in this entire thread, and it was made by a mod, no less. Go get educated.

I hope that GPU physics processing will someday soon allow us to go far beyond the equivalent physics of 500 barrels. But that's something that the CPUs of today will not allow us to do without taking severe performance hits.
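
To put a rough number on the barrel example: a naive broad phase has to consider every pair of bodies, so 500 barrels means on the order of 500 × 499 / 2 ≈ 125,000 overlap tests per physics step, before any contact resolution even starts. Below is a brute-force CUDA sketch (purely illustrative; not how Crysis or PhysX actually organises this) that hands each body to one GPU thread, whereas a single CPU core has to walk the same set of pairs one after another.

Code:
// Brute-force broad phase: every pair of bodies gets a bounding-sphere overlap
// test, ~n*(n-1)/2 = ~125,000 tests per step for n = 500. One GPU thread
// handles one body against all later bodies. Names and values are illustrative.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void broadPhase(const float4* bodies, int n, int* overlapCount)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per body
    if (i >= n) return;
    float4 a = bodies[i];                            // xyz = centre, w = bounding radius
    int hits = 0;
    for (int j = i + 1; j < n; ++j) {                // each pair tested exactly once
        float4 b = bodies[j];
        float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
        float r  = a.w + b.w;
        if (dx * dx + dy * dy + dz * dz < r * r) ++hits;
    }
    atomicAdd(overlapCount, hits);
}

int main()
{
    const int n = 500;                               // the infamous barrel stack
    float4* d_bodies;
    int*    d_count;
    cudaMalloc(&d_bodies, n * sizeof(float4));
    cudaMalloc(&d_count, sizeof(int));
    cudaMemset(d_bodies, 0, n * sizeof(float4));     // placeholder positions/radii
    cudaMemset(d_count, 0, sizeof(int));

    broadPhase<<<(n + 255) / 256, 256>>>(d_bodies, n, d_count);

    int count = 0;
    cudaMemcpy(&count, d_count, sizeof(int), cudaMemcpyDeviceToHost);
    printf("%d bodies -> %d pair tests, %d overlaps\n", n, n * (n - 1) / 2, count);

    cudaFree(d_bodies);
    cudaFree(d_count);
    return 0;
}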
post #43 of 77
Thread Starter 
Quote:
Originally Posted by Sikozu View Post
Wow, in this thread people actually managed to get the two main points of why this is bad. Cookies for all.

The physics section is done separately and NOT while other GPU-intensive work is going on, so it is NOT an indication of in-game performance whatsoever.

There are NO decent games that support PhysX, and the ones that do are piss easy to run. UT3 runs maxed out without a problem on an 8800GT at 1680x1050, completely negating the need for any kind of PPU. Most games that support PhysX are about as popular as the Iraq war.
Go download and play UT3: Extreme Physics Mod from Ageia and see how slow it runs on your CPU.
post #44 of 77
Quote:
Originally Posted by Porthios View Post
Yeah, I think that is just you. Have you ever tried playing a game that was designed from the bottom up for physics hardware and then tried playing it in CPU/software mode? It's a slideshow, each and every single time. Don't believe me? Go play in the Crysis editor, stack up 500 barrels, and see what happens when they start to tumble. Go download Endorphin from NaturalMotion and start playing around with some of the more complex physics scenarios. Go check out some of the CPU vs. GPU comparisons in physics demonstrations. Jesus Christ, how can you not have seen the enormous leaps in performance when we switch over to GPU physics processing? This is about the most uneducated post in this entire thread, and it was made by a mod, no less. Go get educated.

I hope that GPU physics processing will someday soon allow us to go far beyond the equivalent physics of 500 barrels. But that's something that the CPUs of today will not allow us to do without taking severe performance hits.
You're wrong and I agree with ENTERPRISE.
First of all, most games designed with physics acceleration were mostly marketing for Ageia cards. Plenty of other games, like HL2, had decent physics that looked realistic and ran well on single-core CPUs.

I have yet to see one game with PhysX that blew me away. Also, early reports said running hardware physics actually decreased FPS in games.
The only game where it ran well was CellFactor, but that game was heavily sponsored by Ageia.

But in comes nVidia and suddenly PhysX is magically the way to go, a lot of great games are using it, bla, bla, bla...
Nonsense! There is plenty to be done on the CPU. GPUs are already overstressed, so dumping more workload on them is not going to offer much of a performance increase, if any. We need to balance the workload, not dump all of it in one place (which is exactly what nVidia wants, so they can come along and claim that the CPU is obsolete and that their vision of a future with no CPUs is the best).

P.S. Don't call people uneducated when you say stuff like this: "Jesus Christ, how can you not have seen the enormous leaps in performance when we switch over to GPU physics processing?"
What enormous leaps in performance, where? I haven't seen any - except 3DMark Vantage scores, which mean close to nothing in real-world applications where physics and 3D rendering are being done at the same time, not separately.
post #45 of 77
Quote:
Originally Posted by Porthios View Post
Yeah, I think that is just you. Have you ever tried playing a game that was designed from the bottom up for physics hardware and then tried playing it in CPU/software mode? It's a slideshow, each and every single time. Don't believe me? Go play in the Crysis editor, stack up 500 barrels, and see what happens when they start to tumble. Go download Endorphin from NaturalMotion and start playing around with some of the more complex physics scenarios. Go check out some of the CPU vs. GPU comparisons in physics demonstrations. Jesus Christ, how can you not have seen the enormous leaps in performance when we switch over to GPU physics processing? This is about the most uneducated post in this entire thread, and it was made by a mod, no less. Go get educated.

I hope that GPU physics processing will someday soon allow us to go far beyond the equivalent physics of 500 barrels. But that's something that the CPUs of today will not allow us to do without taking severe performance hits.
Wow - there is absolutely no need to be so combative. I realize that most forums employ omniscient gods as moderators, but, due to budget constraints, OCN decided to go with people. From your post it is clear that you know all there is to know about computers. The view from the top can be overwhelming, but I humbly ask that you try to be patient with us mere mortals who can only dream of one day reaching your level of education.
post #46 of 77
Quote:
Originally Posted by rabidgnome229 View Post
Physics processing is able to take advantage of massively parallel hardware. CPUs are good at sequential tasks (try writing an OS that will run on a GPU :x), but physics is not one of those tasks. It is true that CPUs are very capable of running the complex calculations; they just aren't good at running the complex, independent calculations for thousands of particles simultaneously.

Physics on the GPU is in no way overrated; we just haven't seen many impressive implementations yet.

GPUs in the past have been restricted to rasterization, but there are many, many other problems for which the massively parallel, floating-point machines known as GPUs are better suited. Folding, collision detection, and physics are a few of the areas in which people (and the hardware manufacturers) are beginning to realize that a GPU has a lot more to offer than a CPU.
Thanks for that. That was very informative and I applaud you on the helpful post. Keep it up.

Quote:
Originally Posted by Porthios View Post
Yeah, I think that is just you. Have you ever tried playing a game that was designed from the bottom up for physics hardware and then tried playing it in CPU/software mode? It's a slideshow, each and every single time. Don't believe me? Go play in the Crysis editor, stack up 500 barrels, and see what happens when they start to tumble. Go download Endorphin from NaturalMotion and start playing around with some of the more complex physics scenarios. Go check out some of the CPU vs. GPU comparisons in physics demonstrations. Jesus Christ, how can you not have seen the enormous leaps in performance when we switch over to GPU physics processing? This is about the most uneducated post in this entire thread, and it was made by a mod, no less. Go get educated.

I hope that GPU physics processing will someday soon allow us to go far beyond the equivalent physics of 500 barrels. But that's something that the CPUs of today will not allow us to do without taking severe performance hits.
Well thank you for pointing this out to me. I will indeed go and get some more information on this topic so I am better informed.

However, I do think that you could have constructed your post A LOT better. Your post is rather rude. I think you could have responded in a much friendlier manner than you did. There is a difference between informing someone and just flaming them for their lack of knowledge on a particular subject. Flaming is immature at best! The fact that I am a MOD does not make this a special case. The fact that I am a MOD does not mean I automatically know EVERYTHING about EVERYTHING! Now, I am a pretty easy-going guy, but if I saw this type of response to any other member on the forums, I would most likely infract you for it. I am hoping you will learn from this.

Now I will go and brush up on my knowledge concerning this subject, etc. However, I think you ought to brush up on your forum etiquette. Just some friendly advice.
post #47 of 77
Quote:
Originally Posted by XenoRad View Post
You're wrong and I agree with ENTERPRISE.
First of all, most games designed with physics acceleration were mostly marketing for Ageia cards. Plenty of other games, like HL2, had decent physics that looked realistic and ran well on single-core CPUs.

I have yet to see one game with PhysX that blew me away. Also, early reports said running hardware physics actually decreased FPS in games.
The only game where it ran well was CellFactor, but that game was heavily sponsored by Ageia.

But in comes nVidia and suddenly PhysX is magically the way to go, a lot of great games are using it, bla, bla, bla...
Nonsense! There is plenty to be done on the CPU. GPUs are already overstressed, so dumping more workload on them is not going to offer much of a performance increase, if any. We need to balance the workload, not dump all of it in one place (which is exactly what nVidia wants, so they can come along and claim that the CPU is obsolete and that their vision of a future with no CPUs is the best).
I agree, but at the same time I see no harm coming from nVidia trying to get physics running on GPUs.
post #48 of 77
Doesn't anyone remember a year or so ago when ATi began talking about running a second, smaller graphics card for physics processing? Like, for example, you had an X1950 XT and an X1650 Pro, and the 1650 handled the physics threads?

[SARCASM]Anyway you guys are all wrong. GeForce 7 series pwnzors the pants off all these new fangled GPGPUs![/SARCASM]

This is the one thing that annoys me when new hardware comes out or someone takes the lead in a certain field of the hardware industry. Everyone gets up in arms because they can't accept the fact that things move on and the amazing card they bought several months ago has been dethroned by that card's direct competitors. Seriously - get a life. Things move on, so embrace it. I think everyone who was on the forums two years ago can remember flame war after flame war when C2D pwned the pants off the K8 architecture!

Anyway, hasn't THG always been a little biased towards the green team?
post #49 of 77
I think the power of GPUs when it comes to physics processing is awesome. However, the limiting factor is the current software out today. There is no actual real-world game that really pushes the boundaries yet... not even Crysis. Yes, it has more going on in-game than most when it comes to physics, but it's not a game you would use to test the potential of GPU physics. I think you can only get a good idea from benchmark apps that are designed with a view to testing physics calculations. Apart from that, I do not think we will see the FULL advantage of GPU physics processing for a while... when it comes to games, etc.
post #50 of 77
Thread Starter 
Quote:
Originally Posted by XenoRad View Post
You're wrong and I agree with ENTERPRISE.
First of all, most games designed with physics acceleration were mostly marketing for Ageia cards. Plenty of other games, like HL2, had decent physics that looked realistic and ran well on single-core CPUs.

I have yet to see one game with PhysX that blew me away. Also, early reports said running hardware physics actually decreased FPS in games.
The only game where it ran well was CellFactor, but that game was heavily sponsored by Ageia.

But in comes nVidia and suddenly PhysX is magically the way to go, a lot of great games are using it, bla, bla, bla...
Nonsense! There is plenty to be done on the CPU. GPUs are already overstressed, so dumping more workload on them is not going to offer much of a performance increase, if any. We need to balance the workload, not dump all of it in one place (which is exactly what nVidia wants, so they can come along and claim that the CPU is obsolete and that their vision of a future with no CPUs is the best).

P.S. Don't call people uneducated when you say stuff like this: "Jesus Christ, how can you not have seen the enormous leaps in performance when we switch over to GPU physics processing?"
What enormous leaps in performance, where? I haven't seen any - except 3DMark Vantage scores, which mean close to nothing in real-world applications where physics and 3D rendering are being done at the same time, not separately.
Are you joking? Where are the giant leaps in performance? In programs that actually exploit the GPU's processing power. The only ones I've seen so far are the ones from researchers at universities.

So why haven't we seen any games out that take advantage of the GPU? BECAUSE IT'S A BRAND NEW TECHNOLOGY THAT JUST GOT INTEGRATED VIA PATCH TO A SMALL MINORITY OF COMPUTERS A FEW WEEKS AGO! Now, was that so hard to come up with on your own?

How many 3D titles were there at the advent of VGA cards about 20 years ago? Not many. Why? Well, why the hell would a developer make a game that nobody could play, due to a lack of hardware? Fast forward two years, when everyone and their mother had a VGA card, and, all of a sudden, most developers were developing 3D titles. SURPRISE, SURPRISE!!! So is it really that big of a surprise that you aren't seeing many games out today that feature groundbreaking, advanced physics that exploit a GPU to the max? Hell no. That wouldn't be economically feasible. But fast forward one or two years into the future, after most people have a card capable of running advanced physics without a big graphical performance hit, and you're going to start seeing some really amazing stuff. That's my bet anyway--and the bet of the people at Epic. If you disagree, go write Mark Rein a letter. Maybe you can change his mind.
Edited by Porthios - 6/26/08 at 5:51am