
[NVIDIA]Microsoft going All-in on GPU computing - Page 10

post #91 of 151
That's maxing out an i7 920 completely just to get around 5 fps... You can't possibly expect it to run a game at the same time when the CPU is already maxed out. Take Mafia II, for example: on an 1100T @ 4.2GHz, as soon as you turn on PhysX running on the CPU, the frame rate drops and the game becomes unplayable. You NEED a co-processor to handle PhysX properly. BC2's physics are so minimal, and the preset destruction is anything but realistic. I love the game, but the physics are a joke and they chew up CPU like nothing else.
post #92 of 151
Quote:
Originally Posted by Cyrilmak View Post
Yeah you're talking to yourself again aren't you....

To date, no other physics engine has been able to go head to head with PhysX in realism. There is proof all over the internet, including in the post above me. You're either misinformed or just thick in the head.
This is all well and good, but can you give me a list of games that have made decent use of PhysX since Nvidia bought out Ageia?

And I am not talking about specialized tech demos. Those are made to look great to sucker people in.

Nvidia are really on their own with this one (I think they call it humping the pooch), because I have yet to see a decent use of it that warrants the premium of buying an Nvidia card.

At least this news puts everyone, AMD and Nvidia alike, on a level playing field, the reason being that OpenCL plays no favorites; it just is.

Also, to those out there thinking that Nvidia are the only ones that can use OpenCL at the moment and all that crap, all I have to say is "STREAM".
Edited by smash_mouth01 - 6/23/11 at 6:05pm
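
To make the vendor-neutrality point concrete, a plain OpenCL vector-add looks roughly like the sketch below; the same host API and kernel source are handed to whichever vendor's driver is installed, AMD (Stream/APP) or Nvidia alike. This is only a rough illustration with made-up sizes and no error checking:

Code:
#include <CL/cl.h>
#include <cstdio>

// Kernel source is plain OpenCL C; it is compiled at runtime by the
// installed vendor driver (AMD, Nvidia, Intel, ...), not tied to one GPU brand.
static const char* src =
    "__kernel void vadd(__global const float* a,\n"
    "                   __global const float* b,\n"
    "                   __global float* c) {\n"
    "    size_t i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main() {
    const size_t N = 1024;                       // made-up problem size for illustration
    float a[N], b[N], c[N];
    for (size_t i = 0; i < N; ++i) { a[i] = float(i); b[i] = 2.0f * float(i); }

    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, nullptr);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, nullptr);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, nullptr);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, nullptr, nullptr);
    clBuildProgram(prog, 1, &dev, nullptr, nullptr, nullptr);
    cl_kernel k = clCreateKernel(prog, "vadd", nullptr);

    // Device buffers initialized from host arrays.
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, nullptr);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, nullptr);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, nullptr, nullptr);
    clSetKernelArg(k, 0, sizeof da, &da);
    clSetKernelArg(k, 1, sizeof db, &db);
    clSetKernelArg(k, 2, sizeof dc, &dc);

    // One work-item per element, then read the result back.
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, nullptr, &global, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, nullptr, nullptr);
    std::printf("c[1] = %f\n", c[1]);            // expect 3.0
    return 0;
}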
post #93 of 151
PhysX is indeed a joke. I would never buy a dedicated card for it, and I don't use it in the 0.1% of games that feature it. A little extra eye candy at the cost of a ton of fps. If Nvidia weren't greedy, everyone could enjoy it. I believe a few years back ATI announced an open-source physics engine they were going to develop that would put an end to PhysX. Not sure what happened to that, but I hope they still plan on it.
best (12 items)
CPU: 7700K @ 4.5GHz | Motherboard: ASRock Z170 OC Formula | Graphics: EVGA GTX 1080 SC2 iCX | RAM: 16GB @ 4GHz
Hard Drive: Samsung 850 Pro 256GB | OS: Windows 8.1 Pro x64 | Monitor: BenQ XL2411Z | Keyboard: Filco MJ2 TKL Red PS/2
Power: Seasonic X650 | Case: Corsair 750D | Mouse: Zowie FK1 | Other: CyberPower CP1500PFCLCD
post #94 of 151
Quote:
Originally Posted by PoopaScoopa View Post
That's maxing out an i7 920 completely just to get around 5 fps... You can't possibly expect it to run a game at the same time when the CPU is already maxed out. Take Mafia II, for example: on an 1100T @ 4.2GHz, as soon as you turn on PhysX running on the CPU, the frame rate drops and the game becomes unplayable. You NEED a co-processor to handle PhysX properly. BC2's physics are so minimal, and the preset destruction is anything but realistic. I love the game, but the physics are a joke and they chew up CPU like nothing else.
I'm sorry, apparently you didn't grasp the magnitude and reality of that video. The point was to show that you can, in real time, ramp up the physical objects and soft bodies and take them to that level using a modern CPU.

Maxing it out and showing 3,000 boxes swirling in-game (yes, that was in-game) illustrates how powerful the CPU is at physics, and how scalable it is. 3,000 boxes plus soft bodies was dramatic overkill, a "look what we can do" instead of just 30 objects; there was no need for that much power. And, again, that was two years ago... what will CPUs be able to do starting next year? PhysX was a marketing gimmick... you got burnt.

Lastly, Battlefield: Bad Company 2's physics are not "pre-set destruction" (do you mean pre-rendered?). If you had played the game, or actually educated yourself, you'd know this. By the way, Battlefield has had CPU physics since BF1942.

Yes, there is pre-rendered destruction, but DICE is not trying to pass that off as physics. That is just animation on large Destruction 2.0 objects. But then again, if you had actually played the game or knew what you were talking about, you would understand the difference.
post #95 of 151

The only thing running in that "game" was the physics. There was a poorly rendered environment that pales in comparison even to CoD's graphics, which are outdated. You've been able to do the same thing with PhysX on the CPU for years. What's your point? It's slow and choppy. The Havok destruction in BC2 is minimal and still consumes most of the computing resources in multiplayer. I'm level 50, so obviously I've played the game plenty and still enjoy it. The Frostbite engine uses the Havok physics engine during explosions, which, yes, includes Destruction 2.0.


Using 80-85% of a 2600K @ 4.5GHz and only producing minimal effects just goes to show how poorly x86 performs with the Havok engine. PhysX has been able to produce far more realistic effects since back when AGEIA still owned the technology in 2005. I'm glad Havok is being used in games, since you don't need an Nvidia GPU, but there's no comparison in performance between the two. As CPUs become more powerful, so will GPUs, and they're far better at parallel processing than x86 will ever be. I can't wait until a realistic beach scene in Crysis 3, or whatever, comes out using some of this new PhysX 3.0 development.
Edited by PoopaScoopa - 6/23/11 at 8:50pm
post #96 of 151
Quote:
Originally Posted by PoopaScoopa View Post
The only thing running in that "game" was the physics. There was a poorly rendered environment that pales in comparison even to CoD's graphics, which are outdated. You've been able to do the same thing with PhysX on the CPU for years. What's your point? It's slow and choppy. The Havok destruction in BC2 is minimal and still consumes most of the computing resources in multiplayer. I'm level 50, so obviously I've played the game plenty and still enjoy it. The Frostbite engine uses the Havok physics engine during explosions, which, yes, includes Destruction 2.0.

http://www.youtube.com/watch?v=C0AnQwO1UiM
http://www.youtube.com/watch?v=NOUhtQms5wk

Using 80-85% of a 2600K @ 4.5GHz and only producing minimal effects just goes to show how poorly x86 performs with the Havok engine. PhysX has been able to produce far more realistic effects since back when AGEIA still owned the technology in 2005. I'm glad Havok is being used in games, since you don't need an Nvidia GPU, but there's no comparison in performance between the two. As CPUs become more powerful, so will GPUs, and they're far better at parallel processing than x86 will ever be. I can't wait until a realistic beach scene in Crysis 3, or whatever, comes out using some of this new PhysX 3.0 development.

I can see you don't grasp the difference between GPU physics and CPU physics, or how they relate to PhysX and the marketing of PhysX.

That demo I linked (I'll check when I get home to see if it was the right link) does what would take tri-SLI to accomplish. Which is great for Nvidia's bank account, but illogical, and thus moot. That is why you only ever see "particle physics" used in PhysX games. Read the thread and learn; don't retort without understanding the premise.

Havok has nothing to do with Nvidia. I don't see why you keep bringing it up in your defense.


Battlefield has physics all over the place. Every bullet is real, and trash cans, cones, etc. can be peppered with bullets and moved all over the map. They're real objects.

On the contrary, it is very hard for PhysX to do deformable and real object-based physics. Nearly any demo doing so required multiple video cards; that is why Nvidia sticks with particle physics, like shuffling paper and waving banners in Batman... because doing any more would require a second video card! And that was/is the whole point behind SLI and PhysX: to get people to buy into the marketing and make Nvidia richer with a gimmick.

Don't believe me? Name one PhysX game with deformable physics... heck, name one PhysX game with real objects... lol.


PhysX is a gimmick. The more you try to rebut, the more you illustrate that you have no fundamental grasp of what a physical environment consists of, and you keep regurgitating the same old rhetoric.
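
For what it's worth, the "particle physics" being argued about here is the embarrassingly parallel kind: each particle is updated independently of every other one, so the work maps trivially onto thousands of GPU threads (or onto several CPU cores). A rough sketch of such an update step, with a made-up structure and constants:

Code:
#include <vector>
#include <cstddef>

struct Particle { float x, y, z, vx, vy, vz; };   // illustrative layout only

// Every particle is integrated on its own: no particle reads another
// particle's state, so this loop parallelizes with no synchronization at all.
void step_particles(std::vector<Particle>& ps, float dt) {
    const float g = -9.81f;                              // gravity
    for (std::size_t i = 0; i < ps.size(); ++i) {
        Particle& p = ps[i];
        p.vz += g * dt;                                  // integrate velocity
        p.x  += p.vx * dt;                               // integrate position
        p.y  += p.vy * dt;
        p.z  += p.vz * dt;
        if (p.z < 0.0f) { p.z = 0.0f; p.vz *= -0.5f; }   // crude ground bounce
    }
}

Rigid-body and deformable physics are a different story: contacts and constraints couple objects together, so each solver iteration depends on its neighbours, which is part of why that kind of work has traditionally stayed on the CPU in engines like Havok.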
post #97 of 151
While I love my Nvidia card, PhysX is most definitely a marketing chip. Nvidia wanted something exclusive that makes them look better than their competitors, so that we can all pat ourselves on the back about how smart we were to buy a PhysX-enabled GPU for our "long" list of PhysX-enabled games.

Nvidia could play nice, but that would make them less money.
World Builder 2.0 (12 items)
CPU: Intel 3770K | Motherboard: ASRock Z77 Extreme 4 | Graphics: ASUS GTX560ti (384c) | RAM: G.Skill Ares 16GB
Hard Drive: OCZ 60GB Vertex 2 + OCZ 120GB Vertex 3 + Samsun... | Cooling: CM Hyper 212 Plus | OS: Win7 Pro | Monitor: Asus 23" + Samsung 19"
Keyboard: CM QuickFire Rapid (Black Switch) | Power: Corsair TX750w | Case: Corsair 500r (Black) | Mouse: Logitech G500
post #98 of 151
Quote:
Originally Posted by RagingCain View Post
Don't they make very good high-end, mid-range, and low-end video gaming cards?

Does anybody have a source that shows nVidia also makes video gaming cards?

I could have sworn that's what made them the giant they are today...
Sorry to burst your bubble, but Nvidia is an overpriced ABS plastics manufacturer. They just happen to attach PCBs/MOSFETs/capacitors/GPUs/flash memory modules etc. to the plastic, and those just happen to function as a GPU as well, but they are indeed an ABS plastics manufacturer, NOT a GPU manufacturer.

Gosh, man, use Google.
Sheep Prodigy (13 items)
CPU: Intel i7-960 4.3 GHz | Motherboard: Gigabyte GA-EX58-UD4P | Graphics: AMD HD 6970 2GB GDDR5 | RAM: 2x4GB DDR3 1333 MHz
Hard Drive: Samsung Spinpoint F3 1TB | Optical Drive: DVD/DW +- RW/ Bluray | OS: W7 64-bit | Monitor: 21.5" e-IPS U2211H
Keyboard: Dynex cheapie | Power: Antec TP-550W | Case: Corsair Obsidian 650D | Mouse: Logitech G500
Mouse Pad: google o.o
post #99 of 151
Quote:
Originally Posted by Daniel Moth, MSFT...
is modern C++. Not C or some other derivative.
Did the Moth just imply that C is a derivative of "modern" C++?
My giant (13 items)
CPU: E6600 lapped | Motherboard: Asus P5n32-E 680i | Graphics: GeForce 9800GT 512MB | RAM: 2x1GB (unused ATM) & 2x2GB Corsair XMS2 PC6400
Hard Drive / OS / Monitor / Power: WDJS SATA-II 160GB + WD80GB + WDAAKS Raid0 320GB | Aperature FSII 19" LCD | Apevia DarkSide 600W
Case: NZXT Zero
post #100 of 151
Quote:
Originally Posted by Kirmie View Post
Did the Moth just imply that C is a derivative of "modern" C++?
No, he said the language is C++. He is saying the programming is not done in C or other derivatives like Objective-C. Nvidia's CUDA language is based on C, IIRC.
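
For context, the C++ AMP model Moth is describing expresses the GPU part as ordinary C++ (a lambda marked restrict(amp)) rather than a separate C-style kernel language, which is the point of the "it's just modern C++" remark. A minimal sketch of that style, assuming Visual C++ with <amp.h> and made-up data, might look like this:

Code:
#include <amp.h>       // C++ AMP runtime (assumes Visual C++)
#include <vector>
#include <iostream>
using namespace concurrency;

int main() {
    std::vector<float> a(1024, 1.0f), b(1024, 2.0f);   // made-up data for illustration

    // array_view wraps host memory; the runtime copies it to the accelerator as needed.
    array_view<float, 1> av(static_cast<int>(a.size()), a);
    array_view<const float, 1> bv(static_cast<int>(b.size()), b);

    // The body is plain C++; restrict(amp) marks it as runnable on the GPU.
    parallel_for_each(av.extent, [=](index<1> i) restrict(amp) {
        av[i] += bv[i];
    });

    av.synchronize();            // copy results back to the host vector
    std::cout << a[0] << "\n";   // expect 3
}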