Overclock.net › Forums › Industry News › Software News › [Nvidia] NVIDIA Launches PhysX 3.0 With Support For Emerging Gaming Platforms

[Nvidia] NVIDIA Launches PhysX 3.0 With Support For Emerging Gaming Platforms - Page 15

post #141 of 145
Quote:
Originally Posted by Viridian View Post
No, again, Red Faction had superb CPU effects, but they were held back because the CPU needs to run the rest of the game. Offloading onto the GPU is natural because you can fit more effects into the game that way. Intel were looking at GPU integration with Havok, but they shelved Larrabee and implemented it in Sandy Bridge CPUs.

It's not about trying to sell more GPUs. It's that CPUs are unable to handle the number of effects that a GPU can, as efficiently as a GPU can. There is no GPU conspiracy.
It is though; it's been shown that CPU PhysX runs crappy old code (x87 and a bit of SSE1; where's SSE2, SSE3, SSE4.1/4.2/4a and AVX?).


And the CPU usually has 2-4 cores sitting there doing nothing while you game, whereas the GPU is usually at 100% for the entire game... Why do we want to put more load on the part that doesn't need it? I mean, yeah, fair enough, GPUs are needed for some physics scenarios, but at least run as much on the CPU as possible at 60fps, then use the GPU.

Quote:
Originally Posted by Viridian View Post
http://physxinfo.com/news/2279/amd-a...f-the-problem/

AMD users have AMD to blame for the "closed" and "proprietary" nature of PhysX.
AMD would be completely stupid to take up PhysX though; nVidia owns it. Saying "we're putting it on our cards" would hand their opponent a weapon to use against them. Who's to say that nVidia wouldn't force a PhysX update once there were no competitors, one that deliberately cripples AMD cards? That's why they don't want a proprietary standard run by their main competitor as the standard for the industry.

This is why Linux is the main server OS: no business really wants to support a potential competitor (license fees for Win Server 2k8, anyone?), and running Linux ensures they don't.

I personally believe nVidia should just dump PhysX and work on OpenCL. It'd pay less than PhysX would, but I seriously don't see PhysX becoming the number one physics API forevermore given that it doesn't run on AMD cards. (No, Viridian, it's not AMD's fault for not accepting; it would have been dumb of them to accept.)
    
CPU: Intel Core i5 3570k @ 4.5GHz | Motherboard: ASRock Z77 Pro3 | Graphics: PowerColor Radeon HD7950 3GB @ 1150/1350 | RAM: 4x4GB G.Skill Ares 2000MHz CL9
Storage: Samsung 840 250GB, Western Digital Black 1TB WD1002FAEX, Seagate Barracuda 3TB ST3000DM001, Samsung Spinpoint EcoGreen 2TB
Optical: Pioneer DVR-220LBKS | Cooling: Noctua NH-D14, Scythe Gentle Typhoon 1850rpm, Corsair AF140 Quiet Edition, Arctic Cooling Accelero Twin Turbo II
OS: Arch Linux x86-64 (amdgpu) | Monitors: BenQ G2220HD, BenQ G2020HD
Keyboard: Ducky Shine III Year of the Snake (Cherry Blue) | PSU: Silverstone Strider Plus 600W | Case: CoolerMaster CM690 II Black and White | Mouse: SteelSeries Sensei Professional
Mouse pad: Artisan Hien Mid Japan Black Large | Audio: ASUS Xonar DX | Other: NZXT Sentry Mesh 30W Fan Controller
    
post #142 of 145
Quote:
Originally Posted by anotheralex View Post
Yea physx is so amazing..

Games like Doom 3, Quake 3, Half-Life 2, Fallout 3 and The Elder Scrolls (pick a number) would not be as amazing if it wasn't for physx... oh wait, none of them use physx!!

I cannot believe developers can make such amazing games without the use of physx!!

Well, at least The Witcher 2, one of the most beautiful games available today, is only so beautiful with the help of physx... oh wait, it doesn't use physx either!

I cannot believe these widely popular games, possessing some of the best graphics especially for their time, were made without physx!

Hmm... maybe they were extremely popular and widely accepted because they don't use a restrictive API like physx... just a thought.
Restrictive API? Please post somewhere where you know what you're talking about.
Quote:
Originally Posted by Brutuz View Post
It is though; it's been shown that CPU PhysX runs crappy old code (x87 and a bit of SSE1; where's SSE2, SSE3, SSE4.1/4.2/4a and AVX?).
Right, but even with good SSE4 (and the like) support, the CPU still isn't as fast as the GPU at this kind of work. Hence the whole arrival of GPGPU onto the scene.
Quote:
Originally Posted by Brutuz View Post
And the CPU usually has 2-4 cores sitting there doing nothing while you game, whereas the GPU is usually at 100% for the entire game... Why do we want to put more load on the part that doesn't need it? I mean, yeah, fair enough, GPUs are needed for some physics scenarios, but at least run as much on the CPU as possible at 60fps, then use the GPU.
Yeah, I see your point and I agree with you, but as I pointed out earlier, NVidia are in the business of making GPUs, not CPUs. You can only go so far with your interest in supporting gamers; they'd be cutting off their nose to spite their face if they totally dropped GPU support.

And it's not all that bad. In Batman: AA with PhysX set to Normal, my GPU ploughs through no problem. On High it takes a hit in PhysX-heavy areas but still doesn't drop below 30FPS, and even then that's just a tiny dip.

I think it's daft that Mafia II can't run PhysX perfectly on a GTX 580, but then I also know Just Cause 2 looks absolutely devilish on one of those. I know that's not a PhysX game, but it's CUDA-heavy.

Quote:
Originally Posted by Brutuz View Post
AMD would be completely stupid to take up PhysX though; nVidia owns it. Saying "we're putting it on our cards" would hand their opponent a weapon to use against them. Who's to say that nVidia wouldn't force a PhysX update once there were no competitors, one that deliberately cripples AMD cards? That's why they don't want a proprietary standard run by their main competitor as the standard for the industry.
Rubbish. You didn't read my other rebuttal. AMD had the chance and even mused about buying Ageia before NVidia did. The second NVidia snapped them up, AMD were trashing the hell out of it, much like a spoiled child.

Who's to say it has to be an industry standard? And there are also things, as I mentioned before, such as agreements and contracts. If I had been AMD, that would have been my first concern, but I'd also have negotiated and seen what I could get out of the deal. Could I get PhysX and an agreement that they wouldn't pull that crap?

I mean, personally, I think that's extreme AMD paranoia. NVidia produce the better GPU nine times out of ten, so NVidia don't have anything to worry about. I think they just wanted a licensing deal with AMD because that would make them money (pennies per GPU) as well as get PhysX implemented.
Quote:
Originally Posted by Brutuz View Post
This is why Linux is the main server OS: no business really wants to support a potential competitor (license fees for Win Server 2k8, anyone?), and running Linux ensures they don't.
Linux is the main server OS because it's free, pretty tight, and there is plenty of community-based support out there. But many, MANY large corporations use Windows-based OSes because of the support you buy when you purchase a license. Companies often prefer to pay for support instead of having to find it on the web. Hell, it's small hosting companies, and even slightly larger ones, that sell cheap LAMP dedicated/shared servers, because LAMP is a free software stack. No corporation in their right mind (...well, debatable given the recent hackings) would use LAMP. Many use Windows + Oracle.
Quote:
Originally Posted by Brutuz View Post
I personally believe nVidia should just dump PhysX and work on OpenCL. It'd pay less than PhysX would, but I seriously don't see PhysX becoming the number one physics API forevermore given that it doesn't run on AMD cards. (No, Viridian, it's not AMD's fault for not accepting; it would have been dumb of them to accept.)
Then what you believe isn't at all objective and is fuelled by personal opinion instead of sense. Why the hell would NVidia pay a lot of money for Ageia and then scrap the technology? To suit scorned AMD fans who could have had the technology if their chosen company had played ball? Yes, Brutuz, it is AMD's fault. They could have bought Ageia themselves (would they have been as generous as NVidia and offered the tech for pennies per GPU?), or they could have licensed PhysX from NVidia at the hardware level.

Support for CUDA is free. AMD could, and still can, develop a CUDA driver, and CUDA would then run on AMD cards. That would be free; PhysX would cost.

Here:

Why Won’t ATI Support CUDA and PhysX?

Quote:
Nvidia “owns” and controls the future of CUDA, so it’s not open in the “open source” definition, but it’s certainly free. Nvidia tells us it would be thrilled for ATI to develop a CUDA driver for their GPUs.

But what about PhysX? Nvidia claims they would be happy for ATI to adopt PhysX support on Radeons. To do so would require ATI to build a CUDA driver, with the benefit that of course other CUDA apps would run on Radeons as well. ATI would also be required to license PhysX in order to hardware accelerate it, of course, but Nvidia maintains that the licensing terms are extremely reasonable—it would work out to less than pennies per GPU shipped.
AMD's take?

Quote:
So what is AMD/ATI’s take on all this? I spoke with senior PR manager Rob Keosheyan at AMD, and he had plenty to say about the situation. Open industry standards are extremely important to AMD as a company, and they feel that GP-GPU work should be no different. It’s working hard not only on its own StreamSDK and Brook+, but also with the Khronos Group on OpenCL, where it sees the real future.

If open standards are so important, why partner with Havok for physics work? That technology is far from open; it’s owned by Intel, the other chief competitor of AMD/ATI. Of course, there are no truly open physics middleware solutions on the market with any traction, so that point might be kind of moot.
Havok is closed source (so much for touting open standards), StreamSDK went nowhere, and OpenCL is something NVidia is also a part of.

Quote:
Keosheyan says, “We chose Havok for a couple of reasons. One, we feel Havok’s technology is superior. Two, they have demonstrated that they’ll be very open and collaborative with us, working together with us to provide great solutions. It really is a case of a company acting very independently from their parent company. Three, today on PCs physics almost always runs on the CPU, and we need to make sure that’s an optimal solution first.” Nvidia, he says, has not shown that they would be an open and truly collaborative partner when it comes to PhysX. The same goes for CUDA, for that matter.

Though he admits and agrees that they haven’t called up Nvidia on the phone to talk about supporting PhysX and CUDA, he says there are lots of opportunities for the companies to interact in this industry and Nvidia hasn’t exactly been very welcoming.
So AMD haven't actually spoken to NVidia about it; they're just going off past interactions. It seems NVidia didn't want to be as "open and collaborative" as they would have liked, and who knows what's happened there. But looking at this objectively, it seems more to me like the real reasons are that Havok doesn't cost pennies per GPU and runs on the CPU.

However, what's that bolded part all about? Wait, wait, wait: AMD were BIG into Havok FX when Ageia was still a company, until Intel bought Havok and pulled the rug from under its feet. They even had graphs and designs showing it being 9x faster than the Ageia PPU on an X1900 XT. So basically, AMD were looking into GPU physics, but gave up when Havok FX was ripped from their arms.

Today, PhysX certainly looks better than Havok. It's great that Havok is CPU-optimised, but a lot of the effects I see in PhysX titles are just much better. The recent fluid demo with the lighthouse looked absolutely amazing. We're a long way off that being usable in a game environment, but it's good to know NVidia are working hard and making the API into something great.

All the scorn it receives is simply and very directly because you can't run it on AMD cards. That's AMD's fault as much as it is NVidia's. NVidia, it seems, purposely borked Batman: AA's support for AMD cards, which blew, but at the same time AMD chiefs have spent a whole two years publicly trashing PhysX, saying it's going to fail and be totally irrelevant. All the while they screwed Stream and haven't been able to push an OpenCL game out of the door. Havok FX, which they spent much time and money on, was ripped from their arms, still kicking.

Personally, I think both companies have been crap over the whole issue. But with the amount of trash-talking AMD have done over PhysX, you'd think they had some ace in the hole on the way; the sad fact is they don't. The utter contempt people have for PhysX exists precisely because of the market segmentation it causes. However, no matter your opinion, it's solely down to AMD's refusal to play ball or even inquire about the technology:

Quote:
I spoke with Roy Taylor, Nvidia’s VP of Content Business Development, and he says his phone hasn’t even rung to discuss the issue. “If Richard Huddy wants to call me up, that’s a call I’d love to take,” he said.
Contracts could be drawn up locking NVidia into a "no pulling the damn rug from under our feet" clause, and we'd all be golden. We could all enjoy PhysX, and the people left whinging would just be SOL.

But, heh, you can already run PhysX with an AMD card as the main renderer and an NVidia card as a PPU. So, really, I don't see what all the fuss is about. Open source really doesn't make something great.
Edited by Viridian - 7/3/11 at 4:29am
    
CPU: Intel i5 2500 @ 4.004GHz (1.202V) | Motherboard: Asus P8P67-PRO B3 | Graphics: Leadtek GTX260 65nm 896MB @ 700/1430/1100 | RAM: 4GB Mushkin DDR3 1686MHz
Storage: 2x WD CB 250GB, 2x SG 320GB, 1x SS F1 1TB | Optical: Pioneer DVD-RW
OS: Windows 7 Professional 64-bit | Monitor: Samsung 2032BW 20" 1680x1050 16:10
Keyboard: Razer Lycosa | PSU: CoolerMaster GX 550W | Case: CoolerMaster Storm Sniper | Mouse: Razer DeathAdder | Mouse pad: Razer Goliathus
    
post #143 of 145
Quote:
Originally Posted by Viridian View Post
Restrictive API? Please post somewhere where you know what you're talking about.
PhysX is an API.

re·stric·tive
–adjective
3. expressing or implying restriction or limitation of application, as terms, expressions, etc.
http://dictionary.reference.com/browse/restrictive

PhysX is only useful on Nvidia cards.. thus it is restrictive; you cannot make the most of it on any other card.

Edit: GPU computing definitely allows for amazing things... I am sure that once Microsoft has completed C++ AMP, alternatives to physx will come along that work on all GPUs (non-restrictive).
http://blogs.msdn.com/b/vcblog/archi...ucing-amp.aspx
Edited by anotheralex - 7/3/11 at 5:50am
a rig (15 items)
CPU: FX-6100 | Motherboard: ASRock Deluxe5 | Graphics: Radeon 5850 | RAM: Samsung DDR3 1600
Storage: 3x HDD | Cooling: NH-D14 | OS: Win 7 x64 | Monitor: Samsung 2053BW
PSU: OCZ 750 | Case: HAF 912 | Mouse: Logitech G9x | Mouse pad: a mouse pad | Audio: Auzen Forte
post #144 of 145
Man, still love the 8-series cards. They just won't die! I MISS my 8800 Ultra. Best video card I've ever owned for pure epeen.
Gemini (26 items)
CPU: Intel Core i5-2500K | Motherboard: ASUS P8Z68 Deluxe | Graphics: PNY Nvidia GeForce GTX 680 | RAM: Corsair XMS3 1333MHz 16GB
Storage: 2x Samsung Spinpoint F3R RAID 1 (1TB), Western Digital Green EARS (1TB), Western Digital Green EARS (1.5TB), WD My Passport Elite USB 2.0 (640GB), WD My Book Essential USB 3.0 (2TB)
Optical: LG GBC-H20L Blu-ray, Asus DRW-1814BLT | Cooling: Corsair H80 push/pull, dual 200mm fans, dual 120mm fans
OS: Windows 7 Ultimate x64 SP1 | Monitors: 2x Dell UltraSharp U2412M (24")
Keyboard: Logitech G510 | PSU: Corsair Enthusiast Series TX750 V2 | Case: Corsair 650D | Mouse: Logitech G500
Mouse pad: XTRAC Ripper | Audio: Creative X-Fi Platinum Fatal1ty Champion Series, Sennheiser HD 595 headphones, Logitech Z-2300 THX-certified 2.1 system | Other: Logitech HD Pro Webcam C910, Cyber Power CP1500AVRT 900W UPS
Laptop: Intel Core i7-3610QM 2.3GHz | Intel HM77 | NVIDIA GeForce GTX 675M & Intel HD 4000 | 16GB G.Skill DDR3 1600 | 2x HGST 1TB 7200RPM in RAID 0 | MATSHITA BD-RE 6X | Windows 8 Pro 64-bit with Media Center | 17.3" 1920x1080 matte | SteelSeries keyboard | 180W A/C adapter + 9-cell battery | Allant Black carrying case for 17.3" notebook m... | Razer Orochi 2013 | Razer Kabuto | DynAudio speakers and sub with Realtek HD ALC ... | Asus Xonar U3 USB sound card w/ Sony XBA-4 ... | Intel Centrino Advanced-N 6235
post #145 of 145
Quote:
Originally Posted by Viridian View Post
Right, but even with good SSE4, and the like, support, the CPU still isn't as fast as the GPU for anything. Hence the whole arrival of GPGPU's onto the scene.
Actually, it's only slower than the GPU at workloads with massive amounts of multi-threading. Hence why Intel's Quick Sync is faster and better than CUDA video encoding even though it runs on far weaker GPUs: the GPU can't do all of the work faster than a CPU, just some of it. And if most of us have, say, 2-4 cores sitting there doing nothing, those could still soak up most of the physics we have in games.

Quote:
Originally Posted by Viridian View Post
Yeah, I see your point and I agree with you, but as I pointed out earlier, NVidia are in the business of making GPUs, not CPUs. You can only go so far with your interest in supporting gamers; they'd be cutting off their nose to spite their face if they totally dropped GPU support.

And it's not all that bad. In Batman: AA with PhysX set to Normal, my GPU ploughs through no problem. On High it takes a hit in PhysX-heavy areas but still doesn't drop below 30FPS, and even then that's just a tiny dip.
Yeah, I know that. I just question telling the limiting part to do more work while leaving other parts sitting there doing nothing, waiting for it to catch up.

Quote:
Originally Posted by Viridian View Post
Rubbish. You didn't read my other rebuttal. AMD had the chance and even mused about buying Ageia before NVidia did. The second NVidia snapped them up, AMD were trashing the hell out of it, much like a spoiled child.

Who's to say it has to be an industry standard? And there are also things, as I mentioned before, such as agreements and contracts. If I had been AMD, that would have been my first concern, but I'd also have negotiated and seen what I could get out of the deal. Could I get PhysX and an agreement that they wouldn't pull that crap?

I mean, personally, I think that's extreme AMD paranoia. NVidia produce the better GPU nine times out of ten, so NVidia don't have anything to worry about. I think they just wanted a licensing deal with AMD because that would make them money (pennies per GPU) as well as get PhysX implemented.
They trashed nVidia's PhysX because of marketing; they were no longer interested in helping the opposition's technology. That's not uncommon in the business world either. But I'll also admit (unlike most here) that ATI would happily have done the same as nVidia, and only wants an open technology picked up because they can't make their own proprietary one. (If they did, PhysX would kill it.)

And nVidia do not produce better GPUs... AMD stopped going for the absolute fastest after they failed with the HD2900XT, and until the GTX 460 came out they owned the markets they were aiming at. AMD adopting PhysX would let nVidia do what they want with it while AMD has no power over it; no different to MS ignoring standards with IE simply because they had enough users to get away with it.


Quote:
Originally Posted by Viridian View Post
Linux is the main server OS because it's free, pretty tight, and there is plenty of community-based support out there. But many, MANY large corporations use Windows-based OSes because of the support you buy when you purchase a license. Companies often prefer to pay for support instead of having to find it on the web. Hell, it's small hosting companies, and even slightly larger ones, that sell cheap LAMP dedicated/shared servers, because LAMP is a free software stack. No corporation in their right mind (...well, debatable given the recent hackings) would use LAMP. Many use Windows + Oracle.
Free has something to do with it, but it's not the main reason; Windows plainly isn't good enough in that environment (UNIX was the previous king of the server world, and Linux is POSIX, etc.). But I can definitely think of some companies that use Linux for reasons that, at least somewhere down the chain, include "the money we have to spend on servers doesn't go to a competitor."

Quote:
Originally Posted by Viridian View Post
All the scorn it receives is simply and very directly because you can't run it on AMD cards. That's AMD's fault as much as it is NVidia's. NVidia, it seems, purposely borked Batman: AA's support for AMD cards, which blew, but at the same time AMD chiefs have spent a whole two years publicly trashing PhysX, saying it's going to fail and be totally irrelevant. All the while they screwed Stream and haven't been able to push an OpenCL game out of the door. Havok FX, which they spent much time and money on, was ripped from their arms, still kicking.
It's not AMD's fault though. Sure, they said no, but you'd be completely dumb to accept something that's likely to become a standard if both major companies accept it (once we hit photorealistic graphics, physics will be the one thing that lets you wave your graphics card around to help increase epeen sizes). It's why Silverlight didn't take off enough to beat Flash, and why SDX CD drives didn't replace IDE/SATA DVD/CD drives.

Proprietary standards are never the way to go for something that will almost certainly end up used in every game, especially when owned by a company that stands to gain or lose money in that field.

Quote:
Originally Posted by Viridian View Post
Contracts could be drawn up locking NVidia into a "no pulling the damn rug from under our feet" clause, and we'd all be golden. We could all enjoy PhysX, and the people left whinging would just be SOL.

But, heh, you can already run PhysX with an AMD card as the main renderer and an NVidia card as a PPU. So, really, I don't see what all the fuss is about. Open source really doesn't make something great.
Except what's the bet that nVidia wouldn't agree to, or even draw up, a contract like that? Hence why I said just say "screw it" and make an open one, or have a third party who wants to make one for whatever reason do it. I don't care if it's open source, free software or made with the worst license possible, as long as no company stands to gain by cheating with the standard. DirectX being the main rendering method was already bad enough.
    
    