Originally Posted by Seven7h
Dead wrong. PhysX is completely unlike anything that it was 6-7 years ago. It's had lots of new effects, and been mostly rewritten. APEX API and effects were all absolutely invented under NVIDIA. Cloth, hair, fluid, etc. The stuff that was there before was particles and rigid bodies. Even the stuff that was there when they were purchased has been completely rewritten. It is always evolving, and NVIDIA is paying those salaries. There have been PhysX effects/tech that have originated from engineers that were always NVIDIA employees too. At this point it's fair to say that most everything was developed under NVIDIA rather than Ageia.
This is partially incorrect. The base for everything in PhysX was developed before NVidia bought Ageia. NVidia then took that work, moved it onto the CUDA platform, and optimized it. I'm not saying NVidia did not move it forward leaps and bounds, but they did not "invent" it. It's the same with SLI: the entire concept was taken from tech developed by 3DFX after they were bought by NVidia. The funny thing is, for years NVidia told everyone SLI was a gimmick. Then they reintroduced it and claimed to have created it.
Next, the whole "why doesn't AMD do this or that" question: the reason, in the end, is money, and they do not have it. NVidia dumps a LOT of money into the gaming world every year just to make sure developers are using their products. Even with all the money they spend, you can see that only a fraction of developers get on board.
Finally, let's get to the question of why AMD does not have support. The myth is that AMD was offered PhysX by NVidia. WRONG. In 2008, a group outside of either company began work on a project to see if CUDA, and thus PhysX, could be implemented on AMD cards. The group sent messages to AMD and NVidia asking for help. In AMD's case, they wanted information on direct chip programming as well as for AMD to supply them with video cards. AMD declined for a number of reasons, but mostly because they expected NVidia to shut the project down.
Now, it is important to note the timing of events here. After all this had played out with AMD, the group was suddenly and very quietly given an offer of help with this development from NVidia, but since AMD would not play ball, the point was moot. The group went public with this, but the matter was already settled.
NVidia did not offer anything of value when you read between the lines. They knew AMD would not work with this before they offered to help; up until that point they had not bothered to respond at all. This was a "political" move to make it look like they held the high ground, but the problem is any sod with a brain knows it for what it is. If NVidia was open to this, why do their drivers specifically detect AMD cards and shut off PhysX support on an NVidia card if an AMD card is present? Why not develop PhysX to work with a standalone PhysX card? Why are their AIB partners forbidden to sell an NVidia chip on a card designed for nothing but PhysX or CUDA co-processing?
Now, finally, the idea that NVidia keeps PhysX to itself because they are out to make money is only partially true. You see, right now PhysX does NOT make any money. In fact, PhysX costs NVidia money; they practically have to pay developers to use it instead of more open alternatives. If they offered even modestly priced licensing for a very limited form of CUDA, just enough to make use of PhysX, they would make more money from PhysX being used. The issue, however, is not PhysX, it is CUDA. CUDA is much like what AMD fought Intel over with optimized software: if you pull the GPU computing out of CUDA and into OpenCL or DirectCompute, something wild happens, NVidia gets spanked.
On a level playing field in the world of GPU computing, a world that admittedly NVidia helped to form, they are not the leaders. Their only lead comes through a proprietary coding platform; open up the coding and suddenly they are second tier.
Now to the matter at hand, Witcher 3. I think the demo they are showing is awesome, but I am skeptical. Look at Tomb Raider: it looked awesome in demos, but in actual play most people turned the feature off. The reasons were manifold, but in the end it came down to a feature that added nothing to the gameplay while cutting into performance. The fur implementation in Witcher looks amazing, but unless you are going to watch wolves run or stand around looking at your fur cloak blowing, the truth is it will have no impact on the game, or at least not a positive impact, since it will cut into the game's performance.
This, BTW, is in my opinion the issue with most physics implementations: they look great in demos but in actual gameplay become useless. Let's look at Borderlands 2, for example, the star of the PhysX lineup. When the PhysX effects are in play, the elemental effects of the weapons are spectacular. However, try playing with your buddies as a sniper. Looking through your scope, the world becomes a blob of effects as you try to scope in a target while other targets are exploding in elemental effects. Or how about when you shoot and hit things: chips fly around and litter the ground. Amazing? Not when you realize what you are shooting is unchanged. Where are the chips coming from? This is not physics, it is extra-dimensional littering; I mean, those bits come from somewhere, right? Perhaps an alternate dimension.
At the end of the day, this will be a neat feature in Witcher 3, but one that will not have any real effect on gameplay. I think a discussion on whether companies should make games with proprietary technology from a single hardware vendor is a good one to have. But in the end, Witcher 3 will be a great game or a failure based on the game itself, not on whether PhysX is in it.