
[Guru3D] NVIDIA PhysX Engine Now is Open-Source

6K views 55 replies 39 participants last post by  umeng2002 
#1 · (Edited)
NVIDIA PhysX is going open source. NVIDIA is doing this because physics simulation - long key to immersive games and entertainment - turns out to be more important than we ever thought. Physics simulation dovetails with AI, robotics and computer vision, self-driving vehicles, and high-performance computing.

It’s foundational for so many different things we’ve decided to provide it to the world in an open source fashion. Meanwhile, we’re building on more than a decade of continuous investment in this area to simulate the world with ever greater fidelity, with on-going research and development to meet the needs of those working in robotics and with autonomous vehicles. PhysX will now be the only free, open-source physics solution that takes advantage of GPU acceleration and can handle large virtual environments.

It will be available as open source starting Monday, Dec. 3, under the simple BSD-3 license.
Source

Original Blog Post

Github page

They should have done this years ago, but better late than never.
 
#3 ·
Did nVidia finally realize that nobody cares enough about PhysX anymore? It's about time. I never saw the difference in real world gameplay (as in, not just comparing still screenshots) and gave up on "wanting" PhysX years ago.

Now let's hope they give up on other protectionist strategies such as G-sync and start behaving for the benefit of the consumer.
 
#8 ·
Finally! It's also about that time to allow your GPUs to use freesync properly. ;)
 
#14 · (Edited)
I originally had a BFG PhysX separate PCI card with 128 MB of VRAM; this was before Nvidia bought AGEIA (BFG just built the cards).

It had an awesome blue LED that would come on when playing games to let you know it was working lol

Now it's tech integrated into Nvidia GPUs


https://www.newegg.com/Product/Product.aspx?Item=N82E16814143055
 
#15 ·
I'd be more excited about Nvidia finally giving in to FreeSync (and positioning G-Sync as a sort of premium-branded FreeSync, similar to FreeSync 2 with LFC). Samsung has launched TVs with FreeSync, so it's only a matter of time before Nvidia is pressured further.

I think ultimately what has happened is that not many developers use PhysX.
 
#32 ·
PhysX is still widely used... it's just used in potato software mode. It's honestly a shame, because many games that support PhysX in software mode don't even allow it to be run in hardware mode, even if you have an Nvidia card.

One of the best examples is Planetside 2, which still to this day heavily relies on PhysX baked right into the engine, but GPU PhysX support was officially dropped all the way back in the late stages of beta testing, and unofficial support ended with the **** round 2 "optimization" patches a year or two after that.

This change was made in an attempt to make the game run more consistently across different hardware configs and to remove the huge advantage Nvidia players had over AMD players. But really, all it did was castrate performance, visuals, and even to a minor extent gameplay for the majority of players, who were running cards powerful enough to really play the game well in the first place anyway.

Now with it going open source, it would be lovely to see new cards that are literally orders of magnitude faster than what existed when PS2 launched finally able to run the game as it was originally intended. It'll almost certainly never happen in the case of PS2, since development has long since passed its prime for that title. But in theory this could be a huge boon for everyone, even those with Nvidia cards that could in theory run PhysX this whole time but in practice could not, because no developer is ever willing to enable hardware mode when only a small subset of users can use it.
 
#33 ·
The last game that I remember PhysX being a big deal was the first Batman Arkham game, and before that the first Mirror's Edge. Those are both close to ten years old.
 
#43 ·
you missed all the silliness of BL2 (6 years ago)?



Though it's also been in the TW3 DLCs and Fallout 4. But yeah, that was the last time NV actively marketed PhysX in a game.
 
#34 ·
Cheers Nvidia, just make sure you make all of the other black box stuff open source. Cheers!
 
#38 · (Edited)
PhysX ... all I can think about are Borderlands 2 boss fights with PhysX set to high ... and literally not being able to see anything because of all the sparks ... LOL ... fun bit of tech ... pretty sure PhysX is dead with Windows 10, though, but I could be wrong.
 
#39 ·
Does anyone really care at this stage?
RTX is the new gimp works.
 
#52 ·
PhysX peaked with Mirrors Edge and Arkham Asylum.
 
#53 ·
Yep. Used a GTX 260 to play those two with physx.
 
#54 ·
Source

They should have done this years ago, but better late than never.

FAKE REASON:

physics simulation - long key to immersive games and entertainment - turns out to be more important than we ever thought



REAL REASON:

Physics simulation dovetails with AI, robotics and computer vision, self-driving vehicles, and high-performance computing.
 
#55 ·
In my experience, PhysX has always been glaringly and unnecessarily overboard.

For example, in Borderlands 2, if you had 2 Sirens in the party you had to turn it off because of the cluster**** of trash flying around that didn't let you see what the hell was going on.
 
#56 ·
Yeah, as paulerxx said, I used my old GTX 260 Core 216 card with those two titles and I really liked it... but after that, I always found PhysX particles and such a waste and too much of a performance hog.

I even tried using the GTX 260 as a PhysX card with my GTX 580, and the results were inconsistent like SLI. Sometimes it was faster with two cards, sometimes it disrupted the frame pacing, sometimes it was just slower.

But now, I don't think anyone uses it for GPU-accelerated physics, in games at least.

Now, it seems, it's just an SDK/API for physics, like Havok...
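And at its core, that's all a physics SDK is: a loop that integrates forces and resolves collisions every frame. Here's a toy Python sketch of that idea (this is NOT the actual PhysX API, which is C++ — just an illustration): point particles under gravity bouncing off a ground plane, stepped with semi-implicit Euler, the same flavor of integrator common in game physics engines.

```python
# Toy physics step (illustrative only, not PhysX): semi-implicit Euler
# integration of 1-D particles under gravity, with a ground plane at y = 0.

GRAVITY = -9.81       # m/s^2
RESTITUTION = 0.5     # fraction of speed kept after a bounce (assumed value)

def step(particles, dt):
    """Advance a list of (height, velocity) pairs by one timestep."""
    out = []
    for y, vy in particles:
        vy += GRAVITY * dt          # integrate velocity first...
        y += vy * dt                # ...then position (semi-implicit Euler)
        if y < 0.0:                 # collision with the ground plane
            y = 0.0
            vy = -vy * RESTITUTION  # bounce, losing some energy
        out.append((y, vy))
    return out

# Drop a particle from 10 m and simulate one second at 60 Hz.
state = [(10.0, 0.0)]
for _ in range(60):
    state = step(state, 1.0 / 60.0)
```

After one simulated second the particle is still falling (it hasn't reached the ground yet), at roughly half its starting height. A real engine does the same thing in 3-D with rigid bodies, broad-phase collision culling, and a constraint solver — which is the part GPU acceleration speeds up.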
 