
EastCoast 03-15-2019 04:15 PM

[OC3D]Crytek Showcases Real-Time Ray Traced Reflections in CryEngine on RX Vega 56
 



Technology Reveal: Real-Time Ray Traced Reflections achieved with CRYENGINE. All scenes are rendered in real-time in-editor on an AMD Vega 56 GPU. Reflections are achieved with the new experimental ray tracing feature in CRYENGINE 5 - no SSR.

Neon Noir was developed on a bespoke version of CRYENGINE 5.5, and the experimental ray tracing feature based on CRYENGINE’s Total Illumination used to create the demo is both API and hardware agnostic, enabling ray tracing to run on most mainstream, contemporary AMD and NVIDIA GPUs. However, the future integration of this new CRYENGINE technology will be optimized to benefit from performance enhancements delivered by the latest generation of graphics cards and supported APIs like Vulkan and DX12.

https://www.overclock3d.net/news/sof...JUrvjY6sZAjcUQ

Very interesting, and an impressive demo to say the least.

Gunderman456 03-15-2019 04:28 PM

But I neeeds those tensor coresssss.........

EastCoast 03-15-2019 04:31 PM

I wouldn't hold my breath waiting for a tech demo like this to become downloadable. If Crytek is cooking up something for 2020, we will see more about it soon enough.

Puck 03-15-2019 04:40 PM

Real time on only a Vega 56 is pretty promising; it suggests this tech will actually make it to games in the near future.

littledonny 03-15-2019 04:47 PM

Correct reflections make a big difference in immersion for me. The current screen space techniques don't always look quite right.
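
Roughly, SSR works by marching the reflected ray through the depth buffer, so it can only ever reflect what's already on screen. A toy C++ sketch of the failure mode (every type and helper here is a made-up stand-in, not any engine's real API):

Code:

#include <optional>

struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

// Trivial stand-ins for real engine sampling, so the sketch compiles.
Vec2 projectToScreen(const Vec3& p) { return { p.x, p.y }; } // pretend projection
float sceneDepthAt(const Vec2& uv)  { return 1.0f; }         // pretend depth fetch

std::optional<Vec2> traceScreenSpaceReflection(Vec3 p, Vec3 dir, int maxSteps)
{
    const float step = 0.1f;
    for (int i = 0; i < maxSteps; ++i) {
        p = { p.x + dir.x * step, p.y + dir.y * step, p.z + dir.z * step };
        Vec2 uv = projectToScreen(p);
        if (uv.x < 0.f || uv.x > 1.f || uv.y < 0.f || uv.y > 1.f)
            return std::nullopt;     // ray left the screen: the reflection just vanishes
        if (p.z >= sceneDepthAt(uv)) // crossed the depth buffer: call it a hit
            return uv;               // anything hidden behind the front surface is unknown
    }
    return std::nullopt;
}

Tracing against actual scene geometry has neither blind spot, which is why reflections of off-screen or occluded objects are the usual giveaway.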

ToTheSun! 03-15-2019 05:49 PM

Quote:

Originally Posted by littledonny (Post 27893474)
Correct reflections make a big difference in immersion for me. The current screen space techniques don't always look quite right.

And this demo looks stupidly right!

Quote:

Originally Posted by Gunderman456 (Post 27893450)
But I neeeds those tensor coresssss.........

With tensor cores, this demo would probably run much better.

battlenut 03-15-2019 06:04 PM

Quote:

Originally Posted by ToTheSun! (Post 27893546)
And this demo looks stupidly right!


With tensor cores, this demo would probably run much better.

I would like to see this demo run on a Vega 64 or a Radeon VII.

UltraMega 03-15-2019 06:06 PM

Well this just makes Nvidia look extra silly. Branding ray tracing as an Nvidia-only feature was a mistake that will bite them later on... since it's... ya know... not.

PontiacGTX 03-15-2019 06:14 PM

Quote:

Originally Posted by UltraMega (Post 27893584)
Well this just makes Nvidia look extra silly. Branding ray tracing as an Nvidia-only feature was a mistake that will bite them later on... since it's... ya know... not.

Like PhysX? See how other engines, or even Havok, achieve similar results without using Nvidia's proprietary technology.

tpi2007 03-15-2019 06:41 PM

I want to download and try this demo. Yesterday, please.

UltraMega 03-15-2019 06:41 PM

Quote:

Originally Posted by PontiacGTX (Post 27893594)
Like PhysX? See how other engines, or even Havok, achieve similar results without using Nvidia's proprietary technology.


Yea, exactly like PhysX, except probably worse. With PhysX, Nvidia at least did really do something: they bought out Ageia, a company that had started making dedicated physics cards. PhysX is really the rebranding of Ageia's tech under Nvidia's umbrella. It was dumb and annoying for them to push it onto gamers the way they tried to, but it makes sense that they would try to do something with their investment in purchasing Ageia.


With ray tracing, Microsoft simply added the feature to DX12, and then Nvidia came along and tried to market it as an Nvidia-only feature when it's a totally vendor-agnostic DX12 feature. RTX = Nvidia-branded ray tracing, not standard DX12 ray tracing, which is what the industry will have soon enough anyway. Then it will be all the more obvious that Nvidia tried to fool consumers who don't pay attention into thinking they invented ray tracing for games, instead of just branding an open-platform feature as something specific to their brand.


You could then say, "well, Nvidia is first, so for now RTX is an Nvidia-only feature," and technically that is true, but RTX sucks right now, so I'm not sure what Nvidia really gained from doing that. By the time it doesn't suck, they won't be the only ones doing it.


One question I have is: when AMD eventually releases cards designed for ray tracing, will the few RTX games stay RTX, or will they get patched to support both? Or will AMD just be able to run RTX titles? It will probably be some time before we find out.
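
On the "totally agnostic DX12 feature" point: DXR support really is just an ordinary D3D12 capability query, the same call no matter whose driver answers it. A minimal C++ sketch (device creation omitted):

Code:

#include <d3d12.h>

// Returns true if the installed driver, from any vendor, reports DXR support.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}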

EastCoast 03-15-2019 07:33 PM

@UltraMega

I think RT will be "simulated" to look like real-time RT, without the performance hit but with a similar effect on global illumination, reflections, etc.

tubers 03-15-2019 07:44 PM

Says "Rendered"? Is that the same as "running in real time"?

Gunderman456 03-15-2019 07:50 PM

It's real time. Ray tracing was implemented in DX12 by M$, and Nvidia hijacked the name and open concept by saying "but we have the #T@E%N^S&O*R Coreeesss....."

Crap (to me as a gamer) that was built for pro devs, which Nvidia recouped by turning around and selling the broken pro cards to gamers.

They bought this company in Israel so they can cram as many pro cards as possible into a million machines and have them go at lightning speed thanks to their new purchase. They don't care about gamers anymore.

Broken pro cards will be sold to enthusiasts from now on, and if you want the full-chip broken cards, expect to pay over $2500. If you can't, then you can go for the Ti for under $1500. What a deal.

If you want non-pro cards, well, you're now relegated to the low-end tier that used to be the 1050 Ti class or below; you can get the 1660 Ti or the '50 cards below that.

Nvidia curb stomped the gamers that supported them all these years, and it's the gamers who supported the $600 cards that need to be curb stomped.

AMD saw what was happening and joined the bandwagon for the first time with their Pro Vega I and II.

nonametoclaim 03-15-2019 08:14 PM

Quote:

Originally Posted by EastCoast (Post 27893698)
@UltraMega

I think RT will be "simulated" to look like real-time RT, without the performance hit but with a similar effect on global illumination, reflections, etc.

This!? I can't think of what to search for a link, but doesn't AMD have their own open-source RT software that is simply underutilized? I know I've seen AMD GPUs and ray tracing before.

As far as the demo... I'm really not a fan of all the bloom and ambient occlusion; to me it just makes everything seem like it has its own light source and takes away from the true beauty of an engine. I almost always turn them off. In certain titles in vorpX it makes a difference, sure, but usually it's off.

As for a performance hit: with dedicated cores, isn't there going to be something missing frame-wise when you don't have them? Or am I just out of my element here? Can't they be replaced by compute power?

Final note: truly happy to see CryEngine implementing (or trying to, at least) RT. It's a shame to see something requiring a brand (PhysX, cough).

m4fox90 03-15-2019 09:16 PM

Quote:

Originally Posted by Gunderman456 (Post 27893714)

They bought this company in Israel so they can cram as many pro cards as possible into a million machines and have them go at lightning speed thanks to their new purchase. They don't care about gamers anymore.

They certainly pay lip service to gamers, but their actions tell a different story. It's as easy to see as when you go to their website and use their menus to try to find GeForce products, now hidden away in a corner behind AI AND DATA CENTER AND ;fart noises;

Gunderman456 03-15-2019 09:26 PM

Quote:

Originally Posted by m4fox90 (Post 27893802)
They certainly pay lip service to gamers, but their actions tell a different story. It's as easy to see as when you go to their website and use their menus to try to find GeForce products, now hidden away in a corner behind AI AND DATA CENTER AND ;fart noises;

Sad. When the new consoles, with their 8-core Ryzen 2 and 24 (shared?) GB of RAM with Navi 20(?), start natively bundling keyboard/mice with PC-type I/O, the PC game is over. Why put up with the hassle and the fleecing when you can buy a $400 console, plug it into your monitor or TV, and play with your keyboard and mouse? No need to upgrade, get fleeced, or worry about whether you can still play your games down the road. That console will handle 1440p 144Hz no problem, let alone the 1080p 60Hz at 100+ fps that 90% of people run anyway.

Heck, I'll be tempted at that point and the PC Master Race will be over.

EastCoast 03-15-2019 09:52 PM

Quote:

Originally Posted by nonametoclaim (Post 27893736)
This!? I can't think of what to search for a link, but doesn't AMD have their own open-source RT software that is simply underutilized? I know I've seen AMD GPUs and ray tracing before.

As far as the demo... I'm really not a fan of all the bloom and ambient occlusion; to me it just makes everything seem like it has its own light source and takes away from the true beauty of an engine. I almost always turn them off. In certain titles in vorpX it makes a difference, sure, but usually it's off.

As for a performance hit: with dedicated cores, isn't there going to be something missing frame-wise when you don't have them? Or am I just out of my element here? Can't they be replaced by compute power?

Final note: truly happy to see CryEngine implementing (or trying to, at least) RT. It's a shame to see something requiring a brand (PhysX, cough).

Crytek has been dabbling with SVOTI, found in Kingdom Come: Deliverance.
And there is Polyphony Digital using ray tracing for their next Gran Turismo. Right now they are "simulating" RT in their replays, and this is on PS5!

Example:


So I really do believe that with enough R&D, real-time RT will be meaningless as a selling point, about as valued as PhysX is today.


Quote:

Originally Posted by Gunderman456 (Post 27893818)
Sad. When the new consoles, with their 8-core Ryzen 2 and 24 (shared?) GB of RAM with Navi 20(?), start natively bundling keyboard/mice with PC-type I/O, the PC game is over. Why put up with the hassle and the fleecing when you can buy a $400 console, plug it into your monitor or TV, and play with your keyboard and mouse? No need to upgrade, get fleeced, or worry about whether you can still play your games down the road. That console will handle 1440p 144Hz no problem, let alone the 1080p 60Hz at 100+ fps that 90% of people run anyway.

Heck, I'll be tempted at that point and the PC Master Race will be over.

That's a straight-up nuclear option, isn't it?
AMD would have one awesome mid-range APU that would beat a 1660 paired with an i5.
Not saying that day will come, but your forward thinking might scare a few.

Gunderman456 03-15-2019 09:59 PM

Quote:

Originally Posted by EastCoast (Post 27893836)
That's a straight-up nuclear option, isn't it?
AMD would have one awesome mid-range APU that would beat a 1660 paired with an i5.
Not saying that day will come, but your forward thinking might scare a few.

It had better scare Asus, ASRock, Gigabyte, MSI, etc., etc. These greedy companies are also upping the ante: a top-tier gaming mobo was ~$250, and now they are at $500-$600+. They made marijuana legal worldwide and these companies are smoking too much weed. No one notices or says anything either. Companies repeatedly collude on RAM prices (which affects SSDs, video cards and RAM) and HDD prices, and all we get is "bend over" as "enthusiasts" keep buying. They colluded on TVs but have never been caught for monitors; I would have thought that would have been the logical next step. It's amazing how greed has affected almost every PC component and "enthusiasts" keep buying. Yes, they tell you it's none of your business how the more-money-than-brains crowd spend their money. They're not hurting anything.

EastCoast 03-15-2019 10:05 PM

Quote:

Originally Posted by Gunderman456 (Post 27893848)
It had better scare Asus, ASRock, Gigabyte, MSI, etc., etc. These greedy companies are also upping the ante: a top-tier gaming mobo was ~$250, and now they are at $500-$600+. They made marijuana legal worldwide and these companies are smoking too much weed. No one notices or says anything either. Companies collude on RAM prices (which affects SSDs, video cards and RAM) and HDD prices, and all we get is "bend over" as "enthusiasts" keep buying.

I can't find the article, but AMD appears to be incorporating HBM on the motherboard for the new APU SKUs somehow. That would shake motherboard manufacturers real good as a wake-up call, if it turns out to be true.

Gunderman456 03-15-2019 10:11 PM

Quote:

Originally Posted by EastCoast (Post 27893852)
I can't find the article, but AMD appears to be incorporating HBM with their APUs somehow. That would shake motherboard manufacturers real good as a wake-up call, if it turns out to be true.

Let flaming hail fall around their heads and cowering in caves will not help them in the end.

EastCoast 03-15-2019 10:12 PM

Quote:

Originally Posted by Gunderman456 (Post 27893862)
Let flaming hail fall around their heads and cowering in caves will not help them in the end.

I found another article on it, but it's different from what I originally read (that was a link on reddit which I can't find at the moment). Here is something else about it:

Quote:

Separately, we also learned that AMD is working on what they call “Near Memory,” or HBM being used in conjunction with future CPU components. We’re not clear presently on whether that includes desktop CPUs, but we do know that HBM for CPUs is in active R&D, and given Hades Canyon, that’s not necessarily a big surprise.
https://www.gamersnexus.net/news-pc/...7nm-challenges

This would be disruptive to the market, and if it's true and AMD pulls it off, it would be serious competition.

Gunderman456 03-15-2019 10:32 PM

Yeah, there was also talk of soldered CPUs/RAM on proprietary mobos a few years back; I forget the term for it. Maybe both AMD and Intel should shed all these parasites that feed off them, since they no longer provide value to the community anyway.

Like I said, I think this is elementary: consoles (more like PCs now) are the future of gaming, and phones/tablets/laptops will be the business/daily-use form factors. No one will be able to tolerate this kind of fleecing much longer.

Majin SSJ Eric 03-15-2019 10:51 PM

Crytek, this is really cool and all but the next time I hear anything from you it better be an announcement for Crysis 4!!!!

white owl 03-16-2019 12:01 AM

Quote:

Originally Posted by Majin SSJ Eric (Post 27893900)
Crytek, this is really cool and all but the next time I hear anything from you it better be an announcement for Crysis 4!!!!

I'd also love another Crysis, but if you're itching to see the beauty of their engine again, check out Hunt: Showdown. I don't have time to play every game I want to, but that one is on the list somewhere.
Tbh I wish PvP were still active in Crysis 3; sadly I missed all the fun. Looked like a blast.


You can also have a good deal of fun using their free CRYENGINE sandbox.

8051 03-16-2019 12:21 AM

Quote:

Originally Posted by PontiacGTX (Post 27893594)
Like PhysX? See how other engines, or even Havok, achieve similar results without using Nvidia's proprietary technology.

I've never seen any other physics engine do smoke and fog effects like PhysX. Ditto for the FleX effects in Killing Floor 2.

8051 03-16-2019 12:24 AM

Quote:

Originally Posted by nonametoclaim (Post 27893736)
This!? I can't think of what to search for a link, but doesn't AMD have their own open-source RT software that is simply underutilized? I know I've seen AMD GPUs and ray tracing before.

As far as the demo... I'm really not a fan of all the bloom and ambient occlusion; to me it just makes everything seem like it has its own light source and takes away from the true beauty of an engine. I almost always turn them off. In certain titles in vorpX it makes a difference, sure, but usually it's off.

As for a performance hit: with dedicated cores, isn't there going to be something missing frame-wise when you don't have them? Or am I just out of my element here? Can't they be replaced by compute power?

Final note: truly happy to see CryEngine implementing (or trying to, at least) RT. It's a shame to see something requiring a brand (PhysX, cough).

PhysX is now open source -- including vendor-neutral, GPU-accelerated PhysX libraries.
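
And the CPU side of the SDK really is plain, vendor-neutral C++ now. A minimal init sketch against the open-sourced PhysX 4.x API (error handling omitted):

Code:

#include <PxPhysicsAPI.h>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    // Foundation + physics objects are the minimum needed to start simulating,
    // and none of this cares whose GPU (if any) is in the machine.
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());
    // ... create a PxScene, add actors, call simulate()/fetchResults() ...
    physics->release();
    foundation->release();
    return 0;
}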

looniam 03-16-2019 12:46 AM

CRYTEK is remaking HARD RESET!?!?!?!?


sweet!

Defoler 03-16-2019 12:49 AM

Quote:

Originally Posted by Majin SSJ Eric (Post 27893900)
Crytek, this is really cool and all but the next time I hear anything from you it better be an announcement for Crysis 4!!!!

Didn't they close the studio that made Crysis?
If I'm not mistaken, the developers from Crysis moved on to make Homefront, but that studio was sold when Crytek UK got sold off.
So I wouldn't hold my breath for it.

Quote:

Originally Posted by Gunderman456 (Post 27893848)
when a top-tier gaming mobo was ~$250, and now they are at $500-$600+.

Top-tier motherboards have not been $250 for at least the last decade. No idea where you got that from.

Quote:

Originally Posted by Gunderman456 (Post 27893714)
Nvidia hijacked the name and open concept

Ray tracing existed way before you ever heard the term "GPU".
Nvidia didn't "hijack" anything, though. They were marketing their GPUs with a new capability to sell GPUs, just like in the VR era, or when AMD used Mantle to sell their GPUs, or Vulkan, or TressFX. Etc.

Quote:

Originally Posted by UltraMega (Post 27893584)
Well this just makes Nvidia look extra silly. Branding ray tracing as an Nvidia-only feature was a mistake that will bite them later on... since it's... ya know... not.

Where did Nvidia ever brand ray tracing as Nvidia-only?

Quote:

Originally Posted by Gunderman456 (Post 27893450)
But I neeeds those tensor coresssss.........

There is no information about the resolution and FPS of this demo.
I would like to see it in the dynamic environment of a game, to see whether it actually has the same FPS drop as Nvidia or does better.

Clocknut 03-16-2019 01:04 AM

Just like physics: it started with specialized hardware, then got ported to the GPU. In the end the market used the CPU, because it's simply easier to implement.

I wonder if ray tracing will follow the same path, going partially software via the CPU. We've got so much excess CPU power now, why not use it?
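
The "use the spare cores" part is straightforward in principle: a toy C++ sketch of an interleaved-scanline split across std::thread workers (traceRay is a placeholder, not a real renderer):

Code:

#include <algorithm>
#include <thread>
#include <vector>

struct Color { float r, g, b; };

// Placeholder: a real tracer would intersect rays against the scene here.
Color traceRay(int x, int y) { return { 0.f, 0.f, 0.f }; }

int main()
{
    const int W = 1920, H = 1080;
    std::vector<Color> framebuffer(W * H);
    const unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < n; ++t)
        pool.emplace_back([&, t] {
            // Interleaved rows keep the load roughly even across cores.
            for (int y = int(t); y < H; y += int(n))
                for (int x = 0; x < W; ++x)
                    framebuffer[y * W + x] = traceRay(x, y);
        });
    for (auto& th : pool) th.join();
}

Ray tracing parallelizes almost perfectly per pixel, which is exactly why spare CPU threads are tempting; the catch is that even 16 CPU cores trace far fewer rays per second than a GPU's thousands of shader lanes.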

Alastair 03-16-2019 01:55 AM

Quote:

Originally Posted by Gunderman456 (Post 27893818)
Sad. When the new consoles, with their 8-core Ryzen 2 and 24 (shared?) GB of RAM with Navi 20(?), start natively bundling keyboard/mice with PC-type I/O, the PC game is over. Why put up with the hassle and the fleecing when you can buy a $400 console, plug it into your monitor or TV, and play with your keyboard and mouse? No need to upgrade, get fleeced, or worry about whether you can still play your games down the road. That console will handle 1440p 144Hz no problem, let alone the 1080p 60Hz at 100+ fps that 90% of people run anyway.

Heck, I'll be tempted at that point and the PC Master Race will be over.

No, please, no. I love my gaming PCs! I don't fork out for top-end stuff, but I still want my PC. I don't like consoles. :( Please, Gaben, save us!

tubers 03-16-2019 02:21 AM

Quote:

Originally Posted by Defoler (Post 27894056)
Didn't they close the studio that made Crysis?
If I'm not mistaken, the developers from Crysis moved on to make Homefront, but that studio was sold when Crytek UK got sold off.
So I wouldn't hold my breath for it.



Top-tier motherboards have not been $250 for at least the last decade. No idea where you got that from.



Ray tracing existed way before you ever heard the term "GPU".
Nvidia didn't "hijack" anything, though. They were marketing their GPUs with a new capability to sell GPUs, just like in the VR era, or when AMD used Mantle to sell their GPUs, or Vulkan, or TressFX. Etc.



Where did Nvidia ever brand ray tracing as Nvidia-only?



There is no information about the resolution and FPS of this demo.
I would like to see it in the dynamic environment of a game, to see whether it actually has the same FPS drop as Nvidia or does better.

I wish I'd bookmarked it, but there's an old Tom's article, from the early 2000s, talking about ray tracing and how a many-core CPU would benefit the process. I can't find it anymore, and I only came across it in late 2018.

Edit: 2009

https://www.tomshardware.com/reviews...on,2351-9.html

zGunBLADEz 03-16-2019 07:20 AM

What a slap in the face to Nvidia XD

DNMock 03-16-2019 07:39 AM

Quote:

Originally Posted by Gunderman456 (Post 27893714)
Broken pro cards will be sold to enthusiasts from now on, and if you want the full-chip broken cards, expect to pay over $2500. If you can't, then you can go for the Ti for under $1500. What a deal.


You make it sound like this is something new, but that's what Nvidia has been doing for quite a while now. The Titan cards specifically have always been Quadros that didn't pass QC and had the defective cores fused off. (Although I think the Titan Black was actually a rebranded Quadro due to excessive inventory.)

Quote:

Originally Posted by Clocknut (Post 27894074)
Just like physics: it started with specialized hardware, then got ported to the GPU. In the end the market used the CPU, because it's simply easier to implement.

I wonder if ray tracing will follow the same path, going partially software via the CPU. We've got so much excess CPU power now, why not use it?

That would make a lot of sense, with high core counts finally starting to push into the mainstream. Not sure how it would work, but I would think it would be a great way of utilizing the spare cores and threads sitting around in a PC.

Redwoodz 03-16-2019 08:05 AM

Quote:

Originally Posted by ToTheSun! (Post 27893546)
And this demo looks stupidly right!


With tensor cores, this demo would probably run much better.

Yes, 10FPS more for $1,000 more. LMAO.

Quote:

Originally Posted by DNMock (Post 27894290)
You make it sound like this is something new, but that's what Nvidia has been doing for quite a while now. The Titan cards specifically have always been Quadros that didn't pass QC and had the defective cores fused off. (Although I think the Titan Black was actually a rebranded Quadro due to excessive inventory.)

That would make a lot of sense, with high core counts finally starting to push into the mainstream. Not sure how it would work, but I would think it would be a great way of utilizing the spare cores and threads sitting around in a PC.

And I cannot tell you how many people on this and other forums have been crying about how AMD is selling "defective" pro cards as gaming cards, in reference to the Radeon VII. The mindshare is strong.

You guys need to stop supporting these companies trying to monopolize the PC ecosystem; it is the worst possible outcome. Any time a company tries to introduce proprietary tech in an effort to corner the market, we consumers need to respond lethally, meaning shut down sales. It's the only way to keep it from happening.

Gunderman456 03-16-2019 08:08 AM

Quote:

Originally Posted by DNMock (Post 27894290)
You make it sound like this is something new, but that's what Nvidia has been doing for quite a while now. The Titan cards specifically have always been Quadros that didn't pass QC and had the defective cores fused off. (Although I think the Titan Black was actually a rebranded Quadro due to excessive inventory.)

Yes, they have, and that is why I lamented Nvidia (an approach later copied by AMD), the cards, the prices, and the people who bought the first $600+ cards. Prices keep climbing into the absurd, as we have at present, and they will not improve in the future.

I mean, no matter your means (and mine are fine), you can't and shouldn't support those practices. If people had taken a moment to reflect before hitting that buy button, they could have seen the chess moves Nvidia was making and given them the finger before things got out of hand.

Out of principle, I'm still on my R9 290s, from the last time AMD was reasonable. Out of principle, most people who are Nvidia-centric should still be on a GTX 580.

We could have controlled this as consumers. Oh well.

DNMock 03-16-2019 08:28 AM

Quote:

Originally Posted by Gunderman456 (Post 27894306)
Yes, they have, and that is why I lamented Nvidia (an approach later copied by AMD), the cards, the prices, and the people who bought the first $600+ cards. Prices keep climbing into the absurd, as we have at present, and they will not improve in the future.

I mean, no matter your means (and mine are fine), you can't and shouldn't support those practices. If people had taken a moment to reflect before hitting that buy button, they could have seen the chess moves Nvidia was making and given them the finger before things got out of hand.

Out of principle, I'm still on my R9 290s, from the last time AMD was reasonable. Out of principle, most people who are Nvidia-centric should still be on a GTX 580.

We could have controlled this as consumers. Oh well.

Right, and I for one am skipping a generation for the first time in a decade, due to the combination of the excessive price gouging of the 20 series and the abject lack of anything from AMD. Judging by Nvidia's sales numbers, it seems like a lot of people are following suit.

PontiacGTX 03-16-2019 08:32 AM

Quote:

Originally Posted by Gunderman456 (Post 27894306)
Yes, they have, and that is why I lamented Nvidia (an approach later copied by AMD), the cards, the prices, and the people who bought the first $600+ cards. Prices keep climbing into the absurd, as we have at present, and they will not improve in the future.

I mean, no matter your means (and mine are fine), you can't and shouldn't support those practices. If people had taken a moment to reflect before hitting that buy button, they could have seen the chess moves Nvidia was making and given them the finger before things got out of hand.

Out of principle, I'm still on my R9 290s, from the last time AMD was reasonable. Out of principle, most people who are Nvidia-centric should still be on a GTX 580.

We could have controlled this as consumers. Oh well.

Once AMD saw that people were willing to pay $1,200 for a video card, they didn't hold back from releasing a slightly higher-priced GPU either. If the Radeon VII is $700 with a small die and 16GB, imagine how much it could cost with dedicated cores for RT.

ToTheSun! 03-16-2019 08:38 AM

Quote:

Originally Posted by Redwoodz (Post 27894304)
Yes, 10FPS more for $1,000 more. LMAO.

Reason tells me that dedicated hardware and a much more powerful 2080 Ti would run this (what seems to be a much more efficient implementation of ray tracing than we've seen in BF5 and Metro) with much better framerates. That's unsubstantiated, but clearly less so than your comment.

Though, when have you ever given nVidia the benefit of the doubt? I can't recall.

PontiacGTX 03-16-2019 08:48 AM

Quote:

Originally Posted by ToTheSun! (Post 27894348)
Reason tells me that dedicated hardware and a much more powerful 2080 Ti would run this (what seems to be a much more efficient implementation of ray tracing than we've seen in BF5 and Metro) with much better framerates. That's unsubstantiated, but clearly less so than your comment.

Though, when have you ever given nVidia the benefit of the doubt? I can't recall.

He is saying that the amount of money spent on an RTX-capable GPU is considerably more than the benefit (10 FPS more) justifies, compared with using a cheaper GPU with a better implementation of reflections, for example.

dagget3450 03-16-2019 09:18 AM

Quote:

Originally Posted by Defoler (Post 27894056)

Ray tracing existed way before you ever heard the term "GPU".
Nvidia didn't "hijack" anything, though. They were marketing their GPUs with a new capability to sell GPUs, just like in the VR era, or when AMD used Mantle to sell their GPUs, or Vulkan, or TressFX. Etc.


Where did Nvidia ever brand ray tracing as Nvidia-only?


I always see you defending Nvidia, even when it doesn't make much sense to me. That is okay; it's your prerogative. I do wonder, though, in your blind defense, whether you even consider your own posts and questions...

You pretty much answered your own question above. It doesn't take a fricking rocket scientist to see RTX branding everywhere, tied in with Nvidia. Through marketing, they most certainly have made the assertion that ray tracing is Nvidia-only. If you wish to argue that they did not, which seems to be your assertion, then you should at least concede that they pushed RT cores and promoted the Turing GPUs as the only possible solution for ray tracing right now. I remember the silly press-conference video where they showed it taking four previous-gen Titan GPUs to do some crappy ray tracing implementation, and then BLAM, here is a GPU that can do it in real time, and blah blah blah. And you totally glossed over the PhysX business mentioned above that Nvidia pulled as well.

To me, AMD has done the same with the whole "world's first 7nm GPU", but on a smaller scale. Marketing makes it seem like no one else will ever make a 7nm GPU, even though we know that eventually won't be the case. Maybe Nvidia is proud they did it first with RTX, but you cannot ignore how ray tracing is being branded as RTX, tied straight to Nvidia only.

I would love to see this demo released as a benchmark, or with more details, as well. However, I feel skeptics such as yourself will just find a way to talk it down.

JackCY 03-16-2019 09:46 AM

The issue with getting realistic rendering is not software; that has existed for a long, long time. The problem is having good enough hardware to run it in real time.
Of course, now that Nvidia has moved performance up again (slowly, but at least a little), developers will add more and more complex effects to cater to higher-performance hardware.
Vega 56, which was high end, is quickly becoming mainstream-level performance.

Reflections are not even that important in many games; good lighting/global illumination is, to get realistic shadows and not have to use super-fake AO that looks disgusting.

A demo with reflections vs a full game is also a big difference in how much processing can be allocated to reflections.

PontiacGTX 03-16-2019 10:45 AM

Quote:

Originally Posted by JackCY (Post 27894440)
The issue with getting realistic rendering is not software; that has existed for a long, long time. The problem is having good enough hardware to run it in real time.
Of course, now that Nvidia has moved performance up again (slowly, but at least a little), developers will add more and more complex effects to cater to higher-performance hardware.
Vega 56, which was high end, is quickly becoming mainstream-level performance.

Reflections are not even that important in many games; good lighting/global illumination is, to get realistic shadows and not have to use super-fake AO that looks disgusting.

A demo with reflections vs a full game is also a big difference in how much processing can be allocated to reflections.

I think Crytek has delivered what they've showcased in their demos before. Keep in mind all those assets belong to Homefront: The Revolution, which is already a game, though of course those aren't the in-game visuals.

ToTheSun! 03-16-2019 11:06 AM

Quote:

Originally Posted by PontiacGTX (Post 27894360)
He is saying that the amount of money spent on an RTX-capable GPU is considerably more than the benefit (10 FPS more) justifies, compared with using a cheaper GPU with a better implementation of reflections, for example.

Can you show me where an RTX card has been tested to get only 10 FPS more than a Vega 56?

PontiacGTX 03-16-2019 11:24 AM

Quote:

Originally Posted by ToTheSun! (Post 27894532)
Can you show me where an RTX card has been tested to get only 10 FPS more than a Vega 56?

That was what he was implying; I don't know how many FPS more it can actually give. I mean, some people are focused on price/performance rather than raw performance, but if the RTX 2080 Ti could only give 10 FPS more, that would be around $70-90 per FPS, which isn't reasonable IF it were true.

geoxile 03-16-2019 03:12 PM

Seems like Crytek is swindling people. This is just SVOGI, a lighting model from like two years ago that uses voxel-based ray tracing, unlike RTX's per-poly ray tracing. Nvidia explored a similar concept back in 2014 called VXGI, which used voxel cone tracing as a way to lighten the volume of work necessary for modeling light. The fact that Crytek is jumping on the recent ray tracing bandwagon in an attempt to pass this off as an equivalent to (hardware-accelerated) traditional polygonal ray tracing is really deceptive.

Edit, forgot a video: original VXGI demo from 2014
Updated demo for RTX
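
For anyone wondering what voxel cone tracing actually does: instead of intersecting rays with triangles, you step along a cone through a prefiltered (mipmapped) voxel grid, sampling coarser levels as the cone widens. A toy C++ sketch of the accumulation loop (sampleVoxel stands in for a mipmapped 3D texture fetch; the constants are arbitrary):

Code:

#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };
struct VoxelSample { float radiance; float occlusion; };

// Stand-in for a prefiltered 3D texture lookup at a given mip level.
VoxelSample sampleVoxel(const Vec3& p, float lod) { return { 0.f, 0.1f }; }

float coneTraceRadiance(Vec3 origin, Vec3 dir, float coneHalfAngleTan, float maxDist)
{
    const float voxelSize = 0.05f;  // arbitrary world-space voxel size
    float radiance = 0.f, transmittance = 1.f;
    float t = voxelSize;            // small offset to avoid self-sampling
    while (t < maxDist && transmittance > 0.01f) {
        float diameter = 2.f * coneHalfAngleTan * t;                 // cone widens with distance
        float lod = std::log2(std::max(1.f, diameter / voxelSize));  // coarser mip farther out
        Vec3 p { origin.x + dir.x * t, origin.y + dir.y * t, origin.z + dir.z * t };
        VoxelSample s = sampleVoxel(p, lod);
        radiance += transmittance * s.radiance;    // front-to-back accumulation
        transmittance *= 1.f - s.occlusion;
        t += std::max(voxelSize, diameter * 0.5f); // step grows with cone width
    }
    return radiance;
}

A handful of cones per pixel approximates GI and glossy reflections, which is why it's so much cheaper than per-triangle tracing, and also why it can't resolve pixel-perfect mirror hits the way RTX-style tracing can.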

epic1337 03-16-2019 03:21 PM

Quote:

Originally Posted by geoxile (Post 27894792)
Seems like Crytek is swindling people. This is just SVOGI, a lighting model from like two years ago that uses voxel-based ray tracing, unlike RTX's per-poly ray tracing. Nvidia explored a similar concept back in 2014 called VXGI, which used voxel cone tracing as a way to lighten the volume of work necessary for modeling light. The fact that Crytek is jumping on the recent ray tracing bandwagon in an attempt to pass this off as an equivalent to (hardware-accelerated) traditional polygonal ray tracing is really deceptive.

Well, it's better than nothing; technically it's a solution that sits between standard lighting and RTX.
This at least proves that there can be budget ray tracing, even if it's inferior.

geoxile 03-16-2019 03:31 PM

Quote:

Originally Posted by epic1337 (Post 27894802)
Well, it's better than nothing; technically it's a solution that sits between standard lighting and RTX.
This at least proves that there can be budget ray tracing, even if it's inferior.

It is better than nothing, I agree. The difference probably isn't devastating in the first place, especially in motion, and this technique could probably be scaled up to be on par with current polygonal ray tracing implementations (it probably already has been, compared to older demos), the same way some voxel engines can produce extremely dense models, like the 3D sculpting app 3DCoat does, IIRC. But this is like Nvidia trying to pass off DLSS as a real 4K competitor, when currently it's just 1440p with AA, even at its best.

Also, as far as ray tracing goes, AFAIK even some post-processing effects use it to some degree.

keikei 03-16-2019 03:37 PM

Wow. RIP RTX.

ltpdttcdft 03-16-2019 03:46 PM

Quote:

Originally Posted by keikei (Post 27894818)
Wow. RIP RTX.

RX Vega is now RTX Vega! Trademark lawyers chew on this.

Redwoodz 03-16-2019 05:00 PM

Quote:

Originally Posted by ToTheSun! (Post 27894348)
Reason tells me that dedicated hardware and a much more powerful 2080 Ti would run this (what seems to be a much more efficient implementation of ray tracing than we've seen in BF5 and Metro) with much better framerates. That's unsubstantiated, but clearly less so than your comment.

Though, when have you ever given nVidia the benefit of the doubt? I can't recall.

That was sort of a joke; obviously we have no direct comparisons.

The point is this tech is running 4K no problem on a $300 GPU, vs needing a $1,400 GPU to run 4K with Nvidia's ray tracing.

As for giving the benefit of the doubt? When Nvidia changes their name, maybe I won't be so skeptical (you should look it up in Latin). I've seen Jensen fight off open source standards for 15 years... so yeah.

EastCoast 03-16-2019 05:17 PM

Quote:

Originally Posted by Redwoodz (Post 27894906)
That was sort of a joke; obviously we have no direct comparisons.

The point is this tech is running 4K no problem on a $300 GPU, vs needing a $1,400 GPU to run 4K with Nvidia's ray tracing.

As for giving the benefit of the doubt? When Nvidia changes their name, maybe I won't be so skeptical (you should look it up in Latin). I've seen Jensen fight off open source standards for 15 years... so yeah.

The bold part is really the crux of this, IMO. And I still think that, in the end, it will be some sort of hybrid ray tracing that is good enough yet has minimal impact on performance, because I still believe this is all aimed at the 2020 next-gen consoles:
4K with HDR, complemented by hybrid ray tracing.
While some will complain "it ain't real RT", they will soon see the masses buying SKUs that provide it well below the price of the 2080 Ti = the crux of what you said.

Here's the real deal:
AMD isn't a market leader, nor do they have the stock price of their competitors. This in turn has some effect on the R&D of their existing IPs. But here's where it gets interesting: they have the necessary IPs to integrate and bring disruptive changes to the market, unlike Nvidia. This is why Intel is getting on board; they see the potential shift coming.

Here's a thought:
What would the landscape be today if AMD hadn't sold off the Adreno graphics core now inside Snapdragon, and had still bought ATI? Better yet, if AMD hadn't sold its ARM license to Qualcomm and Broadcom after buying ATI? This is the kind of disruptive market play that would have been a pure money stream for AMD today if it had been managed right.

Point is, with the IPs they have now, managed correctly, they can bring some interesting SKUs to market. But I digress.

One example (besides the obvious with RT):
AMD brings forth AI adaptive learning designed to stop online cheating. How much value would this bring to competitive eSports companies like EA, Blizzard, etc.?
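
Structurally, "hybrid" just means the frame stays rasterized and only a small ray budget goes to the effects that need it. A hand-wavy C++ skeleton of that idea (every function here is a named stub, not any engine's API):

Code:

// Rasterize primary visibility as today; spend rays only where they pay off.
void rasterizeGBuffer() {}
void traceReflections(int raysPerPixel) {}
void traceShadowsOrGI(int raysPerPixel) {}
void compositeAndPresent() {}

void renderHybridFrame(float headroomMs)
{
    rasterizeGBuffer();
    // Scale the ray budget to leftover frame time so the RT effects
    // degrade gracefully instead of halving the framerate.
    const int raysPerPixel = (headroomMs > 4.f) ? 2 : 1;
    traceReflections(raysPerPixel);
    traceShadowsOrGI(raysPerPixel);
    compositeAndPresent();
}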

Majin SSJ Eric 03-16-2019 07:09 PM

Quote:

Originally Posted by Defoler (Post 27894056)
Didn't they close the studio that made Crysis?
If I'm not mistaken, the developers from Crysis moved on to make Homefront, but that studio was sold when Crytek UK got sold off.
So I wouldn't hold my breath for it.

Sadly, I agree that it's virtually guaranteed we will never see another new Crysis game released, but that's the only thing relevant to me about Crytek at this point.

SoloCamo 03-16-2019 09:03 PM

Quote:

Originally Posted by Majin SSJ Eric (Post 27895036)
Sadly, I agree that it's virtually guaranteed we will never see another new Crysis game released, but that's the only thing relevant to me about Crytek at this point.

It's a shame, too. The whole "can it run Crysis" thing actually took away from Crysis, in the sense that it overshadowed what was actually a pretty good game. Not the biggest fan of Crysis 2, but I did enjoy Crysis 3. Crysis 3 was actually my justification for buying a 7970 GE.

Defoler 03-16-2019 11:53 PM

Quote:

Originally Posted by Redwoodz (Post 27894906)
obviously we have no direct comparisons.

The point is this tech is running 4K no problem on a $300 GPU, vs needing a $1,400 GPU to run 4K with Nvidia's ray tracing.

Didn't you just say we don't have a direct comparison, then proceed to make a direct comparison?
What gives?
Also, I don't see anywhere a claim that this demo is 4K, so I'm not sure why you put that in.

Quote:

Originally Posted by dagget3450 (Post 27894388)
I always see you defending Nvidia, even when it doesn't make much sense to me. That is okay; it's your prerogative. I do wonder, though, in your blind defense, whether you even consider your own posts and questions...

You pretty much answered your own question above. It doesn't take a fricking rocket scientist to see RTX branding everywhere, tied in with Nvidia. Through marketing, they most certainly have made the assertion that ray tracing is Nvidia-only. If you wish to argue that they did not, which seems to be your assertion, then you should at least concede that they pushed RT cores and promoted the Turing GPUs as the only possible solution for ray tracing right now. I remember the silly press-conference video where they showed it taking four previous-gen Titan GPUs to do some crappy ray tracing implementation, and then BLAM, here is a GPU that can do it in real time, and blah blah blah. And you totally glossed over the PhysX business mentioned above that Nvidia pulled as well.

To me, AMD has done the same with the whole "world's first 7nm GPU", but on a smaller scale. Marketing makes it seem like no one else will ever make a 7nm GPU, even though we know that eventually won't be the case. Maybe Nvidia is proud they did it first with RTX, but you cannot ignore how ray tracing is being branded as RTX, tied straight to Nvidia only.

I would love to see this demo released as a benchmark, or with more details, as well. However, I feel skeptics such as yourself will just find a way to talk it down.

I don't defend nvidia. I'm against misinformation.
A claim was made that nvidia hijacked ray tracing, and that is not correct.
Instead, you proceed to put claims in my mouth that are not correct, and you talk about RTX branding, which is not a claim about ray tracing as a whole. Nvidia only claim to have hardware-based ray tracing, which is correct; they are the only ones who have it right now. That still doesn't support the claim that they either hijacked ray tracing or claim to be the only ones who have it at all.

I understand your frustration: seen through fanboy eyes, every defence against misinformation is a direct attack on your favoured brand, AMD. But that is something you have to work on.

epic1337 03-17-2019 05:49 AM

Quote:

Originally Posted by geoxile (Post 27894814)
It is better than nothing, I agree. The difference probably isn't devastating in the first place, especially in motion, and this technique could probably be scaled up to be on par with current polygonal ray tracing implementations (it probably already has been, compared to older demos), the same way some voxel engines can produce extremely dense models, like the 3D sculpting app 3DCoat does, IIRC. But this is like Nvidia trying to pass off DLSS as a real 4K competitor, when currently it's just 1440p with AA, even at its best.

Also, as far as ray tracing goes, AFAIK even some post-processing effects use it to some degree.

Exactly, plus this would also put more pressure on Nvidia.
For the most part, consoles have no access to Nvidia's RTX beyond this attempt to mimic it in CryEngine,
which means there are bound to be devs interested enough to follow the trend, in both console and PC games.

In the long term, there'd be a split in ray tracing: one branch a multi-platform feature, the other Nvidia's RTX.

huzzug 03-17-2019 07:09 AM

Quote:

Originally Posted by Defoler (Post 27895294)
I don't defend nvidia. I'm against misinformation.
A claim was made that nvidia hijacked ray tracing, and that is not correct.
Instead, you proceed to put claims in my mouth that are not correct, and you talk about RTX branding, which is not a claim about ray tracing as a whole. Nvidia only claim to have hardware-based ray tracing, which is correct; they are the only ones who have it right now.

Every software feature is hardware-based. Even emulation needs hardware. I thought you said you didn't like misinformation.

Quote:

I understand your frustration: seen through fanboy eyes, every defence against misinformation is a direct attack on your favoured brand, Nvidia. But that is something you have to work on.
Amen

P.S. I was just trying to give my 2 cents to this hair-splitting contest.

Quote:

Originally Posted by EastCoast (Post 27894930)
AMD isn't a market leader, nor do they have the stock price of their competitors. This in turn has some effect on the R&D of their existing IPs.

Share prices have no bearing on an organization's ability to allocate funds for R&D of its IPs.

appylol 03-17-2019 07:12 AM

Pretty sweet to see this. So this is likely the method most modern game engines are going to use for "faking" real-time ray tracing on the PS5 and Xbox-next, with their Navi SoCs, while the game devs that care can implement the proprietary Nvidia DLLs for PC titles.

My soon-to-be-three-year-old GTX 1080 might have the grunt to pull this off at decent IQ, maybe even at 1440p, given that the 1080 has a slight edge over Vega 56. Thoughts?

ToTheSun! 03-17-2019 07:39 AM

Quote:

Originally Posted by epic1337 (Post 27894802)
Well, it's better than nothing; technically it's a solution that sits between standard lighting and RTX.
This at least proves that there can be budget ray tracing, even if it's inferior.

Yeah, I'm of the same opinion.

I mean, let's just say this first for the sake of fairness: this is a demo. Games with real-time player input will have a more complex set of constraints. However, I think the difference in performance requirements is palpable, just from looking at this showcase.

Raytracing IS the future, and anything that can accelerate its adoption is great. Even if this is not TRVE raytracing, in the same way RTX is, it still looks spectacular (and miles better than traditional rendering), and we should be adopting this before anything else. Well, at least until nVidia's RTX can run at decent resolutions and framerates.

skline00 03-17-2019 08:39 AM

Quote:

Originally Posted by battlenut (Post 27893578)
I would like to see this demo run on a Vega 64 or a Radeon VII.

So would I.

I would like them to run this on the same testbed CPU with an RTX 2080, RTX 2080 Ti, Vega 56/64 and a Radeon VII. Then we could see what effect the RT cores have.

Diffident 03-17-2019 12:02 PM

There was mention that the demo was in 4K. It is, but according to the article on TechSpot it's 4K@30fps.

WannaBeOCer 03-17-2019 02:57 PM

Quote:

Originally Posted by Gunderman456 (Post 27893714)
It's real time. Ray tracing was implemented in DX12 by M$, and Nvidia hijacked the name and open concept by saying "but we have the #T@E%N^S&O*R Coreeesss....."

Crap (to me as a gamer) that was built for pro devs, which Nvidia recouped by turning around and selling the broken pro cards to gamers.

They bought this company in Israel so they can cram as many pro cards as possible into a million machines and have them go at lightning speed thanks to their new purchase. They don't care about gamers anymore.

Broken pro cards will be sold to enthusiasts from now on, and if you want the full-chip broken cards, expect to pay over $2500. If you can't, then you can go for the Ti for under $1500. What a deal.

If you want non-pro cards, well, you're now relegated to the low-end tier that used to be the 1050 Ti class or below; you can get the 1660 Ti or the '50 cards below that.

Nvidia curb stomped the gamers that supported them all these years, and it's the gamers who supported the $600 cards that need to be curb stomped.

AMD saw what was happening and joined the bandwagon for the first time with their Pro Vega I and II.

AMD has been doing the same thing with their professional cards for years. Nothing changed, aside from AMD focusing on AI instead of just GPGPU with the release of their Instinct cards.

nVidia and others have been working on their own hybrid ray tracing APIs for years. Without nVidia's RT cores we wouldn't have a card (RTX 2080 Ti) capable of playing ray-traced games with decent frame rates. We've seen multiple demos throughout the years from AMD, nVidia and Imagination Technologies; all of them look nice, but only one vendor so far has shipped hardware that makes it usable for gaming. Just like with DX11 tessellation, both nVidia and AMD had to have tessellation units for a great experience. With the 7970, AMD beefed up the tessellator in the geometry engine: https://www.pcper.com/reviews/Graphi...ation-Texture-

nVidia first released OptiX on Nov 3, 2009; here we see it running on 4 Fermi cards, and he mentions it could be used for gaming one day.

https://youtu.be/BAZQlQ86IB4

https://youtu.be/sD-lzVAlSDc

We've seen the performance of RT cores in Port Royal along with V-Ray.

https://www.youtube.com/watch?v=yVflNdzmKjg

Along with Tensor Cores in Solidworks

https://youtu.be/87_OVm3E1Oc
Edit: I want to add what Crytek said regarding GPU RT hardware. I'm sure we'll see the RT performance once the demo utilizes nVidia's RTX-enhanced libraries.

Quote:

However, the future integration of this new CRYENGINE technology will be optimized to benefit from performance enhancements delivered by the latest generation of graphics cards

ToTheSun! 03-17-2019 05:55 PM

Quote:

Originally Posted by WannaBeOCer (Post 27895930)
Edit: I want to add what Crytek said regarding GPU RT hardware. I'm sure we'll see the RT performance once the demo utilizes nVidia's RTX-enhanced libraries.

It only makes sense. If they can achieve 4K30 (albeit in a demo) using a Vega 56, RTX cards will probably deliver a game-ready experience.

Majin SSJ Eric 03-17-2019 06:44 PM

Quote:

Originally Posted by SoloCamo (Post 27895136)
It's a shame, too. The whole "can it run Crysis" thing actually took away from Crysis, in the sense that it overshadowed what was actually a pretty good game. Not the biggest fan of Crysis 2, but I did enjoy Crysis 3. Crysis 3 was actually my justification for buying a 7970 GE.

I know I'm in the vast minority but Crysis 2 was probably my all-time favorite Crysis game (though I adore every one of them). I really just fell in love with the second game's (well, third game I guess) story line, which I felt came the closest of all of them to sucking me into the world and making me believe I actually was Alcatraz, with the weight of the planet suddenly and unexpectedly thrust upon my shoulders. There were also some truly unforgettable moments (albeit the cinematic kind that gamers typically hate) from that game that are still ingrained into my brain to this day, like when you're inside that bridge between high-rises and the Ceph ship shoots it, blowing you out and down onto the street, where you then watch the choppers chase it around the skyscraper in the distance, shoot it down, and see it come BACK right at you and crash behind the parking garage right behind you! The first time I experienced that entire sequence I had to pick my jaw up off the floor, simultaneously because of how gorgeously it was rendered and because of how epic the whole thing was!

There was also the most emotional (for me) part of the game where you are unable to rescue the young couple from the collapsing high-rise. Of course there are epic moments in the other games (like Psycho listening in as the fighter pilot gets shot down by the Ceph in Warhead or the part where Prophet blows the dam in Crysis 3), but Crysis 2 does an overall better job of hitting all the perfect story beats for me, being well paced throughout, and it also has the best ending of any of the games in the franchise (again, IMO).

Also have to throw in how gorgeous the game looks once the high-texture and DX11 patches are installed. Crysis 3 of course is the overall best-looking game in the franchise (and the original Crysis has hands-down the most impressive graphics for its time), but I just remember the anticipation I had waiting for those graphics packs to be released and how blown away I was once they were installed and I started just wandering around looking at everything in NY! It still sticks with me every time I think back on Crysis 2, as does Hans Zimmer's absolutely incredible score....

SoloCamo 03-17-2019 09:28 PM

Quote:

Originally Posted by Majin SSJ Eric (Post 27896166)
I know I'm in the vast minority but Crysis 2 was probably my all-time favorite Crysis game (though I adore every one of them). I really just fell in love with the second game's (well, third game I guess) story line, which I felt came the closest of all of them to sucking me into the world and making me believe I actually was Alcatraz, with the weight of the planet suddenly and unexpectedly thrust upon my shoulders. There were also some truly unforgettable moments (albeit the cinematic kind that gamers typically hate) from that game that are still ingrained into my brain to this day, like when you're inside that bridge between high-rises and the Ceph ship shoots it, blowing you out and down onto the street, where you then watch the choppers chase it around the skyscraper in the distance, shoot it down, and see it come BACK right at you and crash behind the parking garage right behind you! The first time I experienced that entire sequence I had to pick my jaw up off the floor, simultaneously because of how gorgeously it was rendered and because of how epic the whole thing was!

There was also the most emotional (for me) part of the game where you are unable to rescue the young couple from the collapsing high-rise. Of course there are epic moments in the other games (like Psycho listening in as the fighter pilot gets shot down by the Ceph in Warhead or the part where Prophet blows the dam in Crysis 3), but Crysis 2 does an overall better job of hitting all the perfect story beats for me, being well paced throughout, and it also has the best ending of any of the games in the franchise (again, IMO).

Also have to throw in how gorgeous the game looks once the high-texture and DX11 patches are installed. Crysis 3 of course is the overall best-looking game in the franchise (and the original Crysis has hands-down the most impressive graphics for its time), but I just remember the anticipation I had waiting for those graphics packs to be released and how blown away I was once they were installed and I started just wandering around looking at everything in NY! It still sticks with me every time I think back on Crysis 2, as does Hans Zimmer's absolutely incredible score....

In all fairness, I never played Crysis 2 past the first few areas (maybe an hour or two, tops). But I did install the texture pack, which made a huge difference. I essentially played what I did of the game just to look at the visuals. Sounds like I'll need to go back to it.

Last I checked, the MaLDo mod was actually hard to get your hands on these days. Any reliable sites you can throw my way for it?

Redwoodz 03-18-2019 05:43 AM

Quote:

Originally Posted by WannaBeOCer (Post 27895930)
AMD has been doing the same thing with their professional cards for years. Nothing changed, aside from AMD focusing on AI instead of just GPGPU with the release of their Instinct cards.

nVidia and others have been working on their own hybrid ray tracing APIs for years. Without nVidia's RT cores we wouldn't have a card (RTX 2080 Ti) capable of playing ray-traced games with decent frame rates. We've seen multiple demos throughout the years from AMD, nVidia and Imagination Technologies; all of them look nice, but only one vendor so far has shipped hardware that makes it usable for gaming. Just like with DX11 tessellation, both nVidia and AMD had to have tessellation units for a great experience. With the 7970, AMD beefed up the tessellator in the geometry engine: https://www.pcper.com/reviews/Graphi...ation-Texture-

nVidia first released OptiX in Nov 3, 2009, we see here it's running on 4 Fermi cards and he mentions it could be used for gaming one day.

https://youtu.be/BAZQlQ86IB4

https://youtu.be/sD-lzVAlSDc

We've seen the performance of RT cores in Port Royal along with V-Ray.

https://www.youtube.com/watch?v=yVflNdzmKjg

Along with Tensor Cores in Solidworks

https://youtu.be/87_OVm3E1Oc

Edit: Want to add what Crytek said regarding GPU RT hardware. I'm sure we'll see the RT performance once the demos utilize nVidia's RTX enhanced libraries.

For the sake of everyone, maybe you can explain how Unigine 2 with SSRTGI ray tracing differs? It's been out for what, 3 years now?

WannaBeOCer 03-18-2019 09:23 AM

Quote:

Originally Posted by Redwoodz (Post 27896584)
For the sake of everyone, maybe you can explain how Unigine 2 with SSRTGI ray tracing differs? It's been out for what, 3 years now?

Differs from what? The Crytek demo, which has a reflective drone along with multiple mirrors, or the Star Wars demo/Port Royal, where 40% of the scene is a mirror?

Here's Imagination Technologies' Apartment demo, created 3 years ago, running on a passively cooled mobile GPU.


bigjdubb 03-18-2019 11:08 AM

It's a demo; we should all know not to put any faith in demos. What they are showing us is real time for artists. Real time for gamers means it's running in a game that we are playing. Until it's doing that, it gets a big [attached image].

PontiacGTX 03-18-2019 11:09 AM

If only Crytek could make a remake/redux of Crysis 1 and bring back the MP! I don't know why they keep avoiding it, now that the game runs better thanks to engine optimization, the new APIs, and taking advantage of GPU compute. Or at least release something popular (an R6S-style game or a battle royale) so people could see the improvement from the reflections. They wouldn't even need new assets; the Hunt: Showdown or Homefront assets could work.

Quote:

Originally Posted by bigjdubb (Post 27897018)
It's a demo, and we should all know not to put any faith in demos. What they are showing us is real time for artists. Real time for gamers means it's running in a game that we are playing. Until it's doing that, it gets a big


bigjdubb 03-18-2019 11:19 AM


Exactly.

Majin SSJ Eric 03-18-2019 10:10 PM

I don't understand the point being made here. I am intimately familiar with every single Crysis game and I don't see any difference at all between what is being shown in those demos and what I remember seeing in the actual games as I walked around just looking at everything in awe...

8051 03-18-2019 11:31 PM

Quote:

Originally Posted by Majin SSJ Eric (Post 27896166)
There was also the most emotional (for me) part of the game where you are unable to rescue the young couple from the collapsing high-rise. Of course there are epic moments in the other games (like Psycho listening in as the fighter pilot gets shot down by the Ceph in Warhead or the part where Prophet blows the dam in Crysis 3), but Crysis 2 does an overall better job of hitting all the perfect story beats for me, being well paced throughout, and it also has the best ending of any of the games in the franchise (again, IMO).

The "young couple" in the "collapsing high-rise" isn't something I remember from Crysis 2 -- and yes I played thru that whole game twice (low res textures and all).

Offler 03-19-2019 03:29 AM

:D

UltraMega 03-19-2019 05:01 AM

Crytek has had this lighting tech around for a while, apparently. As has already been stated, they call the lighting tech SVOTI and indeed some games already use it. I found a bunch of videos showing it.

NightAntilli 03-19-2019 06:07 AM

Leaving two old posts of mine in here... Short version is, RTX is nothing special. Compute cores that were stripped from nVidia cards have been reintroduced (with some changes) and are now sold as the best thing ever. Even worse, both Pascal and AMD cards are capable of ray tracing. The limit is their actual speed, although, ray tracing being a compute technique, AMD can easily shine in it due to their async compute capabilities.

https://www.overclock.net/forum/379-...l#post27838198

https://www.overclock.net/forum/379-...l#post27834408
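
To make the "ray tracing is just compute" point concrete, here's a minimal sketch of one primary ray per pixel intersecting a sphere with nothing but ordinary arithmetic, the kind of work any compute core, AMD or nVidia, can execute. This is my own hypothetical illustration, not Crytek's or anyone's actual code:

Code:

#include <cstdio>
#include <cuda_runtime.h>

struct Vec3 { float x, y, z; };
__device__ Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
__device__ float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// One ray per thread: test it against a single sphere, write a hit mask.
__global__ void traceSphere(unsigned char* img, int w, int h,
                            Vec3 center, float radius)
{
    int px = blockIdx.x * blockDim.x + threadIdx.x;
    int py = blockIdx.y * blockDim.y + threadIdx.y;
    if (px >= w || py >= h) return;

    // Camera at the origin looking down -z, one primary ray per pixel.
    Vec3 origin = {0.f, 0.f, 0.f};
    Vec3 dir = {(px - w * 0.5f) / h, (py - h * 0.5f) / h, -1.f};

    // Standard quadratic ray/sphere test: just FLOPs, no special units.
    Vec3 oc = sub(origin, center);
    float a = dot(dir, dir);
    float b = 2.f * dot(oc, dir);
    float c = dot(oc, oc) - radius * radius;
    float disc = b * b - 4.f * a * c;

    img[py * w + px] = disc > 0.f ? 255 : 0;
}

int main() {
    const int w = 256, h = 256;
    unsigned char* img;
    cudaMallocManaged(&img, w * h);
    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    traceSphere<<<grid, block>>>(img, w, h, Vec3{0.f, 0.f, -3.f}, 1.f);
    cudaDeviceSynchronize();
    printf("center pixel hit: %d\n", img[(h / 2) * w + w / 2]);  // expect 255
    cudaFree(img);
    return 0;
}

On RTX the traversal and intersection work moves into fixed-function units; on everything else it runs like this, as plain shader/compute math. The speed is the only difference.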

Defoler 03-19-2019 06:24 AM

Quote:

Originally Posted by NightAntilli (Post 27898064)
Leaving two old posts of mine in here... Short version is, RTX is nothing special. Compute cores that were stripped from nVidia cards have been reintroduced (with some changes) and are now sold as the best thing ever. Even worse, both Pascal and AMD cards are capable of ray tracing. The limit is their actual speed, although, ray tracing being a compute technique, AMD can easily shine in it due to their async compute capabilities.

https://www.overclock.net/forum/379-...l#post27838198

https://www.overclock.net/forum/379-...l#post27834408

Of course AMD will be capable of ray tracing. Heck, software-wise anyone can do it. But that is not the point. Pascal can do ray tracing as well, without the tensor cores.

I don't think async compute has anything to do with it, or that AMD will benefit from it for that specific API.
Async compute is a way to run work in parallel on the GPU. If the GPU is under heavy load, it won't have room to take on more work for ray tracing. For AMD it will still be done on the same cores as their regular work. nvidia's RTX cards, by contrast, have those "spare" cores which aren't doing any other work, and the ray tracing can be offloaded onto them.
If the GPU isn't under heavy load, AMD will be able to run DXR just like Pascal will: on their regular cores. AMD might be able to utilise the GPU a bit better (maybe, if they can indeed utilise their compute well for DXR, and considering they have higher compute power), but since Pascal, nvidia have also improved their ability to run work in parallel and use the queue dynamically like AMD, so I'm not sure AMD's "edge" with async compute is an edge anymore.

Async compute has gone a bit quiet in the last 2 years, since nvidia made changes with Pascal, basically making that "edge" irrelevant for AMD.

Not saying it might not still be powerful on some cards (like the VII), but that is one card. I hope Navi brings a better answer with no hit from DXR, but we will see when it comes.
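
To illustrate the queueing idea, here's a rough hypothetical sketch using CUDA streams, an analog of the multiple queues DX12/Vulkan expose (illustrative only, not how DXR actually schedules anything). Two independent kernels stand in for "graphics" and "ray tracing" work; when one leaves execution units idle, the scheduler can fill them from the other queue:

Code:

#include <cstdio>
#include <cuda_runtime.h>

// Stand-ins for "graphics" work and "ray tracing" work.
__global__ void shadeWork(float* a, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) a[i] = a[i] * 0.5f + 1.f;
}
__global__ void rayWork(float* b, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) b[i] = sqrtf(b[i] + 2.f);
}

int main() {
    const int n = 1 << 20;
    float *a, *b;
    cudaMalloc(&a, n * sizeof(float));
    cudaMalloc(&b, n * sizeof(float));

    // Two independent queues: work submitted to them may overlap on the GPU
    // whenever either kernel alone can't saturate the hardware.
    cudaStream_t s1, s2;
    cudaStreamCreate(&s1);
    cudaStreamCreate(&s2);
    shadeWork<<<(n + 255) / 256, 256, 0, s1>>>(a, n);
    rayWork<<<(n + 255) / 256, 256, 0, s2>>>(b, n);
    cudaDeviceSynchronize();  // wait for both queues to drain

    cudaStreamDestroy(s1);
    cudaStreamDestroy(s2);
    cudaFree(a);
    cudaFree(b);
    printf("both queues done\n");
    return 0;
}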

PontiacGTX 03-19-2019 06:24 AM

Quote:

Originally Posted by Majin SSJ Eric (Post 27897794)
I don't understand the point being made here. I am intimately familiar with every single Crysis game and I don't see any difference at all between what is being shown in those demos and what I remember seeing in the actual games as I walked around just looking at everything in awe...

The point is clear: some of the demos weren't even showing a playable game, yet the graphics features were delivered...

Mand12 03-19-2019 07:06 AM

Quote:

Originally Posted by Gunderman456 (Post 27893450)
But I neeeds those tensor coresssss.........

Are you seriously starting a discussion on the difference between hardware and software acceleration?

NightAntilli 03-19-2019 07:49 AM

Leaving this article from a year ago here...

https://www.tomshardware.com/news/am...are,36702.html

Note how the first line says "GPU accelerated". Can we leave the whole nonsense of software vs hardware behind? Ray Tracing done on AMD hardware with Async compute is still hardware. Stop pretending this is somehow a software solution.

I guess graphics cards don't require drivers at all, considering they are hardware for games...

ToTheSun! 03-19-2019 08:16 AM

Quote:

Originally Posted by NightAntilli (Post 27898182)
Leaving this article from a year ago here...

https://www.tomshardware.com/news/am...are,36702.html

Note how the first line says "GPU accelerated". Can we leave the whole nonsense of software vs hardware behind? Ray Tracing done on AMD hardware with Async compute is still hardware. Stop pretending this is somehow a software solution.

I guess graphics cards don't require drivers at all, considering they are hardware for games...

RT cores are specialized at doing raytracing and nothing else. "Software accelerated" might not be the most accurate nomenclature, but the point of the distinction is that the GPU is not specifically and exclusively built for raytracing, in the same way that RT cores are.

It's also a way to differentiate them in regard to performance expectation. That's what Mand12 meant. Casually mocking nVidia's hardware solution on the basis that it's considered superfluous is missing the forest for the trees.

Mand12 03-19-2019 09:42 AM

Quote:

Originally Posted by ToTheSun! (Post 27898216)
RT cores are specialized at doing raytracing and nothing else. "Software accelerated" might not be the most accurate nomenclature, but the point of the distinction is that the GPU is not specifically and exclusively built for raytracing, in the same way that RT cores are.

It's also a way to differentiate them in regard to performance expectation. That's what Mand12 meant. Casually mocking nVidia's hardware solution on the basis that it's considered superfluous is missing the forest for the trees.

And neither is Pascal built for it, even though another thread is talking about how their new drivers are adding it. What a shock it will be when Pascal RT sucks compared to using hardware designed for it.

The mocking is mostly what I was mocking. Fine-tuned hardware does its job better than anything else, in this field.

NightAntilli 03-19-2019 10:28 AM

Quote:

Originally Posted by ToTheSun! (Post 27898216)
RT cores are specialized at doing raytracing and nothing else. "Software accelerated" might not be the most accurate nomenclature, but the point of the distinction is that the GPU is not specifically and exclusively built for raytracing, in the same way that RT cores are.

It's also a way to differentiate them in regard to performance expectation. That's what Mand12 meant. Casually mocking nVidia's hardware solution on the basis that it's considered superfluous is missing the forest for the trees.

RT cores are ASIC cores. But that doesn't mean the GPU was exclusively built for Ray Tracing. If that were the case, what are the Tensor cores doing there? What are the CUDA cores still doing there? If they really wanted a GPU specifically for RT, they would basically put mainly the ASIC there, and just enough CUDA/Tensor cores for the most basic geometry and denoising calculations. This is a transitional GPU, a step towards Ray Tracing.
nVidia's hardware 'solution' tackles a weakness of their own cards, which was compute power. Ray Tracing is a computationally heavy rendering technique. Their white paper states that the RT cores free the CUDA/SM cores to do other work while the RT cores (or the ASIC) handle the ray tracing calculation. That is all fine, and it's a good thing for them to tackle.

People need to start thinking for themselves instead of looking at everything through the lens of nVidia. nVidia's solution is for their own cards, but people seem to think that because only they have that solution, everyone else is incapable. That is false. AMD is COMPLETELY different here. Have any of you ever wondered why AMD has many more flops, but still less performance in games? Rather than sweeping it under the rug by saying it's drivers or an inefficient architecture, I'll tell you why.

AMD does not have this issue of their stream processors being saturated and requiring offloading. AMD's compute power is above nVidia's in general. Why do you think miners flocked to AMD's cards during the mining boom? In fact, in terms of compute, the Radeon VII is the equivalent of the 2080Ti. Yes. Really. It doesn't translate into games, because games simply are not compute heavy. You could argue that it is an inefficient architecture, and it is, for rasterization. Not for compute.
Their stream processors don't need to be offloaded to do ray tracing, because in games, many of them are idling anyway, which is the reason why Vega 56 and Vega 64 perform EXACTLY the same in games if they run at the exact same clock speeds. The additional 512 stream processors (close to 20% additional compute power) are literally doing NOTHING in games. That is without accounting for how many idling stream processors there are within the total of 3584 in Vega 56. Who knows how many there are in the likes of the Radeon VII.

That's where the ACEs come in. They were designed to allow those idling stream processors to be used, in parallel to all the others already being used. All that idling power can be used to get ray tracing to work, without reducing current performance, because it's specifically using the idle stream processors. Now, we all know that would not be enough to implement ray tracing, which is why I'll go one step further with you. All the power that is used to do traditional shading techniques, that all comes free when you turn those off to do ray tracing instead. In other words, by lowering the amount of traditional rendering, you free up resources and thus more stream processors to increase ray tracing performance. And remember that AMD's cards are considered inefficient at those types of rendering techniques....

Also, the issue that nVidia mentions about thousands of instructions needed with Pascal for the ray tracing calculations... That is relevant because nVidia didn't have a hardware scheduler. AMD has multiple hardware schedulers in their cards, making that also a moot point. They can handle the stream of instructions and assign them efficiently to any idling stream processor through the ACEs. So I repeat... Stop looking at everything through the lens of nVidia.

That does not mean that an ASIC specifically for ray tracing would not help AMD's cards. ASICs for ray tracing would help everyone and everything. One can even put them on CPU cores if one so desires and eliminate the need for a ray tracing GPU. But, making use of those idling stream processors in AMD cards is practically a necessity before they go there. Why would they put additional hardware in there, while over 20% of the current compute power is not being used? Everything is already there to harness that power, and ray tracing is one of the best suited techniques to leverage GCN.
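
For the record, here's the kind of inner-loop work the RT core ASIC actually replaces: the ray/AABB "slab" test that BVH traversal runs for millions of nodes per frame. This is a hypothetical sketch of the textbook test, not nVidia's or AMD's implementation; done in shader code it costs tens of instructions per node, which is exactly the kind of load either fixed-function hardware or idle stream processors can absorb:

Code:

#include <cstdio>
#include <cuda_runtime.h>

struct Ray  { float ox, oy, oz, dx, dy, dz; };
struct Aabb { float lo[3], hi[3]; };

// The "slab" test: intersect the ray with each pair of bounding planes and
// keep the overlapping interval. An RT core does this in dedicated silicon.
__device__ bool rayHitsBox(const Ray& r, const Aabb& b) {
    float o[3] = {r.ox, r.oy, r.oz}, d[3] = {r.dx, r.dy, r.dz};
    float tmin = 0.f, tmax = 1e30f;
    for (int i = 0; i < 3; ++i) {
        float inv = 1.f / d[i];
        float t0 = (b.lo[i] - o[i]) * inv;
        float t1 = (b.hi[i] - o[i]) * inv;
        if (t0 > t1) { float t = t0; t0 = t1; t1 = t; }
        tmin = t0 > tmin ? t0 : tmin;
        tmax = t1 < tmax ? t1 : tmax;
    }
    return tmin <= tmax;
}

__global__ void testKernel(bool* out) {
    Ray r = {0.f, 0.f, 0.f, 0.f, 0.f, -1.f};            // ray down -z
    Aabb box = {{-1.f, -1.f, -4.f}, {1.f, 1.f, -2.f}};  // box in front of it
    *out = rayHitsBox(r, box);
}

int main() {
    bool* out;
    cudaMallocManaged(&out, sizeof(bool));
    testKernel<<<1, 1>>>(out);
    cudaDeviceSynchronize();
    printf("hit: %d\n", *out);  // expect 1
    cudaFree(out);
    return 0;
}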

mouacyk 03-19-2019 10:43 AM

Quote:

Originally Posted by NightAntilli (Post 27898388)
WOT

I suppose it makes sense to leverage AMD's abundantly idle hardware. Why was AMD idling for so long? It can't be that they don't know how to program their own hardware and had to wait for Crytek?

ToTheSun! 03-19-2019 10:46 AM

Quote:

Originally Posted by NightAntilli (Post 27898388)
Spoiler!

Sure, but how does any of that relate to the fact that some people seem to be implying RT cores are not doing anything of importance and/or impact? You do say so yourself, but Mand12 and I were replying specifically to someone who was implying otherwise.

Now, the extra compute that modern AMD cards have, which no one is trying to obscure, can be used for compute-heavy workloads and, in that way, run applications more efficiently than Pascal cards. Because of this, on the cusp of raytracing integration, nVidia decided to add ASICs. That's why Turing cards are better than previous gens at raytracing and denoising.

AMD is on the record saying they don't believe their current hardware would produce satisfactory performance for real-time raytracing, even with all the extra compute capability their cards have over Pascal.

So, how does all of this not give RT cores legitimacy to exist in the current paradigm of development? You agree with my view, but, again, the original comments were not directed at you.

NightAntilli 03-19-2019 12:04 PM

Quote:

Originally Posted by ToTheSun! (Post 27898410)
Sure, but how does any of that relate to the fact that some people seem to be implying RT cores are not doing anything of importance and/or impact? You do say so yourself, but Mand12 and I were replying specifically to someone who was implying otherwise.

Now, the extra compute that modern AMD cards have, which no one is trying to obscure, can be used for compute-heavy workloads and, in that way, run applications more efficiently than Pascal cards. Because of this, on the cusp of raytracing integration, nVidia decided to add ASICs. That's why Turing cards are better than previous gens at raytracing and denoising.

AMD is on the record saying they don't believe their current hardware would produce satisfactory performance for real-time raytracing, even with all the extra compute capability their cards have over Pascal.

So, how does all of this not give RT cores legitimacy to exist in the current paradigm of development? You agree with my view, but, again, the original comments were not directed at you.

Does anything other than the RTX 2080 Ti give satisfactory performance for real-time ray tracing? And it's still hybrid ray tracing... So... Yeah. I can fully understand why AMD makes such a statement. AMD does not have many resources, and they are not going to put resources into this, especially when nVidia themselves are disappointed by the sales of the RTX cards. And it makes sense that they don't want to segment their GPUs, with some being capable and some not being capable. Consumers have enough aversion against them as it is.

Quote:

Originally Posted by mouacyk (Post 27898408)
I suppose it makes sense to leverage AMD's abundantly idle hardware. Why was AMD idling for so long? It can't be that they don't know how to program their own hardware and had to wait for Crytek?

My answer above to ToTheSun applies to your question as well. Radeon Rays has been a thing since at least 2015... But it was simply not adopted for games. And hardware is still not really ready. Look at what RTX is doing. In BFV, RT was done specifically for reflections. In Metro Exodus, it was done specifically for global illumination. What would happen if you want to put those two together? Or more of them, like an ambient occlusion effect? No cards can do that as of now.

mouacyk 03-19-2019 12:16 PM

Makes sense AMD doesn't want to jump in the deep end. It's not ripe for the picking yet. Gotcha. (holy $1300 for acceptable partial hybrid ray tracing ... yeah)

WannaBeOCer 03-19-2019 12:31 PM

Quote:

Originally Posted by NightAntilli (Post 27898530)
Does anything other than the RTX 2080 Ti give satisfactory performance for real-time ray tracing? And it's still hybrid ray tracing... So... Yeah. I can fully understand why AMD makes such a statement. AMD does not have many resources, and they are not going to put resources into this, especially when nVidia themselves are disappointed by the sales of the RTX cards. And it makes sense that they don't want to segment their GPUs, with some being capable and some not being capable. Consumers have enough aversion against them as it is.


My answer above to ToTheSun applies to your question as well. Radeon Rays has been a thing since at least 2015... But it was simply not adopted for games. And hardware is still not really ready. Look at what RTX is doing. In BFV, RT was done specifically for reflections. In Metro Exodus, it was done specifically for global illumination. What would happen if you want to put those two together? Or more of them?

nVidia releases hardware capable of gaming with ray tracing, and everyone shouts "it's not full Ray Tracing"; of course it's not. nVidia has been talking about their expectations for hybrid Ray Tracing since the early 2000s and finally released their OptiX API in 2009. It's called progress, and it cost a decent amount to create this technology. No one is hiding that it's a hybrid Ray Tracing method, since full Ray Tracing is so demanding. No one is forcing you to buy it, but games sure do look better with it.

https://devblogs.microsoft.com/direc...tx-raytracing/

Quote:

Eventually, raytracing may completely replace rasterization as the standard algorithm for rendering 3D scenes. That said, until everyone has a light-field display on their desk, rasterization will continue to be an excellent match for the common case of rendering content to a flat grid of square pixels, supplemented by raytracing for true 3D effects.
The demo that Crytek created uses SVOGI, a technique created by nVidia researcher Cyril Crassin. I'm sure there is a good reason why they dropped voxel-based for BVH, and I'm sure AMD had a say in the decision for BVH when they worked with Microsoft to add DXR.

http://on-demand.gputechconf.com/gtc...lumination.pdf
https://blog.icare3d.org/2012/06/unr...l-time-gi.html

Some input from random people: http://ompf2.com/viewtopic.php?t=166
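
For anyone wondering what "voxel based" means in practice versus BVH traversal: the basic operation is marching a ray through a voxel grid until it hits an occupied cell. Here's a hypothetical sketch of that operation (illustrative only; real SVOGI traces cones through a sparse voxel octree with filtered radiance, which is far more involved):

Code:

#include <cstdio>
#include <cuda_runtime.h>

#define N 32  // voxel grid resolution

// Fixed-step march: sample the occupancy grid until we land in a full voxel.
__global__ void march(const unsigned char* grid, int* hit) {
    float px = 0.5f, py = 0.5f, pz = 0.0f;  // origin in grid units [0,1)
    float dx = 0.f, dy = 0.f, dz = 1.f;     // direction: straight through
    float step = 0.5f / N;                  // half-voxel steps
    *hit = -1;
    for (int i = 0; i < 2 * N; ++i) {
        int vx = (int)(px * N), vy = (int)(py * N), vz = (int)(pz * N);
        if (vx >= 0 && vx < N && vy >= 0 && vy < N && vz >= 0 && vz < N &&
            grid[(vz * N + vy) * N + vx]) { *hit = vz; return; }
        px += dx * step; py += dy * step; pz += dz * step;
    }
}

int main() {
    unsigned char* grid;
    int* hit;
    cudaMallocManaged(&grid, N * N * N);
    cudaMallocManaged(&hit, sizeof(int));
    cudaMemset(grid, 0, N * N * N);
    // Occupy a single "wall" of voxels at z = 16.
    for (int y = 0; y < N; ++y)
        for (int x = 0; x < N; ++x)
            grid[(16 * N + y) * N + x] = 1;
    march<<<1, 1>>>(grid, hit);
    cudaDeviceSynchronize();
    printf("hit voxel z = %d\n", *hit);  // expect 16
    cudaFree(grid);
    cudaFree(hit);
    return 0;
}

The trade-off in one line: a voxel grid is cheap to walk and prefilter but blurs geometry into cells, while a BVH preserves exact triangles at the cost of a heavier traversal.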

NightAntilli 03-19-2019 02:05 PM

Quote:

Originally Posted by WannaBeOCer (Post 27898578)
nVidia releases hardware capable of gaming with ray tracing, and everyone shouts "it's not full Ray Tracing"; of course it's not. nVidia has been talking about their expectations for hybrid Ray Tracing since the early 2000s and finally released their OptiX API in 2009. It's called progress, and it cost a decent amount to create this technology. No one is hiding that it's a hybrid Ray Tracing method, since full Ray Tracing is so demanding. No one is forcing you to buy it, but games sure do look better with it.

https://devblogs.microsoft.com/direc...tx-raytracing/



The demo that Crytek created uses SVOGI, a technique created by nVidia researcher Cyril Crassin. I'm sure there is a good reason why they dropped voxel-based for BVH, and I'm sure AMD had a say in the decision for BVH when they worked with Microsoft to add DXR.

http://on-demand.gputechconf.com/gtc...lumination.pdf
https://blog.icare3d.org/2012/06/unr...l-time-gi.html

Some input from random people: http://ompf2.com/viewtopic.php?t=166

You'd be surprised how many times I read replies in forums where people think the RTX games are fully ray traced.

Two of your links are dead btw.

Zenairis 03-19-2019 04:17 PM

Quote:

Originally Posted by PontiacGTX (Post 27893594)
Quote:

Originally Posted by UltraMega (Post 27893584)
Well this just make Nvidia look extra silly. Branding ray tracing as an Nvidia only feature was a mistake for Nvidia that will bite them later on... since its... ya know... not.

Like physx?see how other engines or even havok achieve similar result without using nvidia propietary technology

While it's true that they achieve it to a certain extent, they are nowhere near as efficient with it. No AMD-architecture-based game has proven it can compete with Witcher 3, FFXV or Metro Exodus in terms of effects, and in the first two's case, hair, grass and physics-related effects of that sort. Sure, almost any GPU can draw rays. But can it do it in real time, at high volume, with a large number of rays? That is the question.

Majin SSJ Eric 03-19-2019 10:38 PM

Quote:

Originally Posted by Mand12 (Post 27898334)
And neither is Pascal built for it, even though another thread is talking about how their new drivers are adding it. What a shock it will be when Pascal RT sucks compared to using hardware designed for it.

The mocking is mostly what I was mocking. Fine-tuned hardware does its job better than anything else, in this field.

I suspect (with great cynicism) that the actual reason Nvidia has opened DxR up to Pascal cards in the upcoming driver updates is precisely because RTX sales are in the tank and they want to show everyone how "Horrible" Pascal is at rendering these ray-traced scenes. But, "if you want to unlock the REAL POWAHHHH of Nvidia's awesome RT implementation then all you need to do is buy one of these shiny new RTX cards and take your gaming to the next level!!!"

Look, RTX is a real hardware feature and it has real benefits over any non-hardware dedicated ray-tracing solution (even Jim from AdoredTV did a whole video in which he actually praised Nvidia for being very clever with their RTX hybrid RT strategy), but at the end of the day it doesn't really matter all that much to anyone looking for a new GPU today. Hardly any games actually support RTX (still), the only card that provides truly acceptable RTX performance costs $1200+ (the 2080 is only capable at 1080p, and even then just barely), and the few games that do support RTX don't provide enough tangible eye-candy benefits over traditional rendering to be anywhere near worth the cost to frame rates (and there are issues with Nvidia's hybrid RT solution, as demonstrated HERE).
In summation, I personally find RTX to be a fascinating technological achievement by Nvidia that probably needed more time to mature before being thrown out onto an unsuspecting GPU market. Toss in the fact that they needlessly segmented their entire GPU lineup purely around this one somewhat gimmicky feature (RTX vs GTX cards), marketed RTX by insinuating that it was the "Only" way to implement ray-tracing in games, and created RTX SKUs like the 2060 that have no hope whatsoever of delivering satisfying FPS when actually utilizing the feature, and I think it's obvious why sales have been lackluster and Nvidia has seen blowback from the community.

Hwgeek 03-20-2019 01:21 AM

Today is full with AMD on GDC:
Enabling Real-Time Light Baking Workflows in Saber Engine with AMD Radeon-Rays Library (Presented by AMD)
https://schedule.gdconf.com/session/...-by-amd/864864

8051 03-20-2019 01:50 AM

Quote:

Originally Posted by PontiacGTX (Post 27893594)
Like physx?see how other engines or even havok achieve similar result without using nvidia propietary technology

PhysX is no longer Nvidia proprietary tech, it's open-source and even supports vendor neutral GPU acceleration libraries.

ToTheSun! 03-20-2019 03:45 AM

Quote:

Originally Posted by Majin SSJ Eric (Post 27899422)
I suspect (with great cynicism) that the actual reason Nvidia has opened DxR up to Pascal cards in the upcoming driver updates is precisely because RTX sales are in the tank and they want to show everyone how "Horrible" Pascal is at rendering these ray-traced scenes. But, "if you want to unlock the REAL POWAHHHH of Nvidia's awesome RT implementation then all you need to do is buy one of these shiny new RTX cards and take your gaming to the next level!!!"

Look, RTX is a real hardware feature and it has real benefits over any non-hardware dedicated ray-tracing solution (even Jim from AdoredTV did a whole video in which he actually praised Nvidia for being very clever with their RTX hybrid RT strategy), but at the end of the day it doesn't really matter all that much to anyone looking for a new GPU today. Hardly any games actually support RTX (still), the only card that provides truly acceptable RTX performance costs $1200+ (the 2080 is only capable at 1080p, and even then just barely), and the few games that do support RTX don't provide enough tangible eye-candy benefits over traditional rendering to be anywhere near worth the cost to frame rates (and there are issues with Nvidia's hybrid RT solution, as demonstrated HERE).
Spoiler!


In summation, I personally find RTX to be a fascinating technological achievement by Nvidia that probably needed more time to mature before being thrown out onto an unsuspecting GPU market. Toss in the fact that they needlessly segmented their entire GPU lineup purely around this one somewhat gimmicky feature (RTX vs GTX cards), marketed RTX by insinuating that it was the "Only" way to implement ray-tracing in games, and created RTX SKUs like the 2060 that have no hope whatsoever of delivering satisfying FPS when actually utilizing the feature, and I think it's obvious why sales have been lackluster and Nvidia has seen blowback from the community.

I agree. Based on what I've seen from Crytek's demo, I hope that they eventually find ways to bring that graphical fidelity into games in a more efficient manner. All things considered, BFV's and Metro's were a little underwhelming, even if still very cool in absolute terms.

Can you imagine Cyberpunk 2077 with Neon Noir's look? Match made in heaven, and possibly the best justification for RT cores ever.

Cherryblue 03-20-2019 04:39 AM

Quote:

Originally Posted by 8051 (Post 27899568)
PhysX is no longer Nvidia proprietary tech, it's open-source and even supports vendor neutral GPU acceleration libraries.

I do believe only the CPU-based tech went open-source, not GPU-based PhysX; maybe a neutral vendor can add a GPU acceleration library, but they didn't get what already existed, so everything is still to be built for AMD.

Diffident 03-21-2019 03:08 PM

Quote:

Originally Posted by Cherryblue (Post 27899702)
I do believe only the CPU-based tech went open-source, not GPU-based PhysX; maybe a neutral vendor can add a GPU acceleration library, but they didn't get what already existed, so everything is still to be built for AMD.


On GitHub

Quote:

Welcome to the NVIDIA PhysX SDK source code repository. This depot includes the PhysX SDK and the Kapla Demo application.
The NVIDIA PhysX SDK is a scalable multi-platform physics solution supporting a wide range of devices, from smartphones to high-end multicore CPUs and GPUs. PhysX is already integrated into some of the most popular game engines, including Unreal Engine, and Unity3D. PhysX SDK on developer.nvidia.com.

Drake87 03-21-2019 06:48 PM

Quote:

Originally Posted by Gunderman456 (Post 27893848)
It better scare Asus, AsRock, Gigabyte, MSI, etc... etc... These greedy companies are also upping the ante; a top tier gaming mobo was ~$250 and now they are at $500-$600+. They made marijuana legal worldwide and these companies are smoking too much weed. No one notices or says anything either. Companies repeatedly collude on RAM (which affects SSD, video card and RAM prices) and HDD prices, and all we get is "bend over" as "enthusiasts" keep buying. They colluded on TVs but have never been caught for monitors; I would have thought that would have been a logical step. It's amazing how greed has affected almost every PC component and "enthusiasts" keep buying. Yes, they tell you that it's none of your business how the more-money-than-brains crowd spend their money. They're not hurting anything.

I think you have a misunderstanding of marijuana's potency. It's not PCP or meth.

Hwgeek 03-23-2019 05:00 AM

1 Attachment(s)
Looks like Radeon Rays 3.0 is available now:
And look - FP16 is supported now, so indeed Vega will use its 2:1 FP16 rate for better performance :-).
https://www.amd.com/pl/technologies/...der-developers
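
The 2:1 rate comes from packing two FP16 values into a single 32-bit register lane and operating on both at once. Here's a hypothetical CUDA sketch of the same packed-math idea using the half2 intrinsics (needs compute capability 5.3+, e.g. nvcc -arch=sm_60; illustrative only, not Radeon Rays code):

Code:

#include <cstdio>
#include <cuda_fp16.h>
#include <cuda_runtime.h>

// Two FP16 values share one 32-bit lane; one instruction operates on both.
__global__ void packedDemo(float* out) {
    __half2 a = __floats2half2_rn(1.5f, 2.0f);  // pack (1.5, 2.0)
    __half2 b = __floats2half2_rn(2.0f, 0.5f);  // pack (2.0, 0.5)
    __half2 c = __hmul2(a, b);                  // two FP16 multiplies in one op
    float2 r = __half22float2(c);
    out[0] = r.x;  // 3.0
    out[1] = r.y;  // 1.0
}

int main() {
    float* out;
    cudaMallocManaged(&out, 2 * sizeof(float));
    packedDemo<<<1, 1>>>(out);
    cudaDeviceSynchronize();
    printf("%.1f %.1f\n", out[0], out[1]);  // expect 3.0 1.0
    cudaFree(out);
    return 0;
}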

rluker5 03-23-2019 07:43 AM

Quote:

Originally Posted by ToTheSun! (Post 27899654)
I agree. Based on what I've seen from Crytek's demo, I hope that they eventually find ways to bring that graphical fidelity into games in a more efficient manner. All things considered, BFV's and Metro's were a little underwhelming, even if still very cool in absolute terms.

Can you imagine Cyberpunk 2077 with Neon Noir's look? Match made in heaven, and possibly the best justification for RT cores ever.

I'm hoping Nvidia will have a new release by then. Something faster.

ToTheSun! 03-23-2019 07:59 AM

Quote:

Originally Posted by rluker5 (Post 27904618)
I'm hoping Nvidia will have a new release by then. Something faster.

Octane needs to be the new nVidia CEO.

prjindigo 03-23-2019 11:08 AM

That's nice and all, it's about time someone actually implemented it...

but nobody's gonna use it because Crytek sues companies for obeying the terms of their license.

