Overclock.net › Forums › Industry News › Video Game News › [WCCF] HITMAN To Feature Best Implementation Of DX12 Async Compute Yet, Says AMD

[WCCF] HITMAN To Feature Best Implementation Of DX12 Async Compute Yet, Says AMD - Page 25

post #241 of 799
Thread closed for cleaning.

When it reopens, there will be no more name-calling in the thread, or you will be removed from the discussion. That includes "shill" or anything else.

This is an important topic, as it could result in a reversal of the discrete GPU market IF true.

So let's stick to having a professional discussion about it.
post #242 of 799
Reopened.
post #243 of 799
Quote:
Originally Posted by Dargonplay View Post

Yet another Gameworks title.

Textures have nothing to do with GameWorks in the game I referenced, at all.

Once again, someone blaming a product for something it has ZERO impact on. GameWorks wasn't even a factor in my statement you quoted.

Quote:
Originally Posted by Fyrwulf View Post

Uh, no. Sorry, that's not how this works. nVidia chose not to support a feature of an open and independent API in their hardware implementation, that's their fault and AMD is well within their rights to force feed crow pie to nVidia for their hubris given their relative market positions. nVidia chose to turn PhysX into a closed system by blocking out AMD card owners from utilizing it, that is also nVidia's fault and not okay given their market position.

Um, yes, yes it is how it works. AMD published the announcement; they chose the words, not me.

Second, Nvidia's hardware decision many years ago has nothing to do with ASC now, in terms of it being used. It wasn't "supported," as you say, because years ago there was no need for it; heavy compute and DX12 still won't matter for a year or two at minimum.

Your PhysX comment is also completely lacking in factual basis. Nvidia purchased PhysX; they had a financial investment in it. They did not lock out AMD. In fact, they offered to license the technology to AMD for "pennies per GPU shipped," and AMD turned them down. AMD/ATi literally wanted to use PhysX for free, even in the face of Nvidia having purchased it and having a financial interest in it.

If AMD wanted to use a technology that cost money, they needed to contribute to that cost.

As far as using Nvidia and AMD cards in one system? That was NEVER officially supported.
Edited by PostalTwinkie - 2/12/16 at 11:24am
    
post #244 of 799
Quote:
Originally Posted by BradleyW View Post

But you miss out on good-looking features, which can be implemented far more optimally compared to GameWorks.

Did you look at the algorithms used for the GameWorks stuff?

If so, I sure would like to know (I'm sure Nvidia would too) how they can be improved.
post #245 of 799
Quote:
Originally Posted by looniam View Post


god forbid that gpu manufacturers have to invest time and money to develop and promote their own technology, whether proprietary or open, to improve IQ settings in the game experience.

You're right, what was I thinking! Godrays improve image quality and the game experience, thanks for proving me wrong. Also, all games developed with GameWorks offered the best game experience ever.
Quote:
Originally Posted by looniam View Post

and btw, it's not ad hominem when you show how they were wrong.

https://en.wikipedia.org/wiki/Ad_hominem

You have only proven your ability to rant; you didn't even touch the argument itself. I can't see how that equals proving someone wrong.
Quote:
Originally Posted by looniam View Post

i'll be very clear here; sorry, but i don't care to discuss your rubbish with you.

No great loss.
Quote:
Originally Posted by looniam View Post

to be blunt, all you've done is complain and parrot/fabricate rubbish. so that's why i didn't care to reply directly to you, but to someone i know can be reasonable.

And you've done so differently.

Truth is, GameWorks only benefits Nvidia. I don't hate Nvidia; I just don't like what they're doing to this industry. I haven't seen a single GameWorks title that hasn't ended up being a broken mess at release, and my points in my previous posts about how early benchmarks of these titles affect AMD's image are still untouched.
Quote:
Originally Posted by PostalTwinkie View Post


Once again, someone blaming a product for something it has ZERO impact on. GameWorks wasn't even a factor in my statement you quoted.

It is a factor when all other GameWorks titles present the same issues. You said, and I'll quote so you remember:
Quote:
Originally Posted by PostalTwinkie View Post

Very minor, if any, visual changes between higher levels of settings, yet massive performance impacts. Call of Duty: Ghosts seemed to be really bad with this, with no discernible difference between the higher-tier settings, yet going from "High" to "Ultra" resulted in a near 50% performance hit at times.

That's what GameWorks does: Godrays, x64 tessellation on people's heads. You say this behavior is very common now, but it wasn't before GameWorks.
Quote:
Originally Posted by PostalTwinkie View Post

Very minor, if any, visual changes between higher levels of settings, yet massive performance impacts.

This just screams GameWorks and Godrays.
Edited by Dargonplay - 2/12/16 at 12:10pm
post #246 of 799
Quote:
Originally Posted by PostalTwinkie View Post

Textures have nothing to do with GameWorks in the game I referenced, at all.

Once again, someone blaming a product for something it has ZERO impact on. GameWorks wasn't even a factor in my statement you quoted.
Um, yes, yes it is how it works. AMD published the announcement, they chose the words, not me.

Second, Nvidia's hardware decision many years ago has nothing to do with ASC now, in terms of it being used. It wasn't "supported," as you say, because years ago there was no need for it; heavy compute and DX12 still won't matter for a year or two at minimum.

Your PhysX comment is also completely lacking in factual basis. Nvidia purchased PhysX; they had a financial investment in it. They did not lock out AMD. In fact, they offered to license the technology to AMD for "pennies per GPU shipped," and AMD turned them down. AMD/ATi literally wanted to use PhysX for free, even in the face of Nvidia having purchased it and having a financial interest in it.

If AMD wanted to use a technology that cost money, they needed to contribute to that cost.

As far as using Nvidia and AMD cards in one system? That was NEVER officially supported.
Your posts are always amusing.

So Nvidia NEVER hardcoded PhysX to switch off when it detected an AMD card? Was that a dream we all saw years ago? Probably... they even left a nice surprise for people who used hybrid PhysX, making games unplayable by messing up the textures.

Nvidia offered PhysX months after AMD got into Havok, which by definition is better.

Also, Nvidia's hardware decision was about money and only that. In their own blog they said they were working closely with MS on DX12 for four years, 2010-2014; they even marketed their cards as fully DX12 cards back then, which we now know is BS, as usual.
post #247 of 799
Quote:
Originally Posted by airfathaaaaa View Post

Your posts are always amusing.

So Nvidia NEVER hardcoded PhysX to switch off when it detected an AMD card? Was that a dream we all saw years ago? Probably... they even left a nice surprise for people who used hybrid PhysX, making games unplayable by messing up the textures.

Nvidia offered PhysX months after AMD got into Havok, which by definition is better.

Also, Nvidia's hardware decision was about money and only that. In their own blog they said they were working closely with MS on DX12 for four years, 2010-2014; they even marketed their cards as fully DX12 cards back then, which we now know is BS, as usual.

Did you read what I said? No, better question: what are you even talking about or saying? It doesn't make sense.

I said dual-card systems (Nvidia for PhysX, AMD for primary) were never officially supported by Nvidia. Which they weren't. That is a completely different statement from saying it couldn't be done. Also, are you agreeing with me that AMD didn't get PhysX over money? Because that is why I said AMD didn't get it.

Nvidia wanted money from AMD, and AMD didn't want to pay it. Pretty straightforward.

I honestly don't know if you are agreeing with me, or not, or what you just typed. But I look forward to clarity.
    
post #248 of 799
Quote:
Originally Posted by PostalTwinkie View Post

Did you read what I said? No, better question: what are you even talking about or saying? It doesn't make sense.

I said dual-card systems (Nvidia for PhysX, AMD for primary) were never officially supported by Nvidia. Which they weren't. That is a completely different statement from saying it couldn't be done. Also, are you agreeing with me that AMD didn't get PhysX over money? Because that is why I said AMD didn't get it.

Nvidia wanted money from AMD, and AMD didn't want to pay it. Pretty straightforward.

I honestly don't know if you are agreeing with me, or not, or what you just typed. But I look forward to clarity.
You said that Nvidia never locked out AMD cards, which is not true, since they DID lock AMD cards out (and DX12 being hardware-agnostic can totally nullify this, as we have already seen with AotS).

You remember when Nvidia offered AMD PhysX? It was seven years ago, months before we started to see the first clues about GCN 1.0. Do you seriously think any company would have said yes, considering they would have needed a back end in their hardware for that?

Nope.

Also, the fact that they chose Havok, and actually use it and push it literally everywhere, says a lot about what is going on behind the scenes.
post #249 of 799
Quote:
Originally Posted by Dargonplay View Post

Truth is, GameWorks only benefits Nvidia. I don't hate Nvidia; I just don't like what they're doing to this industry. I haven't seen a single GameWorks title that hasn't ended up being a broken mess at release, and my points in my previous posts about how early benchmarks of these titles affect AMD's image are still untouched.
i guess you don't read what you quote.
Quote:
Originally Posted by looniam View Post

the problem with AMD's performance on a game's release is they never have drivers ready, and you know this.

cheers.

funny thing: this is the ONLY forum where most of the talk is about gameworks. everyone else is actually discussing async compute and how AMD can use it to their advantage.

its nvidia's fault, ya know.
post #250 of 799
Quote:
Originally Posted by looniam View Post

the problem with AMD's performance on a game's release is they never have drivers ready, and you know this.

If you read anything at all, you wouldn't be quoting things that prove my point. AMD can't take part in a GameWorks title's development process, so how do you expect them to have optimizations ready at launch? Oh wait, they don't. I believe you can find the brain power to figure out why; you just got it backwards.