Overclock.net › Forums › Industry News › Video Game News › [WCCF] HITMAN To Feature Best Implementation Of DX12 Async Compute Yet, Says AMD

[WCCF] HITMAN To Feature Best Implementation Of DX12 Async Compute Yet, Says AMD - Page 23

post #221 of 799
Quote:
Originally Posted by Lex Luger View Post

The AMD conspiracy theorists really need to cool it when it comes to gameworks. Would you rather nvidia go back to physx only where the option to enable them is grayed out?

Do you have any ACTUAL proof that gameworks titles intentionally cripple AMD hardware even when all gameworks effects are disabled?

If async compute and DX12 give AMD a big advantage over Kepler and Maxwell, I would be happy for AMD, and I wouldn't shout CONSPIRACY, CONSPIRACY from the rooftops on every thread relating to games and AMD GPUs for the next 5 years like some of the people on these forums.

YOU HAVE NO PROOF! NONE! ZIP! Deal with it. ;)

Well, here's a fact: AMD has stated countless times that they can't optimize anything for GameWorks titles during the development process. Nvidia is the only one who can, and Nvidia doesn't let developers add AMD's own code to optimize the game for Radeon cards. AMD has to make those optimizations through their drivers instead of shipping them with the game, which means the game has to launch, and only after it is available to everyone can AMD optimize their drivers for it. In every GameWorks title, AMD loses the early performance war miserably but always catches up later. The thing is, from a marketing standpoint... late game doesn't matter.

When a game comes out, every website benchmarks the soul out of it; only a few (minor) sites benchmark the same game again months or even years after release. Once the early benchmarks for GameWorks titles are out, AMD's reputation gets tainted for the remainder of that generation, and people will always see it as the 2nd-tier offering, even when it is in fact the best-performing offering at the time (late game). Also, when casual players looking to upgrade see these "late game benchmarks," they often disregard them because they aren't coming from big sites like TechSpot or AnandTech, who only benchmark games when they're freshly out on the market.

After seeing Nvidia's record of business decisions, will you really say the idea of Nvidia doing this on purpose, gaining market share by hampering AMD, is far-fetched? The tessellation scandal, HairWorks effects, God rays, PhysX, GameWorks... they all have something in common: they hurt AMD more than they hurt Nvidia, and they grant Nvidia the performance crown when games are releasing, which in the end is all that matters for newcomers and people looking to upgrade, who just do a few benchmark searches when deciding on a new video card.
Quote:
Originally Posted by Unkzilla View Post

who really cares about how current cards are going to run in the future? I'm going to be dumping my GPU on ebay quick smart and will be ready to upgrade - to either side

This right here is what I'm talking about.
Edited by Dargonplay - 2/12/16 at 1:58am
post #222 of 799
Quote:
Originally Posted by Unkzilla View Post

A lot of wasted energy here

If we are to believe how much performance increase the new GPUs (from both sides) are going to offer, who really cares about how current cards are going to run in the future? I'm going to be dumping my GPU on ebay quick smart and will be ready to upgrade - to either side

By the time these titles release in any meaningful quantity, it's probably going to be a debate of 25 vs 30 fps on current-gen hardware

Many current cards support DX12. Early benchmarks show significant performance improvement. To quote the vice prez, 'this is a big deal'.

post #223 of 799
Quote:
Originally Posted by keikei View Post

Many current cards support DX12. Early benchmarks show significant performance improvement. To quote the vice prez, 'this is a big deal'.

That VP9 stream is so real.
post #224 of 799
Quote:
Originally Posted by NightAntilli View Post

It was just PR, because everyone who has tested the architectures has confirmed that GCN can do async compute with major performance gains. On paper nVidia's architecture can too, but in practice the benefits of asynchronicity are nullified by the high delay of the required context switch. Implementing it through drivers for nVidia will not do anything other than change the way things are rendered; it won't make anything faster.

We aren't talking about whether Nvidia can do it via hardware or not; we know they can't.

We are talking about the Nvidia-specific code that Oxide thought was working (it was a false positive), which was pointed out by Nvidia with a bit of disagreement from Oxide. However, it was fixed once they verified the false positive.
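The context-switch argument above can be illustrated with a toy timing model (the millisecond figures and the time-sliced scheduling are purely hypothetical for illustration, not measured GPU behavior): if graphics and compute work truly overlap, frame time approaches the longer of the two workloads, but if the GPU instead alternates between them and pays a fixed switch cost at every boundary, the "async" path can end up slower than plain serial execution.

```python
# Toy model of async compute scheduling. Illustrative only: the numbers
# below are made up and do not represent any real GPU's timings.

def serial_time(gfx_ms: float, comp_ms: float) -> float:
    """Graphics then compute, back to back (no async at all)."""
    return gfx_ms + comp_ms

def overlapped_time(gfx_ms: float, comp_ms: float) -> float:
    """Ideal async compute: both workloads run fully concurrently,
    so the frame costs only the longer of the two."""
    return max(gfx_ms, comp_ms)

def time_sliced_time(gfx_ms: float, comp_ms: float,
                     switch_ms: float, slices: int = 8) -> float:
    """'Async' via time-slicing: the GPU alternates between the two
    workloads, paying a fixed context-switch cost at each of the
    2 * slices queue boundaries."""
    return gfx_ms + comp_ms + 2 * slices * switch_ms

gfx, comp = 10.0, 4.0   # hypothetical per-frame workloads (ms)

print(serial_time(gfx, comp))        # 14.0 ms
print(overlapped_time(gfx, comp))    # 10.0 ms -> true async pays off
print(time_sliced_time(gfx, comp, switch_ms=0.3))  # ~18.8 ms -> slower than serial
```

The point of the sketch is only the inequality: concurrent execution beats serial, while serialized execution plus per-switch overhead loses to both, which matches the claim that a driver-level workaround cannot recover the hardware win.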
post #225 of 799
Quote:
Originally Posted by Dargonplay View Post

Well, here's a fact: AMD has stated countless times that they can't optimize anything for GameWorks titles during the development process. Nvidia is the only one who can, and Nvidia doesn't let developers add AMD's own code to optimize the game for Radeon cards. AMD has to make those optimizations through their drivers instead of shipping them with the game, which means the game has to launch, and only after it is available to everyone can AMD optimize their drivers for it. In every GameWorks title, AMD loses the early performance war miserably but always catches up later. The thing is, from a marketing standpoint... late game doesn't matter.

When a game comes out, every website benchmarks the soul out of it; only a few (minor) sites benchmark the same game again months or even years after release. Once the early benchmarks for GameWorks titles are out, AMD's reputation gets tainted for the remainder of that generation, and people will always see it as the 2nd-tier offering, even when it is in fact the best-performing offering at the time (late game). Also, when casual players looking to upgrade see these "late game benchmarks," they often disregard them because they aren't coming from big sites like TechSpot or AnandTech, who only benchmark games when they're freshly out on the market.

After seeing Nvidia's record of business decisions, will you really say the above is far-fetched? The tessellation scandal, HairWorks effects, God rays, PhysX, GameWorks... they all have something in common: they hurt AMD more than they hurt Nvidia, and they grant Nvidia the performance crown when games are releasing, which in the end is all that matters for newcomers and people looking to upgrade, who just do a few benchmark searches when deciding on a new video card.
This right here is what I'm talking about.

Well said. Even if AMD can optimize, they have to spend more time and money to optimize for Nvidia IP. Also, I'd be much better off without GameWorks, with just PhysX like before.
post #226 of 799
Quote:
Originally Posted by BradleyW View Post

The difference in IQ between low and ultra God rays was around 1%, at the cost of 20+ frames.

I don't even see a 1% IQ difference in those screenshots above. They are literally 100% the same image between God rays low and ultra! Pretty scandalous right there...
post #227 of 799
Quote:
Originally Posted by Lex Luger View Post

The AMD conspiracy theorists really need to cool it when it comes to gameworks. Would you rather nvidia go back to physx only where the option to enable them is grayed out?


Short answer.... Yes.
post #228 of 799
Quote:
Originally Posted by Majin SSJ Eric View Post

I don't even see a 1% IQ difference in those screenshots above. They are literally 100% the same image between God rays low and ultra! Pretty scandalous right there...

Actually, they might as well be called God rays, because you just have to take their existence on faith... Oh wait
post #229 of 799
Quote:
Originally Posted by Majin SSJ Eric View Post

I don't even see a 1% IQ difference in those screenshots above. They are literally 100% the same image between God rays low and ultra! Pretty scandalous right there...

This is more common than people want to admit.

Very minor, if any, visual changes between the higher levels of settings, yet massive performance impacts. Call of Duty: Ghosts seemed really bad about this; there was no discernible difference between the higher-tier settings, yet going from "High" to "Ultra" resulted in a near-50% performance hit at times.
post #230 of 799
Quote:
Originally Posted by PostalTwinkie View Post

This is more common than people want to admit.

Very minor, if any, visual changes between the higher levels of settings, yet massive performance impacts. Call of Duty: Ghosts seemed really bad about this; there was no discernible difference between the higher-tier settings, yet going from "High" to "Ultra" resulted in a near-50% performance hit at times.

Yet another Gameworks title.