
[WCCF] HITMAN To Feature Best Implementation Of DX12 Async Compute Yet, Says AMD - Page 16

post #151 of 799
Quote:
Originally Posted by Dargonplay View Post

What are you smoking, sir? That is certainly something strong. Battlefield 4 is probably one of the best-optimized games in recent history; it had NETWORK problems at release, which have nothing to do with performance problems in this known universe. Thanks for proving my point.

I think that was just a typo. BF3 was an abortion on launch. BF4 was bad, but not terrible, except the single-player campaign, which was just a broken mess.
post #152 of 799
Quote:
Originally Posted by xxdarkreap3rxx View Post

Could have sworn it had some bad memory leaks too. Unless that was 3. Or another game.
There were memory leak issues, but those were due to bad AMD drivers at the time.
post #153 of 799
Quote:
Originally Posted by xxdarkreap3rxx View Post


Gotta hover over me and select "Follow Member"

fair enough
Quote:
Originally Posted by xxdarkreap3rxx View Post

I was referencing Nvidia blocking driver initialization when AMD cards are present to avoid "SLI" between say a 980 Ti and Fury X. Pretty sure that's what the original context was.

look right now i choose to live in the land of magic and unicorns where AMD and NV will cohabitate in peace and harmony.
post #154 of 799
Quote:
Originally Posted by looniam View Post

look right now i choose to live in the land of magic and unicorns where AMD and NV will cohabitate in peace and harmony.

I am pretty sure it is just their respective users that don't cohabitate in peace and harmony.
post #155 of 799
Quote:
Originally Posted by NuclearPeace View Post

Then I don't see the issue. If an effect, GameWorks or not, doesn't run well on your rig, it's common sense to just turn it off. I don't see why people have to go on internet message boards and whine endlessly that it's NVIDIA's fault that their computer can't handle it.

I explain the "issue" in the part of my post that you omitted. What's the point in replying to only one aspect of my post?
post #156 of 799
Quote:
Originally Posted by looniam View Post

look right now i choose to live in the land of magic and unicorns where AMD and NV will cohabitate in peace and harmony.

post #157 of 799
So many posts, so little progress. Here's my understanding. We can go back and forth over the positive and negative attributes of RTG and Nvidia, but one thing's certain: Nvidia's current line-up of GPUs is well suited to limited APIs, and their drivers are highly tuned for that. As for RTG, the GCN architecture has a very deep and complex pipeline with a ton of available processing cores/shaders. That's well suited to a low-level API, which allows the whole GPU to be fed with data properly and optimally. DX11 is very limiting to GCN and stops much of the GPU from ever being utilized. That's why Nvidia wins the benchmarks despite GCN being the far superior architecture in terms of available power.
Edited by BradleyW - 2/11/16 at 1:30pm
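(Not from the thread, just for anyone curious what "feeding the whole GPU" through a low-level API actually looks like: a rough D3D12 async-compute sketch. A second COMPUTE queue is created next to the normal DIRECT/graphics queue, compute work is kicked off on it, independent graphics work overlaps it, and only the pass that consumes the results waits on a fence. Device creation, command-list recording, and all the names here are placeholders, not anyone's shipping code.)

Code:
// Rough D3D12 async-compute sketch (assumed names; setup omitted).
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Created once at startup: one graphics (DIRECT) queue and one COMPUTE queue.
// On GCN the COMPUTE queue maps onto the async compute engines, so its
// dispatches can overlap graphics work instead of queuing behind it.
ComPtr<ID3D12CommandQueue> CreateQueue(ID3D12Device* device, D3D12_COMMAND_LIST_TYPE type)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = type;                                  // DIRECT or COMPUTE
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue;
}

// Per frame: all command lists are assumed to be recorded and Close()d already.
void SubmitFrame(ID3D12CommandQueue* gfxQueue,
                 ID3D12CommandQueue* computeQueue,
                 ID3D12Fence* fence, UINT64 fenceValue,
                 ID3D12GraphicsCommandList* asyncComputeList,  // e.g. SSAO / light culling
                 ID3D12GraphicsCommandList* shadowGbufferList, // graphics work with no dependency
                 ID3D12GraphicsCommandList* lightingList)      // consumes the compute results
{
    // 1. Kick the compute work on its own queue.
    ID3D12CommandList* c[] = { asyncComputeList };
    computeQueue->ExecuteCommandLists(1, c);
    computeQueue->Signal(fence, fenceValue);           // "compute done" on the GPU timeline

    // 2. Graphics work that doesn't need the compute results runs concurrently.
    ID3D12CommandList* g0[] = { shadowGbufferList };
    gfxQueue->ExecuteCommandLists(1, g0);

    // 3. Only the pass that reads the compute output waits, and it waits
    //    queue-to-queue on the GPU, not on the CPU.
    gfxQueue->Wait(fence, fenceValue);
    ID3D12CommandList* g1[] = { lightingList };
    gfxQueue->ExecuteCommandLists(1, g1);
}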
post #158 of 799
Quote:
Originally Posted by MoorishBrutha View Post

You dudes act like you all don't have a search engine or something. It's well-known that Nvidia is using that 80% to bully developers not to use Async Compute:
http://www.dsogaming.com/news/oxide-developer-nvidia-was-putting-pressure-on-us-to-disable-certain-settings-in-the-benchmark/

Try and keep up...

http://www.overclock.net/t/1569897/various-ashes-of-the-singularity-dx12-benchmarks/2120_20#post_24379702

In that post, Kollock from Oxide states:
Quote:
Originally Posted by Kollock 
We actually just chatted with Nvidia about Async Compute, indeed the driver hasn't fully implemented it yet, but it appeared like it was. We are working closely with them as they fully implement Async Compute. We'll keep everyone posted as we learn more.

There was no pressure from Nvidia. Nvidia stated things weren't working, Oxide thought they were, it went back and forth, and it turns out they really weren't working. So Oxide began to work closely with Nvidia to correct it.
Edited by PostalTwinkie - 2/11/16 at 1:34pm
post #159 of 799
Quote:
Originally Posted by xxdarkreap3rxx View Post

People are stupid as hell. "Ultra" or "Very High" are the same. GameWorks is just extra-pretty graphics if you have the power, whether that's at launch or when replaying the game years later. Personally, I love GameWorks. I can go back to a game in 3 years with a more powerful card and play on settings even higher than what I originally played at.

So it's normal for a single lighting effect (god rays) or the hair of a single character (HairWorks) to eat up over 20% of the entire GPU processing budget? If The Witcher 3 had been made with the same level of optimization as the GameWorks stuff, you might be able to get Geralt to run around in a blank white environment at 4K 30 fps with 4-way Titan X SLI running.

Heck, Square Enix was doing HairWorks-level detail on a PS3 in their games, so don't tell me it's just super detailed and that amazing. It's just optimized like trash, pure and simple.

Just curious: did you buy Arkham Knight on PC and just run it at low settings because, hey, in 3 years you might be able to turn it up to high, and praise the quality of the work done?
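(Side note, not from the thread: a per-effect cost like that 20% figure is something a developer can actually measure. Here's a rough D3D12 sketch of timing a single pass with timestamp queries; the query heap, readback buffer, fence sync, and the effect's draw calls are assumed to exist elsewhere, and the names are placeholders.)

Code:
// Rough sketch: timing one pass (e.g. a god-ray or hair pass) with D3D12
// timestamp queries. Resource/heap creation and fence sync are assumed.
#include <d3d12.h>

// Record two timestamps around the effect and resolve them into a readback buffer.
void RecordEffectTiming(ID3D12GraphicsCommandList* cmdList,
                        ID3D12QueryHeap* timestampHeap,   // TIMESTAMP query heap, Count >= 2
                        ID3D12Resource* readback)         // READBACK buffer, >= 2 * sizeof(UINT64)
{
    cmdList->EndQuery(timestampHeap, D3D12_QUERY_TYPE_TIMESTAMP, 0);
    // ... the effect's draw/dispatch calls go here ...
    cmdList->EndQuery(timestampHeap, D3D12_QUERY_TYPE_TIMESTAMP, 1);
    cmdList->ResolveQueryData(timestampHeap, D3D12_QUERY_TYPE_TIMESTAMP, 0, 2, readback, 0);
}

// After the frame's command list has finished on the GPU (fence-synced elsewhere),
// convert the tick delta to milliseconds.
double ReadEffectMs(ID3D12CommandQueue* queue, ID3D12Resource* readback)
{
    UINT64 freq = 0;
    queue->GetTimestampFrequency(&freq);                  // ticks per second for this queue

    UINT64* ticks = nullptr;
    D3D12_RANGE readRange = { 0, 2 * sizeof(UINT64) };
    readback->Map(0, &readRange, reinterpret_cast<void**>(&ticks));
    double ms = double(ticks[1] - ticks[0]) * 1000.0 / double(freq);
    readback->Unmap(0, nullptr);
    return ms;   // ~3.3 ms of a 16.7 ms frame (60 fps) would be the "20% of budget" case
}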
post #160 of 799
Quote:
Originally Posted by BradleyW View Post

So many posts, so little progress. Here's my understanding. We can go back and forth over the positive and negative attributes of RTG and Nvidia, but one thing's certain: Nvidia's current line-up of GPUs is well suited to limited APIs, and their drivers are highly tuned for that. As for RTG, the GCN architecture has a very deep and complex pipeline with a ton of available processing cores/shaders. That's well suited to a low-level API, which allows the whole GPU to be fed with data properly and optimally. DX11 is very limiting to GCN and stops much of the GPU from ever being utilized. That's why Nvidia wins the benchmarks despite GCN being the far superior architecture in terms of available power.

Pretty much this.

I'll say this again since it got buried earlier: imagine how AMD must have felt, watching their hardware get mocked for power consumption because all that complex hardware was guzzling watts while no developers were actually using it. They were so desperate that they eventually went as far as to create their own API. That's what happens when you pretty much have a monopoly in the form of DirectX.