Overclock.net › Forums › Industry News › Video Game News › [WCCF] HITMAN To Feature Best Implementation Of DX12 Async Compute Yet, Says AMD

[WCCF] HITMAN To Feature Best Implementation Of DX12 Async Compute Yet, Says AMD - Page 18

post #171 of 799
Quote:
Originally Posted by NuclearPeace View Post

Firstly, anyone (including me, and including AMD) is able to obtain a license to use or modify GameWorks from NVIDIA. The whole "black box" shtick isn't just scary-sounding words meant to make people irrationally afraid of something (e.g. "death panels" and "chemtrails"); it's also false.

Secondly, how is it NVIDIA's fault that a game is released in a broken or poor state? GameWorks is a library of graphical effects, and it's not NVIDIA who created the game.

I'm sick and tired of the GameWorks arguments; do some research of your own. Or better yet, if AMD's Gaming Evolved garbage starts crippling Nvidia in the future, don't moan about it.
Quote:
Originally Posted by PostalTwinkie View Post


Get real! What is the point of even having a conversation if you are going to be dismissive of Oxide themselves just to prop up your point?

Oxide screwed up, talked big publicly, and ended up with a foot in their mouth. Nvidia said it wasn't working, Oxide disagreed, and Oxide later found out that Nvidia was right. At that point Oxide apologized, let us all know, and began working to fix it.

To be dismissive of that completely ruins the point of having a conversation.

A developer says they were pressured by Nvidia into disabling features of their game; Nvidia and the boss of that developer then end up in a back-and-forth on Twitter, and the CEO of Oxide tells Nvidia to "tread lightly". Those are the facts. Then, later down the road, a spokesperson for Oxide says they love Nvidia and everyone else and all is fine. Right on.

As you and the rest of the team-green brigade keep reminding everyone, Nvidia has 80% marketshare. Oxide is a small company; you do the math.

If Oxide was wrong, then why did Nvidia go work on a driver if it was all Oxide's problem? And where is this "apology" you speak of?
post #172 of 799
The funny part is that no one is saying the real thing.

The whole "DX12 works on AMD and doesn't work on Nvidia" story goes back to when MS was first talking about the Xbox One. DX12 is the evolution of the Xbox One API, an API built around AMD GCN cards; whoever thinks otherwise needs to wake up. And it's not like Nvidia didn't have the chance to grab the console market, but their PR department said the console margins were terrible: http://www.extremetech.com/gaming/150892-nvidia-gave-amd-ps4-because-console-margins-are-terrible

Of course, reading that in 2016 you obviously call BS, and it makes sense: the only thing that works as it's supposed to at that company is the PR team.
On top of that, we know for a fact that Nvidia was talking with MS about DX12 while it was in development, as far back as late 2013. To claim they didn't know about async is just a lie, and a stupid one.

Their own words, on their own site:

"Our work with Microsoft on DirectX 12 began more than four years ago with discussions about reducing resource overhead. For the past year, NVIDIA has been working closely with the DirectX team to deliver a working design and implementation of DX12 at GDC"
http://blogs.nvidia.com/blog/2014/03/20/directx-12/

You see, the internet never forgets, and you can never hide from the truth; eventually karma will catch up with you.
post #173 of 799
Quote:
Originally Posted by xxdarkreap3rxx View Post

I feel like it's not even worth replying to this terrible-quality post.

http://www.overclock3d.net/reviews/gpu_displays/batman_arkham_knight_amd_vs_nvidia_performance_review/7
post #174 of 799
Quote:
Originally Posted by BradleyW View Post

Yeah, my comment was in regard to DX11 and how it limits GCN. That's a fact.
GCN is technically better, but Nvidia's arch is suited to DX11. DX11 chokes GCN; Nvidia does better in DX11 in general.
That wouldn't be a problem if AMD wanted to fix it, but if DX12 is going to improve the game development industry, then it's fine as it is.
post #175 of 799
Quote:
Originally Posted by DNMock View Post

This is true, GameWorks runs like trash, period. It doesn't matter if you're using an Intel iGPU, an AMD card, or an Nvidia card: turn on a GameWorks feature and you're going to get hammered with a ~25% reduction in FPS. I run dual T-X GPUs and still turn off most Trashworks features because they destroy my FPS.

Amen.

Quote:
Originally Posted by GoLDii3 View Post

I can speak for Fallout 4: performance is crap with Godrays on. I shouldn't have any issues running that game on my 980, yet if I enable Godrays I do. Pisses me off.
Edited by criminal - 2/11/16 at 2:03pm
Super P's rig (20 items)
CPU: 5960x | Motherboard: ASUS X99-A II | Graphics: Asus GTX 1080 Ti | RAM: Corsair Vengeance DDR4 3000
Hard Drives: MyDigitalSSD BPX NVMe, Samsung 850 EVO, Seagate Momentus XT 500 GB | Optical Drive: External DVDRW
Cooling: EK-XLC Predator 240, Swiftech 240mm Radiator | OS: Windows 10 | Monitor: Samsung 40" 4K UN40KU6290
Keyboard: G710+ | Power: EVGA SuperNOVA 850G2 | Case: Fractal Design Define S | Mouse: G700s
Mouse Pad: Vipamz Extended XXXL | Audio: Asus U7, M-Audio AV40, Sennheiser HD 439
post #176 of 799
Thread Starter 
Quote:
Originally Posted by ZealotKi11er View Post

The difference here, even if async were AMD-only, is that async increases performance, which allows better effects, while GameWorks enables better effects at the cost of performance. The reason we don't see much DX12 is, sadly, Nvidia. They are just not ready for it, so they will delay it as much as possible. If Nvidia had GCN as their architecture right now, with the same marketshare, you would see DX12 in every game. They have certainly shown us how dedicated they can be with GameWorks.

I said all of this last year when Ark: Survival Evolved's DX12 patch got delayed. On Nvidia, DX11 did better than DX12.
post #177 of 799
Quote:
Originally Posted by PontiacGTX View Post

That wouldn't be a problem if AMD wanted to fix it, but if DX12 is going to improve the game development industry, then it's fine as it is.

Do you know what RTG would have to do in order to fix the DX11 overhead they suffer? They'd have to develop a heavily modified version of GCN, or scrap it altogether, and the very structure of their drivers would then have to be rebuilt from the ground up. Starting anew would take a few years, and they don't have the time or money to do it. And why would they, if low-level APIs may become the standard?
X79-GCN (22 items)
CPU: Intel 3930K 4.5GHz HT | Motherboard: GIGABYTE GA-X79-UP4 | Graphics: AMD R9-290X | RAM: GEil Evo Potenza DDR3 2400MHz CL10 (4x4GB)
Hard Drive: Samsung 840 Pro 120GB | OS: Win 10 x64 Pro
Cooling: EK Supremacy (CPU), NF F12's P/P (360 Rad), NF A14's (420 Rad), XSPC Chrome Compression Fittings, EK RES X3 150, Primochill PremoFlex Advanced LRT Clear 1/2 ID, EK-FC (R9 290X), EK D5 Vario Top-X, Phobya G-Changer V2 360mm, Phobya G-Changer V2 420mm
Monitor: BenQ XR3501 35" Curved | Keyboard: Corsair Vengeance K90 | Power: Seasonic X-1250 Gold (v2) | Case: Corsair 900D
Mouse: Logitech G400s | Audio: Senn HD 598
post #178 of 799
Quote:
Originally Posted by BradleyW View Post

Do you know what RTG would have to do in order to fix the DX11 overhead they suffer? They'd have to develop a heavily modified version of GCN, or scrap it altogether, and the very structure of their drivers would then have to be rebuilt from the ground up. Starting anew would take a few years, and they don't have the time or money to do it. And why would they, if low-level APIs may become the standard?
But they can improve DirectX 11 MT/ST performance; they have been doing that since Win 10 was released.
post #179 of 799
Quote:
Originally Posted by PontiacGTX View Post

But they can improve DirectX 11 MT/ST performance; they have been doing that since Win 10 was released.

WDDM 2.0 was the reason for their slight boost; RTG also made driver improvements on top of it. Overall, this gave RTG an average increase of at most about 2 FPS in some CPU-bound titles. That's the best they can do currently.
post #180 of 799
The fact of the matter is, we'll only have a few games using async compute, and they'll come from companies with good relations with AMD. That in itself is sad, because async compute is the future and a better alternative to what we have; everyone should be using it now, since it's the most efficient way to schedule shader work. But because Nvidia has such a huge marketshare, they'll have the influence and power to delay progress in this industry as much as they want, until they take their sweet time integrating async compute into Volta.
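For anyone wondering what async compute actually buys: it lets the GPU run an independent compute pass (say, SSAO or particle simulation) on shader units that would otherwise sit idle while a graphics pass is in flight, instead of running the two back to back. Here's a toy CPU-side sketch in Python — an analogy only, not real GPU code; the sleeps just stand in for the two passes:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Toy analogy: the sleeps stand in for GPU work. Async compute lets an
# independent compute pass run concurrently with a graphics pass instead
# of waiting for it to finish.

def graphics_pass():
    time.sleep(0.2)  # pretend this is shadow-map rendering

def compute_pass():
    time.sleep(0.2)  # pretend this is an independent compute job (SSAO, etc.)

# Serial: compute waits for graphics to finish (~0.4 s total).
t0 = time.perf_counter()
graphics_pass()
compute_pass()
serial = time.perf_counter() - t0

# Overlapped: both passes in flight at once (~0.2 s total).
t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=2) as pool:
    f1 = pool.submit(graphics_pass)
    f2 = pool.submit(compute_pass)
    f1.result()
    f2.result()
overlapped = time.perf_counter() - t0

print(f"serial: {serial:.2f}s, overlapped: {overlapped:.2f}s")
```

The catch, and the whole argument in this thread, is that the overlap only pays off if the hardware scheduler can actually interleave the two workloads cheaply; if it can't, you pay the scheduling overhead and gain nothing.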

In the meantime, let more games be praised for x64 tessellation on every character's hair, which looks exactly the same as x8 but is implemented that way for reasons you can only imagine, in games like Fallout 4 that look like crap and run like crap.

This makes me sad.
Edited by Dargonplay - 2/11/16 at 2:19pm