[DSoG]Report: Total War: WARHAMMER runs 27% slower in DX12 on NVIDIA’s hardware - Page 28

post #271 of 360
The 1080's price is entirely AMD's fault. :)

Only half joking.
post #272 of 360
Quote:
Originally Posted by ChevChelios View Post

The 1080's price is entirely AMD's fault. :)

Only half joking.
While it's not AMD's fault at all, it is AMD's loss. Now they have to top the 1080 in some way, be it price or performance. If they had released a $600-700 card in Q2/Q3, they'd be the ones counting the profits, not NVIDIA.
post #273 of 360
It's probably a combination of:

(1) they wanted to "attack" the mainstream segment first for more volume (the 1060 got in the way, though)

(2) 14nm LPP FinFET couldn't produce a bigger, higher-end chip with enough efficiency as quickly as TSMC's 16nm could, so there was no choice but to lead with Polaris
post #274 of 360
Quote:
Originally Posted by ChevChelios View Post

It's probably a combination of:

(1) they wanted to "attack" the mainstream segment first for more volume (the 1060 got in the way, though)

(2) 14nm LPP FinFET couldn't produce a bigger, higher-end chip with enough efficiency as quickly as TSMC's 16nm could, so there was no choice but to lead with Polaris

Well, judging by Polaris's performance-per-watt ratio, maybe it really wasn't possible to give us a higher-end chip that early and they need more time. NVIDIA got away with a 314 mm² chip because they came out all guns blazing with the 1080's core and memory clocks.
Edited by Klocek001 - 7/15/16 at 5:20am
post #275 of 360
Quote:
Originally Posted by ChevChelios View Post

It's probably a combination of:

(1) they wanted to "attack" the mainstream segment first for more volume (the 1060 got in the way, though)

(2) 14nm LPP FinFET couldn't produce a bigger, higher-end chip with enough efficiency as quickly as TSMC's 16nm could, so there was no choice but to lead with Polaris

Please show me where I can buy a 1060 right now. It's kind of hard for a slide on the net to get in the way of a high-volume GPU that's selling right now. And even then, the 1060 looks like it'll cost more and perform about the same, maybe a tiny bit better in DX11 but definitely worse in DX12. Sadly, even if the 1060 only matches the 480 and costs more, it will likely still outsell the 480.

Also, wasn't Polaris-first known for a long time? It's not like they were forced to make this move; it had been planned for quite some time...
post #276 of 360
Quote:
Originally Posted by Klocek001 View Post

I can't really comprehend this way of thinking... so 150 fps in DX12 is great but 150 fps in DX11 is not necessary? And 150 fps in DX11 is only for bragging, but 150 fps in DX12 isn't?
Not sure if you're trolling or not comprehending the point. In any case, I never said that 150 FPS would be great under Vulkan and not great under DX11. That's your own misreading of my statement. The point is that whether you have 100 fps or 150 fps under DX11, both are equally playable, especially if you factor in FreeSync/Gsync. But DX11 is slowly being phased out. Having 150 fps under the upcoming APIs is a testament to the longevity of the card.
Quote:
Originally Posted by Klocek001 View Post

So you'd rather have 150 fps in one Vulkan game and 100 fps in all the other ones, instead of 150 fps in all DX11 games but 100 fps in one Vulkan game?
Actually, yes. Why? Because Vulkan is the future API, while DX11 will slowly be phased out. 100 fps is perfectly playable; if it weren't, you would have a point. But since it is perfectly playable, and we know DX12/Vulkan are the APIs that are growing, I see it as stupid to go for something that plays old games at 150 fps and new ones at 100 fps, rather than the other way around.
And aside from that, the majority of games that will be coming out fall into one of these categories:

Games that support DX11 only. These are mainly indie games that are not heavy to run anyway, so practically any okay-ish DX11 card can run them. A GTX 1080, or even a GTX 980, is probably overkill and thus a waste of money.
Games that support DX11 and DX12. When given the choice, nVidia will go for DX11 and AMD for DX12 (picking a render path takes only a simple probe; see the sketch after this list). That means if their performance is similar under DX11, as in both are playable, AMD will pull ahead under DX12. If AMD is behind, the gap will shrink or even flip. What would be the reason to go for nVidia?
Games that support DX11/OpenGL and Vulkan. Vulkan is obviously the future here, and looking at Doom as a reference, we again know what's awaiting us.
Games that only support DX12/Vulkan. DX12-only games are already here, and AMD is dominating. Games that support Vulkan also give AMD a huge boost.
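To make "when given the choice" concrete: a game that ships both render paths only needs a quick capability probe at startup. Below is a minimal, hypothetical sketch of such a probe; D3D12CreateDevice and D3D11CreateDevice are the real Windows SDK entry points, while the surrounding structure (RenderPath, PickRenderPath) is purely illustrative and not from any shipping game.

```cpp
// Hedged sketch: prefer a D3D12 device if the OS/driver can create one,
// otherwise fall back to a plain D3D11 hardware device.
#include <d3d12.h>
#include <d3d11.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "d3d11.lib")

using Microsoft::WRL::ComPtr;

enum class RenderPath { D3D12, D3D11, None };

RenderPath PickRenderPath()
{
    // nullptr = default adapter; 11_0 is the minimum feature level D3D12 accepts.
    ComPtr<ID3D12Device> dev12;
    if (SUCCEEDED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                    IID_PPV_ARGS(&dev12))))
        return RenderPath::D3D12;

    // Older OS or driver: try a plain D3D11 hardware device instead.
    ComPtr<ID3D11Device> dev11;
    if (SUCCEEDED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
                                    0, nullptr, 0, D3D11_SDK_VERSION,
                                    &dev11, nullptr, nullptr)))
        return RenderPath::D3D11;

    return RenderPath::None;
}
```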

Why would you go for the card that is known to have equal or lower performance under future APIs, rather than equal or higher? If you're only looking at the past up to right now, that's dinosaur thinking. That, or you upgrade every year to always have the faster card. That's your right, but to a lot of us it's a waste of money.

To put it differently, you have the choice between:
1) Perfect performance now, no performance boost in the near future.
2) Playable performance now, huge performance boosts in the near future.

If you have a limited amount of money, which would you go for? I would go for option 2, because of longevity. Also... look at this Doom chart, with nVidia's 700 series from 2013 at the bottom while AMD's 7000 series from 2011 sits between a GTX 960 and a GTX 970... Which would have been the better buy? The GTX 780 (Ti), because it performed better at the time, or the HD 7950/7970, because it performs better now even though it was slightly worse back then?


Quote:
Originally Posted by Klocek001 View Post

You'd buy a 144Hz display to run one game at its native refresh, instead of buying one to run all or most of your games on it?
Who's talking about 144Hz displays? The market that actually cares about 144Hz is small. You know what I'm looking/waiting for right now? A 21:9 display that supports HDR and FreeSync, with a FreeSync range of 30-75Hz. The refresh rate doesn't have to go above 75Hz for me. 120Hz is a bonus; 144Hz is overkill.
Edited by NightAntilli - 7/15/16 at 7:53am
post #277 of 360
Quote:
Originally Posted by Fuell View Post

Please show me where I can buy a 1060 right now.

There's a month's difference or less between the 1060 and the RX 480 and you're coming out like that...
I can already picture you judging people who bought 1080s/1070s six months before their AMD counterparts launched.
post #278 of 360
Reference 480s are not worth buying:

a trash cooler/heatsink, weak power delivery (a single 6-pin), throttling, and about 5% OC headroom (lol)

Custom 480s look to be coming either on the same date as custom 1060s (July 19th) or a few days to a week after that.
post #279 of 360
Quote:
Originally Posted by NightAntilli View Post

The point is that whether you have 100 fps or 150 fps under DX11, both are equally playable
lol, so why would anyone buy a Fury X if a 290 is just as playable...

Looks like you're out of ideas for making a point that would make sense today. A 50% faster card is better than the one it's compared to, not equal to it; 150 fps means 50% more performance, so would you call 40 fps vs 60 fps equal as well?

I do get your point, kind of. I just think it's not as applicable in reality as you make it seem. Let's take an example: a card that's obviously better at launch vs a card that ages better, both compared three years later. Can a three-year-old card really handle the latest games at high resolution, or is it just relatively better than the former one? Are we talking unplayable on the former and solid, playable fps on the latter, or 30 fps vs 40 fps?
Edited by Klocek001 - 7/15/16 at 5:57am
post #280 of 360
Quote:
Originally Posted by SoloCamo View Post

Not true at all...

Tons of DX11 titles will continue to be played for the next few years, and plenty will still be coming out well after DX12's release. Look how long DX9 stuck around in new titles, for that matter. I still play games on my main rig that were made before the year 2000, if that's any indication. Not that any current GPU would have a problem with them.

DX12 and Vulkan have laid the groundwork and started the landslide; it won't be long before DX11 is free-falling with it.
Not exactly feeling versed today, but here's a novel:

Let's look at DirectX 10 as an example of how little an API can end up being used.

There were a few factors that locked DX9 into a 10+ year life cycle, a lot of it having to do with the consoles. The last generation hung around for 10 years and its hardware was effectively DX9-equivalent. DX10 was an upgrade, but there was no real incentive for developers to step outside the comfort zone and spend more money developing for it in 2007 (let alone for the small audience who transitioned from XP to Vista; DX10 was exclusive to Vista). Add the fact that the consoles were just released, the hype surrounding them, and the money spent on exploring and developing for the hardware, extracting performance, etc. DX10 was completely insignificant, targeted a minute audience, and was exclusive to the PC... basically a self-sacrificial male praying mantis.

It took DX11 a solid 3 years to get a foothold from its release with Windows 7 in '09, and by then the improvements were drastic compared to how games on the dated 360 and PS3 were looking; Windows 7 was also a popular operating system. DX11 was a bit of an anomaly, given the 360/PS3 console cycle lasted far longer than a console cycle ever had; the consoles were dated, so it was worth the developers' time and effort to develop for DX11 because the labor bore fruit.

DirectX 12 and Vulkan correspond with the next-gen consoles; this is a given. Consoles hold the collar of game technology more than we'd like to think. As far as DX12 goes, Windows 10 is essentially a free operating system, and regardless of public or tech perception, "free" is free... you can fill in the blanks, based on my earlier paragraphs, about what happens next.

DirectX 11 will hang around
As history.
Spoiler: for funzies, 'cause I'm bored

DX9
- locked to an OS
+ extremely popular OS
.. initial cost negated by its rampant adoption, penetration, and shelf life, plus the timing of the console release
+ in line with console hardware/API
+ released before a console cycle
+ backlog of hardware compatibility relative to the console release (released far before the consoles)
4.5/6 (if each neutral is 0.5)

DX10
- locked to an OS
- was not a popular OS
- cost
- strafing console hardware/API
- released just after the beginning of a console cycle
- did not have a backlog of hardware compatibility
0/6

DX11
- locked to an OS
+ extremely popular OS
- cost
.. strafing console hardware (which worked for it but also hindered it)
.. released far after the console cycle (same: worked for it, but never let it completely landslide)
- did not have a backlog of compatible hardware
2/6

DX12
- locked to an OS
.. somewhat popular OS (+ will end up being popular, no doubt)
+ free
+ in line with console hardware/APIs
+ release window doesn't matter
+ has a backlog of compatible hardware (no new card needed unless it's extremely old)
4.5 (5)/6

Vulkan
+ not locked to an OS
.. OS-neutral (can be a detriment)
+ free
+ in line with console hardware/APIs
+ release window doesn't matter
+ has a backlog of compatible hardware
5.5/6 (see the sketch after this list for why "not locked to an OS" matters)
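Since "not locked to an OS" is Vulkan's big differentiator in that tally, here's a minimal hedged sketch of why it matters: the same few calls create a Vulkan instance and count capable GPUs whether you build on Windows or Linux. vkCreateInstance and vkEnumeratePhysicalDevices are the real Vulkan 1.0 API; the little program around them is just an illustration.

```cpp
// Hedged sketch: one portable code path on any OS with a Vulkan loader.
// Build with -lvulkan (Linux) or vulkan-1.lib (Windows).
#include <vulkan/vulkan.h>
#include <cstdio>

int main()
{
    // Minimal application/instance descriptions for Vulkan 1.0.
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo info{};
    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) {
        std::puts("No Vulkan loader/driver on this system.");
        return 1;
    }

    // Pass nullptr for the device array to just get the count.
    uint32_t gpuCount = 0;
    vkEnumeratePhysicalDevices(instance, &gpuCount, nullptr);
    std::printf("Vulkan-capable GPUs found: %u\n", gpuCount);

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```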

When you see free development tech from Epic, Crytek, and Unity giving developers easy access to DX12, and factor in that big developers seem to be transitioning console ports onto Vulkan and DX12, there's absolutely no reason to use DX11. Many developers claim there is little work involved in the direct port, but the basement, plumbing, and wiring are in the transition to a DX12-only render path. That releases DX11 (like Rose released Jack), pushes the remainder to upgrade for free, and it's set in stone.
Edited by pengs - 7/15/16 at 7:57am