Overclock.net › Forums › Industry News › Software News › [Anand] Ashes of the Singularity Revisited: A Beta Look at DirectX 12 & Asynchronous Shading

[Anand] Ashes of the Singularity Revisited: A Beta Look at DirectX 12 & Asynchronous Shading - Page 17

post #161 of 1175
Quote:
Originally Posted by Defoler View Post

Did AMD provide Nvidia with the Mantle source code?
Did AMD provide the TressFX 1.0 and 2.0 source code?
Both of those were "open source, free of charge". The answer to those two questions is no.
And that answers your talk about blocked code versus open code. Also, TressFX only went open AFTER the last Tomb Raider came out. Why didn't AMD release it earlier to let Nvidia optimise for it, I wonder?

Nvidia-specific visuals can be turned off. New technology and visuals are sometimes too demanding. Can the 280X run 4K at ultra settings at 60 fps? Why not? Will a mid-range card in two years be able to do it? Most likely yes. C'est la vie. They also affect more: when running HairWorks, it affects all types of hair in the game, while TressFX only runs on a single character. Now do the opposite and guess what happens?

What performance boost with more visuals did we get? TressFX 3.0 running at max costs about 5-10 fps of performance. On Nvidia it costs a bit more.

Yes, it is hypocrisy. When Nvidia does it, it is so wrong and bad. When AMD does it, it is so good and revolutionary.
1. Yes, it's called DX12.
2. Yes, after the game launched.

With more performance you can implement more visuals.

It's not hypocrisy, because:
when Nvidia does it, they tie the hands of devs and give no source;
when AMD does it, Nvidia can do whatever they want, and developers are free to implement what Nvidia wants.

One game versus a lot of GimpWorks titles. Talk about hypocrisy here.

It's surprising how people always say they want AMD to do better, and here we are with closed eyes, not willing to see THEY ARE DOING BETTER!
post #162 of 1175
Quote:
Originally Posted by Charcharo View Post

For what it is worth...
People are forgetting that most AMD cards are competitive in DX11. The R9 380, 380X, 390, 390X and Fury non-X (and the Nano, I guess) do not trail their competition. Usually they win DX11 comparisons against their equivalent competitor. The Fury X was the problematic one (as well as, I guess, the 370).

People are acting as if those cards are behind their competitors when they are not. DX12 just adds a lot of performance on top. So does async.
And that matters.


Maybe with some driver improvements they have made, but the 3xx series is just a rebrand of the 2xx series, so it did take a long time for them to catch up.

And that's only FPS-wise, not power use or other variables like CF profiles (anyone who knows Inspector can easily make or refine SLI profiles), or other features like the best driver support of any company. Sure, you can't please everyone, but it's about more than just FPS numbers.

Anyway, it's good that they stepped up after all these years... that is, if this is not more hype and they don't cut a lot out of the game to make it more user/hardware friendly. The numbers are not really good at higher resolutions on anything right now. Not that this game needs 60+, but if you are on PC you do want something better than console looks.
post #163 of 1175
Quote:
Originally Posted by Defoler View Post



I'm still trying to figure out what the point is of buying an underpowered card to use now (in 2013), on the chance that maybe in 2016 it will run equal to its competitor's counterpart.
This shows completely flawed logic.

Actually, it makes sense.

Remember, not everyone upgrades as often as you guys (nor should they).
Besides, AMD's 2013 GPU lineup was AS good as Nvidia's in DX11 and DX10. Not worse, so you did not really sacrifice anything.
post #164 of 1175
Quote:
Originally Posted by Defoler View Post

Sorry, but "Nvidia's marketing schemes" are pretty simple. And TBH, I hadn't seen a single Nvidia "marketing scheme" for a very, very long time.

Selling a 3.5 GB card as 4 GB certainly is a scheme of some sort, and the millions of units sold mean that people are willing to accept the lie (sorry, "unfortunate mistake") because Nvidia is great, or something.
post #165 of 1175
Not to mention the DX12.1 marketing...
post #166 of 1175
Quote:
Originally Posted by p4inkill3r View Post

Selling a 3.5 GB card as 4 GB certainly is a scheme of some sort, and the millions of units sold mean that people are willing to accept the lie (sorry, "unfortunate mistake") because Nvidia is great, or something.

Yes, because at the time it was head and shoulders above the 290, or even the 290X.
Not only that, it seemed more AMD fans cared about it than even us 970 owners. Have you ever OC'd a 970? It totally rocks a 290, or a 390X for that matter.
Just like with a hot bimbo, sometimes you overlook some things. But it does have 4 GB, so by this time you should have it down pat: 3.5 + 0.5.
post #167 of 1175
Quote:
Originally Posted by cowie View Post

Maybe with some driver improvements they have made, but the 3xx series is just a rebrand of the 2xx series, so it did take a long time for them to catch up.

And that's only FPS-wise, not power use or other variables like CF profiles (anyone who knows Inspector can easily make or refine SLI profiles), or other features like the best driver support of any company. Sure, you can't please everyone, but it's about more than just FPS numbers.

Anyway, it's good that they stepped up after all these years... that is, if this is not more hype and they don't cut a lot out of the game to make it more user/hardware friendly. The numbers are not really good at higher resolutions on anything right now. Not that this game needs 60+, but if you are on PC you do want something better than console looks.

Refreshes. Not exactly rebrands.

Power usage is not that big a difference, especially since FRTC exists and idle power usage is very good, so long-term differences will be small.
I have no idea about CF; I've only ever used SLI. Though the extra VRAM and the XDMA engine do seem like they may give AMD an edge when it works. But I cannot tell you much here, as my personal knowledge of CF is not great. Only the VRAM thing is certain.

That VRAM also allows for extreme modding and great performance YEARS from now. So it is a long-term investment. I am a PC gamer; I mod my games.

I am a PC gamer because of modding, backwards compatibility and emulation. Also the cheaper long-term costs of being a PC gamer. Better visuals and performance (and the satisfaction of having fun with hardware) are JUST icing on the cake!
post #168 of 1175
Quote:
Originally Posted by Charcharo View Post

Refreshes. Not exactly rebrands.

Power usage is not that big a difference, especially since FRTC exists and idle power usage is very good, so long-term differences will be small.
I have no idea about CF; I've only ever used SLI. Though the extra VRAM and the XDMA engine do seem like they may give AMD an edge when it works. But I cannot tell you much here, as my personal knowledge of CF is not great. Only the VRAM thing is certain.

That VRAM also allows for extreme modding and great performance YEARS from now. So it is a long-term investment. I am a PC gamer; I mod my games.

I am a PC gamer because of modding, backwards compatibility and emulation. Also the cheaper long-term costs of being a PC gamer. Better visuals and performance (and the satisfaction of having fun with hardware) are JUST icing on the cake!


Sorry man, refresh.

But with no improvements in power use, most of the lower-power results came from just a fan.
The reference 290(X)s were just a big heat bomb, and a 390(X) with the same reference fan would be the exact same thing.
The same BS Nvidia pulled with the 9800-to-250 refresh.

So if you are a gamer, then you spend upwards of $300-500 a year on games, but you will not upgrade hardware for your hobby?

I can't follow you there, because games in the last decade have really gone down the tubes... Which devs/game makers do you buy from? I am 100% sure they have lied to you more than AMD/NV ever could.

I try everything, then complain or be happy. I am easy.
Edited by cowie - 2/25/16 at 6:23am
post #169 of 1175
Quote:
Originally Posted by Defoler View Post

Did AMD provide Nvidia with the Mantle source code?
Did AMD provide the TressFX 1.0 and 2.0 source code?
Both of those were "open source, free of charge". The answer to those two questions is no.
Actually the answer is:

Yes, yes and yes.
post #170 of 1175
Quote:
Originally Posted by airfathaaaaa View Post

I don't understand what you are saying.
It's crystal clear that Nvidia would never have used Mantle even if AMD had given them money... A company that loves closing down software and lying about its cards and their capabilities won't go open source so easily.
This is why Vulkan came in, and with it Nvidia's sudden love of OpenGL.

Nvidia has full access to Vulkan, which was built on Mantle's source code.
Khronos incorporated Mantle's best parts into Vulkan.