Overclock.net › Forums › Industry News › Software News › [AnandTech] GeForce+Radeon: Previewing DirectX 12 Multi Adapter with Ashes of the Singularity

[AnandTech] GeForce+Radeon: Previewing DirectX 12 Multi Adapter with Ashes of the Singularity - Page 15

post #141 of 326
Quote:
Originally Posted by xxdarkreap3rxx View Post

Quote:
Originally Posted by rickcooperjr View Post

Don't worry I messaged Kollock with direct links to my posts and told him what I have heard and why this is my understanding of the situation.

Hopefully he can chime in, as I just don't see how it wouldn't require an AMD or NVIDIA proprietary driver. They are the ones who make the hardware and would know how to utilize it to its fullest potential, so I just don't see Microsoft putting in the resources themselves (which would most likely result in sub-optimal gaming performance from the video cards). It's the same as with most devices: Microsoft provides standard drivers which are stripped of features and require the manufacturer's drivers for the best compatibility/performance.
I just noticed he already answered it in general http://www.overclock.net/t/1578269/anandtech-geforce-radeon-previewing-directx-12-multi-adapter-with-ashes-of-the-singularity/50#post_24544014
Quote:
Originally Posted by Kollock View Post

Quote:
Originally Posted by 47 Knucklehead View Post

So what are the numbers for a Fury X + 980Ti as compared to Fury X + Fury X or 980Ti + 980Ti?



How will Crossfire/SLI work? Will it only be as good as the worst driver? What if one card has a profile for game X, but the other card doesn't? How will Crossfire perform in non-fullscreen mode? (Crossfire doesn't work there, but SLI does a little, so what will happen?)

Profiles are irrelevant in explicit multi-GPU. In this mode, the driver has no explicit knowledge that it is being used in conjunction with another GPU. Synchronization, frame pacing, and latency are all handled by the application. The primary advantage that SLI and CrossFire give you is the possibility of sharing resources between the adapters in a more efficient manner; however, that would manifest itself through linked adapter mode. For the multi-adapter approach, the primary thing you need is the ability to use cross-adapter shared resources, with some possible performance benefits if the standard swizzle format is supported. These are just features of D3D12.

Profiles are a very clunky kludge that was necessary in DX11; in DX12 explicit multi-GPU, I do not see how a profile would have any meaningful advantage.
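Kollock's point above — that under explicit multi-GPU the *application*, not a driver profile, owns frame assignment, pacing, and synchronization — can be sketched conceptually. This is a toy alternate-frame-rendering scheduler, not real D3D12 code; the adapter names and render times are hypothetical, and the per-adapter "fence" is just a float standing in for a GPU fence value.

```python
class Adapter:
    """Toy stand-in for a GPU adapter; render_ms is a hypothetical frame time."""
    def __init__(self, name, render_ms):
        self.name = name
        self.render_ms = render_ms

def afr_schedule(adapters, num_frames):
    """Alternate-frame rendering done by the app: frame i goes to adapter
    i % N, and the app tracks each adapter's completion fence itself —
    no driver profile is involved anywhere."""
    timeline = []
    busy_until = [0.0] * len(adapters)  # per-adapter fence value, in ms
    for frame in range(num_frames):
        i = frame % len(adapters)
        finish = busy_until[i] + adapters[i].render_ms
        busy_until[i] = finish
        timeline.append((frame, adapters[i].name, finish))
    # The app presents in frame order, regardless of which GPU finished first:
    return sorted(timeline)

gpus = [Adapter("GeForce", 16.0), Adapter("Radeon", 20.0)]
for frame, gpu, done in afr_schedule(gpus, 4):
    print(f"frame {frame} on {gpu}, done at {done:.0f} ms")
```

Note that because the two hypothetical GPUs run at different speeds, the app would also have to pace presents to avoid microstutter — exactly the responsibility that shifts from driver to developer in explicit mode.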
post #142 of 326
Quote:
Originally Posted by ku4eto View Post

OK, I lol'd. Bad CrossFire drivers are the same as bad SLI drivers; it's present on both sides. AMD are just slower at releasing those profiles. Microstuttering — yeah, stuff happens sometimes. Dunno what you mean by open features that only work on their card; is there even one like that? You have GameWorks, which favours NVIDIA, and TXAA, which runs fine on both. Or do you mean PhysX by NVIDIA, which requires NVIDIA hardware to run? And I believe they did not bash the Oxide devs in any way, unlike the lies NVIDIA told about a "bug". Does AMD demand some tech and not pay for it? Is that supposed to be Adaptive-Sync/FreeSync or HBM? And is HBM not new visual tech?

You seem to have been living under a rock (both from not knowing what has been going on, and from an inability to write in a way that is remotely understandable).

AMD had huge stuttering issues until FCAT frame-time testing forced them to fix it. AMD denied it, said it was unfair, said it was not real. Yet they had to bend over and...

AMD put out TressFX as open source (it came out with Tomb Raider). It was "open source", yet it favoured AMD, and NVIDIA only got the source after the game was out, not while it was being developed. Suddenly there was TressFX, it was open, and "shame on NVIDIA for not supporting something no one told them existed or was being developed or opened". A double standard, as usual.

AMD keeps saying that GameWorks is bad. When games supporting it came out, AMD demanded that NVIDIA give them the source code so they could optimise, or else their customers would always have performance issues. That of course was a blatant lie, as a week later AMD released a driver update which fixed everything. No source was provided.

PhysX today is open source. AMD keeps playing the "PhysX is CPU-only, PhysX is bad" card, yet they could run the heavy CPU options on their GPUs through their drivers. A couple of guys showed how changes to the open-source code can run PhysX on AMD. But AMD will never do that, since it would reduce their source of things to rant about.

GameWorks belongs to NVIDIA. Of course it will run better on NVIDIA, because it is their tech. AMD does not support GameWorks, and you can disable it. You can also choose MSAA instead of TXAA. These features require specific hardware because that is how the hardware works. You don't expect Intel to write an API for AMD's APUs, do you? That would be... silly at best.

NVIDIA never "bashed" Oxide. They only stated that there were bugs in the AA API calls, and that Oxide was using async compute, which NVIDIA did not yet support. Behold: when Oxide put out a new version and NVIDIA put out new drivers, guess who was running better? Oxide were working with NVIDIA to fix those things (despite AMD's claims that NVIDIA aren't going to have async).

Also, I remind you again that AMD demanded the GameWorks source code. They did not ask; they demanded that NVIDIA hand over HairWorks, WaterWorks and everything else, for free, ASAP, so they could run that tech. NVIDIA said no. The same way NVIDIA has had to optimise games without the source code, AMD can do so as well.

AMD stated that HBM was going to win over NVIDIA, that it was their "secret weapon" with much higher memory bandwidth and faster memory. It did not work out for them. They leaked that they were going to get the first HBM2 batch; then it was found out that they had problems with it.

Also, FreeSync is AMD's answer to G-Sync. AMD stated that there would be tons of monitors ready by 2015, and said outright that everyone considering G-Sync should wait for Adaptive-Sync because it was going to "win". Eventually there were maybe three by the end of Q2 2015, and it was very limited compared to G-Sync performance overall.

You seem to have zero understanding that AMD's approach is to actively bash NVIDIA and their features to make it look like NVIDIA are using their market position to damage AMD. Yet NVIDIA are just doing NVIDIA: all they are doing is expanding their library of tricks, while AMD are standing still.
post #143 of 326

He doesn't really say whether he's talking about NVIDIA/AMD proprietary drivers or one which Microsoft has created (I would think he means the former). If proprietary drivers are required, I could see NVIDIA's driver installation package refusing to install if it detects AMD hardware.
post #144 of 326
What this will eventually show is which development studios are "lazy" and which are willing to program the best game experience they can with the tools provided.

For all the people who default to "lazy devs", "unoptimized software", etc., this will either prove your point or break it.

Frankly, though, if what Rick says is true, that is also kind of unnerving. Microsoft has not exactly been the best thing for PC gaming. "Games for Windows", anyone?
post #145 of 326
Quote:
Originally Posted by xxdarkreap3rxx View Post


He doesn't really say whether he's talking about NVIDIA/AMD proprietary drivers or one which Microsoft has created (I would think he means the former). If proprietary drivers are required, I could see NVIDIA's driver installation package refusing to install if it detects AMD hardware.
That is very possible, but unlikely, because it would violate the agreement with Microsoft.
post #146 of 326
Quote:
Originally Posted by rickcooperjr View Post

That is very possible, but unlikely, because it would violate the agreement with Microsoft.

What is "very possible", and what agreement are you referring to?

Edit: If you're on a Windows machine with an NVIDIA card, could you run "tasklist > %userprofile%\Desktop\p.txt" at a command prompt and PM me the contents? I'm pretty sure NVIDIA has all sorts of driver-related executables running, which could be another way of disabling their driver if AMD hardware is detected.
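For anyone curious what to look for in that dump, here's a rough sketch of filtering a `tasklist`-style listing for vendor-named processes. The process names in the sample are illustrative placeholders, not an authoritative list of what NVIDIA or AMD actually run, and the prefix heuristic is deliberately crude.

```python
def vendor_processes(tasklist_lines, prefixes=("nv", "amd", "ati")):
    """Return image names from tasklist-style lines that start with a
    GPU-vendor prefix. Crude heuristic: real vendor services may use
    other names entirely."""
    hits = []
    for line in tasklist_lines:
        parts = line.split()
        if parts and any(parts[0].lower().startswith(p) for p in prefixes):
            hits.append(parts[0])
    return hits

# Sample lines as they might appear in p.txt (names are hypothetical):
sample = [
    "svchost.exe                    912 Services",
    "nvcontainer.exe               4120 Console",   # hypothetical NVIDIA service
    "atieclxx.exe                  2210 Services",  # hypothetical AMD service
    "explorer.exe                  3344 Console",
]
print(vendor_processes(sample))  # → ['nvcontainer.exe', 'atieclxx.exe']
```

On Windows itself, `tasklist | findstr /i "nv amd ati"` would get you roughly the same view without any scripting.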
post #147 of 326
Quote:
Originally Posted by DaaQ View Post

What this will eventually show, is which development studios are "lazy" or willing to program the best game experience they can with the tools provided.

For all the people that default to "lazy devs" "un optimized software" ect, this will either prove your point or break it.

Frankly though, if what Rick says is true, that is also kind of unnerving. Microsoft has not been absolutely the best thing for PC gaming. "Games for Windows" anyone?
True, but this time it is Windows 10 and DX12 for games. Keep in mind that games are going cross-platform across Xbox One / Microsoft devices and PC. Put the math together: the Xbox One will have Windows 10 and DX12, so anybody here can add 2+2. Just a bit of food for thought.

I also want to point out that with Windows 10 you can stream PC / Xbox One games across the network to a tablet or phone. A bit of forethought is all it takes to see what is going on here.
Edited by rickcooperjr - 10/27/15 at 7:59am
post #148 of 326
Quote:
Originally Posted by rickcooperjr View Post

True, but this time it is Windows 10 and DX12 for games. Keep in mind that games are going cross-platform across Xbox One / Microsoft devices and PC. Put the math together: the Xbox One will have Windows 10 and DX12, so anybody here can add 2+2. Just a bit of food for thought.

I also want to point out that with Windows 10 you can stream PC games across the network to a tablet or phone. A bit of forethought is all it takes to see what is going on here.

I certainly hope so, because DX10 didn't do us any favors. But DX12 being low-level gives me a small bit of hope.
post #149 of 326
Quote:
Originally Posted by Defoler View Post

[snip]

Where to start...

First, PhysX was only made open source at GDC 2015, and while the source is open to developers, you still need a CUDA-enabled device to compute on the GPU instead of the CPU. That's still proprietary.

As of September, there are more FreeSync monitors than G-Sync, and they are indeed cheaper. Open standards take longer to develop; not that hard to understand.

Currently, NVIDIA does not have working DX12 async shaders.
post #150 of 326
Quote:
Originally Posted by Defoler View Post

[snip]

Quote:
Originally Posted by Redwoodz View Post

Where to start...

First, PhysX was only made open source at GDC 2015, and while the source is open to developers, you still need a CUDA-enabled device to compute on the GPU instead of the CPU. That's still proprietary.

As of September, there are more FreeSync monitors than G-Sync, and they are indeed cheaper. Open standards take longer to develop; not that hard to understand.

Currently, NVIDIA does not have working DX12 async shaders.

Stop bringing the bickering into these threads. Nobody cares about your vendor preference, least of all pot shots at support for technologies that have barely left the ground. This thread is about multi-GPU functionality within DX12 multi-adapter.