[WinBuzzer] Microsoft Simplifies Multi-GPU Support for DirectX 12 Developers - Page 7

post #61 of 90
Quote:
Originally Posted by Tsumi View Post

It's not that simple. Every frame is dynamic, and the amount of work varies from frame to frame. Traditional multi-GPU uses a technique called AFR, or alternate frame rendering, where the GPUs take turns rendering whole frames. It is the simplest method, but it comes with limitations. The first is that the software has to predict ahead what the frame will be; if it predicts wrong, the frame has to be redone. That is not too much of a problem with a single GPU, which only has to predict one frame ahead, but it is a much bigger problem when it has to predict two or three frames ahead. And again, each frame has a varying amount of work. This is already hard enough to synchronize with GPUs of exactly the same power; now imagine the nightmare with GPUs of different power.

The alternative method is SFR, or split frame rendering. Elements in the frame are split up among the GPUs, rendered separately, and then stitched together. This method does not have the drawback of needing to predict ahead, but it requires splitting the elements intelligently, which is not easily accomplished. Remember that assets need to be loaded into a GPU's VRAM before it can render them, and if a GPU suddenly needs a new set of assets, its processing slows down because it has to wait for data from the CPU. The software has to make smart guesses about which elements to assign to each GPU. Remember that in 3D rendering, each object on screen is its own element, rendered from GPU VRAM as the situation requires.

Multi-GPU is an extremely complicated thing to do right, and with a relative minority of gamers actually caring about multi-GPU support, most developers will settle for good enough.
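To put rough numbers on the AFR/SFR trade-off described above, here is a minimal, self-contained C++ toy model (no real graphics API calls; every cost figure is invented purely for illustration) showing how AFR pacing swings between mismatched cards while SFR depends on how intelligently the frame is split:

```cpp
// Toy comparison of AFR and SFR pacing with two mismatched GPUs.
// Purely illustrative: no Direct3D involved, all numbers are invented.
#include <algorithm>
#include <cstdio>

int main() {
    const double workloadMs = 12.0;  // per-frame work, measured on the fast card
    const double fastSpeed  = 1.0;   // relative throughput of GPU 0
    const double slowSpeed  = 0.6;   // relative throughput of GPU 1
    const double stitchMs   = 2.0;   // cost of compositing the SFR halves

    // AFR: whole frames alternate between the cards, so frame time swings
    // between the two GPUs' render times -- the classic pacing problem.
    printf("AFR frame times:\n");
    for (int f = 0; f < 6; ++f) {
        const double speed = (f % 2 == 0) ? fastSpeed : slowSpeed;
        printf("  frame %d: %.1f ms\n", f, workloadMs / speed);
    }

    // SFR, naive 50/50 split: the slower card finishes last and gates the frame.
    const double naive = std::max(0.5 * workloadMs / fastSpeed,
                                  0.5 * workloadMs / slowSpeed) + stitchMs;

    // SFR, split in proportion to each card's speed: both finish together.
    const double balanced = workloadMs / (fastSpeed + slowSpeed) + stitchMs;

    printf("SFR 50/50 split:    %.1f ms per frame\n", naive);
    printf("SFR balanced split: %.1f ms per frame\n", balanced);
    return 0;
}
```

The point is not the exact figures, but that both schemes fall apart when the frame queue or the split is sized wrongly, which is exactly the scheduling work the quote says most developers will not bother doing.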

If async is supported, why wouldn't another card be beneficial? I know communication between the two or more cards needs to happen, but wouldn't it work much the same way?

Edit: I have used RadeonPro and Inspector, so I do know what these things are.

Maybe it's time for a new slot standard.
Edited by pas008 - 7/11/16 at 9:28am
post #62 of 90
Quote:
Originally Posted by bigjdubb View Post

I think the APU/iGPU usage would be a big hit among laptop and budget gamers. It could also translate over into the console world with APUs and dGPUs in the same system.


Now that I think about it, the upcoming Xbox refresh may be the real reason behind this. Maybe the new Xbox is going to have a dGPU and APU.

I think the situation with the consoles is the real reason why we'll see multi-GPU support. Let's be honest: despite what Microsoft keeps saying, they really don't care that much about the PC gaming market (although they have gotten much better about it recently), so this doesn't help the PC gaming market (directly, at least).

They have seen the writing on the wall, and they cannot afford to keep the same console hardware for seven years or so. Having a strong APU (like Scorpio) that can pair up with, say, a laptop-grade GPU (I'm personally thinking of some sort of expansion slot system where you just plug it in, kind of like a Nintendo cartridge :D) lets them quickly, easily, and cheaply (on their end) upgrade their console if the software supports it.

They know VR is coming, and even Scorpio would be hard pressed to run VR without making people sick.

While we can still push more out of a single GPU, if we want the big gains needed for 4K and VR, parallelism is the only real way forward, at least until we move off silicon.
post #63 of 90
very simple explanation at 3:22
post #64 of 90
Quote:
Originally Posted by Liranan View Post

Vulkan and DX12 actually no longer need drivers to work with games. Developers now have full control over how their games communicate with hardware, thus removing the middle men AMD and Nvidia.

It also means we don't need Nvidia bribing developers into using code that locks AMD or even Intel out, which they most likely will keep trying, considering their hardware doesn't support any DX12 features yet (we're still one to two years away from Nvidia finally implementing async).

I think the first guy meant Nvidia not providing SLI on the GTX 1060 (the connectors). I assume those would still be needed unless they go the AMD route?
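As an aside on the SLI-connector question: D3D12 exposes bridge-linked cards as a single device with multiple "nodes", and each node is addressed with a bit in a node mask. The fragment below is only an illustrative sketch of that mechanism (it assumes a `device` created elsewhere and omits all error handling), not a statement of what Nvidia or AMD drivers actually require:

```cpp
// Sketch: addressing the GPUs of a linked (SLI/CrossFire-style) adapter
// through D3D12 node masks. Assumes `device` was created elsewhere;
// error handling is omitted for brevity.
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12CommandQueue>> MakePerNodeQueues(ID3D12Device* device)
{
    std::vector<ComPtr<ID3D12CommandQueue>> queues;
    const UINT nodeCount = device->GetNodeCount();   // 1 unless the cards are linked

    for (UINT node = 0; node < nodeCount; ++node) {
        D3D12_COMMAND_QUEUE_DESC desc = {};
        desc.Type     = D3D12_COMMAND_LIST_TYPE_DIRECT;
        desc.NodeMask = 1u << node;                   // bit N selects physical GPU N

        ComPtr<ID3D12CommandQueue> queue;
        device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
        queues.push_back(queue);
    }
    return queues;
}
```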
post #65 of 90
Quote:
Originally Posted by pas008 View Post

If async is supported, why wouldn't another card be beneficial? I know communication between the two or more cards needs to happen, but wouldn't it work much the same way?

Edit: I have used RadeonPro and Inspector, so I do know what these things are.

Maybe it's time for a new slot standard.

What does asynchronous compute have to do with multi-GPU rendering?
post #66 of 90
Quote:
Originally Posted by Tsumi View Post

What does asynchronous compute have to do with multi-GPU rendering?

Simultaneous graphics and compute workloads, which could be distributed to other cards.

Doesn't that kind of go hand in hand with multi-GPU adapter support?

Just trying to get a better understanding, that's all.
Edited by pas008 - 7/12/16 at 6:50am
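For the async compute side of the question: in D3D12, "async compute" just means submitting work to a compute-type queue that can run alongside the graphics queue. A minimal sketch of creating both queue types is below (it assumes an existing `device` and omits error handling); whether the hardware actually overlaps the two queues, or whether an engine hands the compute work to a second adapter instead, is a separate decision on top of this:

```cpp
// Sketch: a graphics (direct) queue plus an async compute queue on one device.
// Assumes `device` already exists; error handling is omitted.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void MakeQueues(ID3D12Device* device,
                ComPtr<ID3D12CommandQueue>& graphicsQueue,
                ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC gfx = {};
    gfx.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;    // graphics + compute + copy work
    device->CreateCommandQueue(&gfx, IID_PPV_ARGS(&graphicsQueue));

    D3D12_COMMAND_QUEUE_DESC comp = {};
    comp.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute + copy work only
    device->CreateCommandQueue(&comp, IID_PPV_ARGS(&computeQueue));

    // Work submitted to the two queues may overlap on hardware that supports it;
    // ordering between them is handled explicitly with fences.
}
```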
post #67 of 90
Quote:
Originally Posted by PostalTwinkie View Post

Nvidia absolutely supports DX12, and you damn well know it. Further, async compute has yet to prove to be anything significant, which is presumably also why supporting it isn't a requirement for DX12 support.

Stop knowingly spreading FUD.

I'm still waiting for someone to convince me why I should care what Ashes of the Singularity does or does not do. It appears to be a mediocre RTS with interesting technical capabilities that don't translate into good, fun gameplay.

So why does it matter, again, which does what better how?
post #68 of 90
Quote:
Originally Posted by Mand12 View Post

I'm still waiting for someone to convince me why I should care what Ashes of the Singularity does or does not do. It appears to be a mediocre RTS with interesting technical capabilities that don't translate into good, fun gameplay.

So why does it matter, again, which does what better how?
Honestly, I don't think I've ever seen anyone play AotS, only bench it.

To me it's some weird game that nobody plays, but everyone mentions it because it makes AMD look better in certain aspects.
post #69 of 90
Quote:
Originally Posted by davidelite10 View Post

Honestly, I don't think I've ever seen anyone play AotS, only bench it.

To me it's some weird game that nobody plays, but everyone mentions it because it makes AMD look better in certain aspects.

I've never seen anyone play Overwatch, should I assume that everyone bought that game just to see how it performs? Come to think of it, I haven't seen anyone play most of the games available.
post #70 of 90
Quote:
Originally Posted by Mand12 View Post

I'm still waiting for someone to convince me...

Why? Is it important you're convinced? Do you want to be convinced?

http://www.overclock.net/t/1605554/bethesda-net-doom-now-updated-with-vulkan/
http://www.overclock.net/t/1604750/dsog-report-total-war-warhammer-runs-27-slower-in-dx12-on-nvidia-s-hardware/0_100

AotS is not a fluke, that's why. It's the first of a generation of games based on one of two new APIs that focus on a different method of accessing GPU resources, and that alone makes it important and interesting.

This new generation of games should no longer be artificially constrained by software API limits. They are limited only by hardware architectures and features, which is how it should be. No consumer wins with a lopsided duopoly. Competitive performance between vendors is something we should all be encouraging.
Edited by infranoia - 7/12/16 at 7:30am
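On the "different method of accessing GPU resources" point: under D3D12 the application itself enumerates every adapter in the system and can create an independent device on each one, which is the unlinked explicit multi-adapter mode this article is about. A minimal sketch (error handling trimmed) of that enumeration:

```cpp
// Sketch: enumerating every GPU and creating a D3D12 device on each one,
// the starting point for unlinked explicit multi-adapter. Error handling trimmed.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>
using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateAllDevices()
{
    std::vector<ComPtr<ID3D12Device>> devices;

    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue;                                 // skip the software (WARP) adapter

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);                // iGPU, dGPU, anything capable
    }
    return devices;
}
```

How the engine then divides work between those devices (AFR, SFR, post-processing on the iGPU, and so on) is entirely its own decision, which is both the freedom and the burden this thread is arguing about.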