No SLI on DX12 unless the dev supports it, no matter the video card...
Could probably work on DX11 games with older drivers.
I think you misunderstood me a bit on this.

I am asking specifically whether the RTX 3000 series allows any SLI at the driver level in any DX11-and-older games. Apparently the poster I quoted above implies this is not the case. I know for a fact it's true with AMD, but I was not sure if Nvidia did the exact same thing.

In other words:

AMD RDNA/RDNA2 - NO Crossfire at the driver level in any game, except DX12/Vulkan titles where the developer has baked it in. DX11 and older games will not work at all, regardless of whether they had CF support, due to driver-level changes made at the launch of the RDNA/RDNA2 generation.

Nvidia????
 
Discussion starter · #42 ·
I think this answers your question:-
 
To me SLI seemed like more of a way to upgrade in pieces: buy one reasonable card and then get a second one when you need more performance, instead of a new top-end card. But Nvidia didn't want to give people a reason to stick with older hardware for longer, so they made it so only high-end cards support SLI now. If you actually need more performance than a single card will give you, sure, it makes sense, but beyond that a single card is always preferable.

I am a bit disappointed that DX12 SLI never took off, since that would actually allow for doubling of memory instead of just mirroring. If it had taken off, older 4GB GPUs could be killing it in SLI today.
To be fair, SLI support today is abysmal and has too many issues. Even with 2x 1080s, microstutter was a pain. The way to do it is the way Hitman did it with DX12 compute, but no devs want to do it. It's extra work for what is now a very narrow niche.
 
Gotta think about it like this: SLI 3090s at MSRP is what, $3,500-4,000... Then you're gonna want a board that does x16 on both slots, so X299 or Threadripper, $300-600+ boards, then CPUs at $600-1k. It's a very slim market up there.
 
We are old enough to remember when the extra PCIe lanes didn't cost two arms, legs, eyes, testicles, and a kidney and a half.....

So dumb that it's still treated as such a premium, and it has almost no use in the mainstream anymore anyways...


I think this answers your question:-
You know it's funny, most of those games aren't ones you need two GPUs for, or would bother doing it with. I can see something like a flight sim/racing game, maybe some games that are open world or that have long-term playability like some MMOs...

At any rate, it's mostly dead, and I blame them for advertising the hell out of it for a long time while giving zero cares. I've got boxes and boxes of video cards/mobos all displaying the SLI and CF marketing logos..

It sure seems like while DX12 was coming out they advertised and talked about multi-GPU like it was a big deal. Now we see it was fluff talk and they are ready to retire it.
 
Like an idiot, I bought a 3080 Ti. If I want to SLI, I will have to remortgage my house.
I just dropped my ASUS Gaming 11G OC SLI setup for an ASUS TUF 3080 Ti. You are right, it was not cheap, but I got it much cheaper than what it had been going for IF you could even find one.. I find it a nice step up over my SLI setup. I did all my gaming at 4K 60Hz, so I got a new monitor at the same time as the GPU; now I game at 4K 144Hz... Doubt I will see myself going back to SLI, but I rode that horse for 10 years across 4 builds.
 
Discussion starter · #47 ·
To me SLI seemed like more of a way to upgrade in pieces: buy one reasonable card and then get a second one when you need more performance, instead of a new top-end card. But Nvidia didn't want to give people a reason to stick with older hardware for longer, so they made it so only high-end cards support SLI now. If you actually need more performance than a single card will give you, sure, it makes sense, but beyond that a single card is always preferable.

I am a bit disappointed that DX12 SLI never took off, since that would actually allow for doubling of memory instead of just mirroring. If it had taken off, older 4GB GPUs could be killing it in SLI today.
As I have been using a 32-inch 4K display since January 2014, SLI was a necessity in any game.
 
After some research about DX12 multi-GPU, I hate to tell you all this, but implementing it is far harder than just having the drivers do it like they did for DX11, since it moves the work to the developers. The problem here is that not only does the game/game engine have to support it, but there are about four different ways to support multi-GPU, making it even more complex than initially thought. DirectX - Wikipedia

DirectX 12
See also: Direct3D 12
DirectX 12 was announced by Microsoft at GDC on March 20, 2014, and was officially launched alongside Windows 10 on July 29, 2015.

The primary feature highlight for the new release of DirectX was the introduction of advanced low-level programming APIs for Direct3D 12 which can reduce driver overhead. Developers are now able to implement their own command lists and buffers to the GPU, allowing for more efficient resource utilization through parallel computation. Lead developer Max McMullen stated that the main goal of Direct3D 12 is to achieve "console-level efficiency on phone, tablet and PC".[42] The release of Direct3D 12 comes alongside other initiatives for low-overhead graphics APIs including AMD's Mantle for AMD graphics cards, Apple's Metal for iOS and macOS and Khronos Group's cross-platform Vulkan.
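
To make the "command lists and buffers" part concrete, here is a minimal, illustrative D3D12 sketch (my own, not from the article or this thread) of an app recording and submitting its own command list, which is exactly the work the DX11 driver used to hide. It assumes the standard Windows SDK headers and a device created elsewhere; error handling and actual draw calls are omitted, and the function name is made up for the example.

Code:
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Record and submit one (empty) command list on a direct/graphics queue.
void SubmitEmptyFrame(ID3D12Device* device)
{
    ComPtr<ID3D12CommandQueue> queue;
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};   // zero-initialized = direct (graphics) queue
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<ID3D12CommandAllocator> allocator;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT, IID_PPV_ARGS(&allocator));

    ComPtr<ID3D12GraphicsCommandList> list;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              allocator.Get(), nullptr, IID_PPV_ARGS(&list));

    // ...record draw/copy/compute commands here; the application, not the driver,
    // decides how this work is split across CPU threads (and, with multi-adapter,
    // across GPU nodes via the node mask in the first argument above).
    list->Close();

    ID3D12CommandList* lists[] = { list.Get() };
    queue->ExecuteCommandLists(1, lists);
}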

Multiadapter support will feature in DirectX 12 allowing developers to utilize multiple GPUs on a system simultaneously; multi-GPU support was previously dependent on vendor implementations such as AMD CrossFireX or NVIDIA SLI.[43][44][45][46]

Implicit Multiadapter support will work in a similar manner to previous versions of DirectX where frames are rendered alternately across linked GPUs of similar compute-power.

Explicit Multiadapter will provide two distinct API patterns to developers. Linked GPUs will allow DirectX to view graphics cards in SLI or CrossFireX as a single GPU and use the combined resources; whereas Unlinked GPUs will allow GPUs from different vendors to be utilized by DirectX, such as supplementing the dedicated GPU with the integrated GPU on the CPU, or combining AMD and NVIDIA cards. However, elaborate mixed multi-GPU setups require significantly more attentive developer support.
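
Since the linked vs. unlinked distinction is the part people trip over, here is a small illustrative sketch (again my own, not from the article) of how a DX12 app actually sees the two cases: unlinked GPUs each show up as a separate DXGI adapter, while a linked SLI/CrossFire pair shows up as one adapter whose D3D12 device reports more than one node. It assumes the standard Windows SDK headers and linking against d3d12.lib and dxgi.lib; error handling is omitted.

Code:
#include <dxgi1_6.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory6> factory;
    CreateDXGIFactory2(0, IID_PPV_ARGS(&factory));

    // Unlinked (heterogeneous) mode: every physical GPU appears as its own adapter,
    // and the app has to juggle separate ID3D12Device objects itself.
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        wprintf(L"Adapter %u: %s\n", i, desc.Description);

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            // Linked mode: an SLI/CrossFire pair exposed by the driver appears as ONE
            // adapter, and GetNodeCount() > 1 tells the app it can address both GPUs.
            wprintf(L"  D3D12 nodes on this adapter: %u\n", device->GetNodeCount());
        }
    }
    return 0;
}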
DirectX 12 is supported on all Fermi and later Nvidia GPUs, on AMD's GCN-based chips and on Intel's Haswell and later processors' graphics units.[47]

At SIGGRAPH 2014, Intel released a demo showing a computer-generated asteroid field, in which DirectX 12 was claimed to be 50–70% more efficient than DirectX 11 in rendering speed and CPU power consumption.[48][49]

Ashes of the Singularity was the first publicly available game to utilize DirectX 12. Testing by Ars Technica in August 2015 revealed slight performance regressions in DirectX 12 over DirectX 11 mode for the Nvidia GeForce 980 Ti, whereas the AMD Radeon R9 290x achieved consistent performance improvements of up to 70% under DirectX 12, and in some scenarios the AMD outperformed the more powerful Nvidia under DirectX 12. The performance discrepancies may be due to poor Nvidia driver optimizations for DirectX 12, or even hardware limitations of the card which was optimized for DirectX 11 serial execution; however, the exact cause remains unclear.[50]

The performance improvements of DirectX 12 on the Xbox are not as substantial as on the PC.[51]

In March 2018, DirectX Raytracing (DXR) was announced, capable of real-time ray-tracing on supported hardware,[52] and the DXR API was added in the Windows 10 October 2018 update.

In 2019 Microsoft announced the arrival of DirectX 12 to Windows 7 but only as a plug-in for certain game titles.[53]
The wiki lists the two versions of DX12 mGPU, Implicit and Explicit, but what it doesn't tell you is that some of those games still depend on the drivers exposing the multi-GPU flags, so they effectively still rely on Crossfire or SLI in DX12. If the DX11-era driver flags aren't in place for DX12, it doesn't work at all. Nvidia just doesn't carry those DX11 flags over in its drivers for DX12 games. AMD has been carrying the older DX11 multi-GPU driver flags over for DX12 games, so don't be surprised if you see people running dual RX 6900 XTs in Crossfire and it's working in DX12, while the same thing isn't working in SLI.

The last problem is that there is now no way to tell which game/game engine still needs the old DX11 driver flags while being run on DX12.

Anyone seen the two RX 6800 XTs in Crossfire/mGPU? AMD Radeon RX 6800 XT in mGPU: 2 x Big Navi GPUs = Insane Performance | TweakTown

Are there any 4K screens that go past 165Hz yet? I'm thinking some of these high-FPS games are a waste because I don't see any 240Hz 4K screens anywhere.
 
I am having a similar conversation in another thread, which might be piquing my interest in Crossfire (mGPU) in the Radeon driver software. If AMD is in fact adding mGPU for DX11 games/benchmarks, then this is new to me. I tried it a while back and it only worked for DX12.

I am going to investigate this now because I'd like to do some experiments, and before it was impossible to get anything more than a few things to work.

This also begs the question of why AMD would bother to do this. I personally don't think they are wasting time; I think it may be more of a bug issue. Either way, this has piqued my interest again.

I will be doing some testing.
 
AMD might require Crossfire/mGPU for the MCM cards. I am not sure why this would not be done at a hardware/firmware level with MCM, but who knows, maybe they are testing with the drivers now to see what it needs. Perhaps to get the team up to speed on multi-GPU before writing firmware.
 
I still run my two 1080 Ti cards in SLI and still get 2x perf in FF14; never even bothered to put my 3090 in here because I'm lazy lol. Bungie killed SLI when Beyond Light came out, so perf took a nosedive there, but performance sucks in that game for everything now, so not a huge loss I guess.
 
AMD might require Crossfire/mGPU for the MCM cards. I am not sure why this would not be done at a hardware/firmware level with MCM, but who knows, maybe they are testing with the drivers now to see what it needs. Perhaps to get the team up to speed on multi-GPU before writing firmware.
In AMD's case, if they are in fact adding CF/mGPU into drivers where it wasn't there before on RDNA2, then this could be a good reason why it's happening now. Otherwise it doesn't make any sense.

I am installing waterblocks on my 6900 XT cards to put them into one system. I have a feeling I will regret doing this, as they were in separate systems before. After testing I'll put them back in each system, but at least they will be under water now. Just have to get another pump before then.
 
I'm just wondering what games you'll be testing?
I would check this list for compatibility: List of games with DirectX 12 support - EverybodyWiki Bios & Wiki
 
First thing I am testing is Fire Strike; it appears to be working according to the HOF results. That's DX11, and it was not working for me a while ago when I previously tested mGPU on RDNA2.

See this thread: Evga X570 Dark Mod to Engage Crossfire
I'm not aware that any board now "needs to support" it; the only limitations should be the cards themselves, the games, and in some cases the board itself when it can only be set up as 16x/4x. A lot of times older reviews say a board supports an 8x/8x setup. For example, PC Gamer's review of my MSI MPG Gaming Edge WiFi says it supports 8x/8x, but you can download the PDF and see it only supports 16x/4x.
In fact, I'm reading the EVGA PDF and it says it supports both SLI and AMD Multi-GPU for DX12.
 
Huh, should have checked that first; both the FTW and the DARK say they support multi-GPU.

 
My apologies if I wasn't clear on what I am looking to test.

I am not worried about motherboard support, but about the fact that something like Fire Strike, which is DX11, never worked with mGPU on AMD (RDNA2) before. I am looking to see how a DX11 app is using mGPU when mGPU on AMD RDNA2 is (supposed to be) limited to DX12.

I am not worried about motherboards or PCIe speeds, but about how or why a DX11 program is "using" mGPU when previously it was not able to.
 
I get the question you're asking, and I'm looking around, but besides the two threads you're in and posts from when RDNA 2 launched, it looks like no one has tried it or is even talking about it.
All I keep getting for the most part is that 6900 XT that was OCed to 3.3GHz to take the top spot.
Nothing about any mGPUing, but then again I'm at work doing this in my free time. Gonna try to look and see if anyone has tried it abroad.
 
Nvidia did require a physical chip (or maybe it was firmware only) that motherboards had to include for SLI to work, and they had to pay for it. Crossfire always worked AFAIK provided you had two x16 slots; they also might have required some level of direct CPU access (PCIe lanes) for Crossfire too, although I'm not certain on that.
 