Originally Posted by blitz6804
I believe the dual-PCIe slot boards did ship with SLI bridges until nVidia complained to DFI. DFI stopped shipping bridges and did a better job of gimping the chipset.
Good to know. I was wondering why it had one if it indeed is not
an SLI board. After looking at that article some, I just had a
moment: I forgot to pay attention to what the NB label said. I believe it just said NF4 Ultra.
I don't have time right now to read the whole thing, but did it say whether, once you enable SLI, the slots run at x8/x8 or stay at x16/x2?
Originally Posted by thlnk3r
Blitz, which boards were non-sli and only dual-pcie? I remember you guys talking about one DFI board that was capable of SLI.
Looking through the manual, it appears DFI made three boards with the NF4 Ultra chipset, at least that I can find so far. There was also one that used neither the Ultra nor the SLI chipset.
Some of the boards that are SLI-capable but not officially SLI-equipped are the Ultra-D, Ultra-DR, and Ultra DxG. The DxG stands for Dual eXpress Graphics; supposedly you could run four monitors at once with it. The DxG is the one I got off of fleabay that hit that astounding 400MHz reference clock.
Originally Posted by N2Gaming
That was a good read. I wish I knew about that when I purchased my NIB NF4 SLI DR.
I could have saved almost $100.
I thought you had an Expert board? The Experts are pretty different from the Ultra-DR/SLI-DR: they had a lot more features, not to mention the Sil3114 controller with an extra four SATA ports, a debug LED, and such.
Originally Posted by txtmstrjoe
A comment about the DFI LANParty UT nF4 Ultras and SLI-modding: This technique of getting SLI on the cheap was only really applicable to the very early revisions of the nF4 Ultras.
Unsurprisingly, nVidia was terribly unhappy with this mod (since it would obviously eat into profits from "legit" SLI chipsets/motherboards), so they demanded that DFI re-engineer their nF4 Ultras to make the SLI mod subsequently impossible. DFI complied, of course. Moreover, nVidia also re-engineered their driver packages to make future SLI-mods all but impossible to implement on boards with the nF4 Ultra chipsets.
(Of course, as far as I know only DFI ever manufactured an nF4 Ultra motherboard with dual PCI-E graphics slots.
This lends some credence to thlnk3r's theory that the nF4 Ultra and the SLI-capable chipsets might all be based on one "super-chipset," but with certain options deleted/deactivated for use on lower-tier products.)
I think they might've given up on the driver aspect of it. IIRC, even with my A8N32, they all used the same 15.23 nForce driver.
And as far as I know, all DFI did was put epoxy over the pins that you join together, which I actually find kind of funny, because that just makes it easier to find which ones you need to join. I've found a couple of articles that show the writer removing the epoxy with a scalpel, X-Acto knife, or some other fine, sharp instrument.
I think thlnk3r is right on the money. It's cheaper overall to make one chip and just cut certain connecting bridges to deactivate features, rather than make several dies. It's the same way with automotive stuff: I spoke with an engineer from Ford and asked him why Ford doesn't change a certain little part, like the angle of a bend in a metal line. He told me that changing the bend by even one degree would cost millions of dollars. That's why you'll see the same engine in a lot of different cars, or other manufacturers' engines in our domestics, like the Mercury Villager using a Nissan 3.0 motor.

Edited by BlackOmega - 4/7/09 at 10:17am