Originally Posted by JackCY
While my monitor has DP, I prefer to use DVI, because DP signals loss of device when the monitor is powered off or disconnected, which forces Windows to change resolution or at least resize open windows. With DVI there is no such feedback, so you can plug and play freely without anything affecting the layout in Windows. DVI is still limited to lower resolutions and refresh rates, though. Any new monitor should have the latest DP, and if it's multimedia oriented, HDMI as well.
Beyond some glitches like this I don't really see the use for DVI anymore. DP is so much easier to use and offers higher specs.
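To put numbers on the "DVI is limited" point: a quick sketch of which modes fit within DVI's bandwidth, assuming the commonly cited TMDS limits of a 165 MHz pixel clock for single-link DVI and 330 MHz for dual-link, with a rough ~20% blanking overhead on top of the active pixels. These figures are assumptions for illustration, not exact CVT timing math.

```python
# Rough check of which display modes fit within DVI link limits.
# Assumed limits: single-link DVI ~165 MHz pixel clock, dual-link ~330 MHz.

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.20):
    """Approximate pixel clock in MHz, padding active pixels ~20% for blanking."""
    return width * height * refresh_hz * blanking_overhead / 1e6

SINGLE_LINK_MHZ = 165.0
DUAL_LINK_MHZ = 330.0

for w, h, hz in [(1920, 1080, 60), (1920, 1080, 144),
                 (2560, 1440, 60), (3840, 2160, 60)]:
    clk = pixel_clock_mhz(w, h, hz)
    if clk <= SINGLE_LINK_MHZ:
        link = "single-link DVI"
    elif clk <= DUAL_LINK_MHZ:
        link = "dual-link DVI"
    else:
        link = "beyond DVI (needs DP or HDMI 2.0)"
    print(f"{w}x{h}@{hz}Hz ~ {clk:.0f} MHz -> {link}")
```

By this estimate, 1080p144 and 4K60 both blow past even dual-link DVI, which is why high-refresh and 4K monitors went DP-only.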
There are some issues, yes, but they're very minimal. A driver or Windows update should hopefully fix it eventually. I haven't had a single issue with my LG on DP with FreeSync; even with the monitor going to standby I've had no problems. It's been fine for a couple of months.
Originally Posted by NuclearPeace
DVI being old only adds to the reason for it to be on pretty much every 480. Since it's an old standard, people who don't have the means to get a new monitor will probably only be able to use DVI. Frugal people are also the target market for the 480. Of all the places to make a stand against DVI, a budget card is probably one of the dumbest places to do it.
It puts a burden on the end user for no good reason. What does the card lose from having a DVI output, and what does it gain from having three DisplayPorts? Nobody is going to be doing a multi-monitor setup on a $200 budget card.
Most RX 480s should come with 1x DVI-I, 1x HDMI, and 1x DisplayPort. There. Everyone's happy.
I bet more people use multi-monitor setups than cheap Korean monitors or ancient monitors that only have DVI; hence why the extra DP ports are needed. Also, DVI-I is a joke, totally dead and has been for years. I think you mean DVI-D. The card's PCB is pinned out for DVI, so it could still be 3x DP, 1x HDMI, and 1x DVI, but then it would be a dual-slot card.
Originally Posted by rdr09
You like to OC, so it should be the 6+8, coupled with a GPU BIOS mod. Ugh.
True, but the 6-pin limit is just a spec; it can easily handle more wattage, no problem, especially with a quality PSU. How can mu1sums, or whatever his nickname is, trip his power supply when OCed, lol. Each card can easily reach 400 W and then some when it's rated for 375 W including the PCIe slot at full load (which it never actually draws, as far as I know).
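For reference, the spec-rated budgets being argued about can be sketched from the nominal PCIe CEM connector ratings (assumed here: slot = 75 W, 6-pin = 75 W, 8-pin = 150 W). Real connectors and quality PSUs can deliver well beyond these nominal numbers, which is the point above; this just shows where figures like 375 W come from.

```python
# Nominal PCIe power ratings (assumed from the PCIe CEM spec):
# slot = 75 W, 6-pin connector = 75 W, 8-pin connector = 150 W.
CONNECTOR_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_budget(connectors):
    """Sum the spec-rated power for the slot plus the listed aux connectors."""
    return CONNECTOR_WATTS["slot"] + sum(CONNECTOR_WATTS[c] for c in connectors)

print(board_power_budget(["6-pin"]))           # reference RX 480: slot + 6-pin = 150 W
print(board_power_budget(["6-pin", "8-pin"]))  # a 6+8 custom board = 300 W
print(board_power_budget(["8-pin", "8-pin"]))  # slot + 2x 8-pin = 375 W
```

Note that slot + 6-pin + 8-pin comes to 300 W by spec; 375 W is the slot plus two 8-pins.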