Originally Posted by Gary2015
Depends what games you play. I have the 4K one as well, but my daily driver is the X34. I'll see if it's overkill when I get my cards. I had two 1080s in SLI and I wasn't getting 100fps in ESO or modded GTAV. With BF1 coming out soon, I'd prefer a constant 100fps. And with the new Asus 4K 144Hz monitor out next year, it won't be overkill.
With DSR at max my 1080s only did 25fps on ESO
Hmm... yeah, I was thinking that perhaps I'd be better off with the X34 because of demanding games. I thought to myself: "yes, it's GENERALLY overkill, but in something like The Witcher 3 at max settings with HairWorks, or 4K Fallout 4 with max godrays and 100+ mods, I'd probably end up at ~80-85fps, so maybe it'd be worth getting the X34 anyway."
But then there's the X34P coming out SOMETIME this year (I hate Acer's vague estimates... "we think it might possibly sort of be kinda ready around Q4 of next year... I guess?" lmao). The X34P tempts me to hold off on a monitor for now: it will ship at 100Hz out of the box with overclocking still enabled, an Acer rep said a minimum of 120Hz should be possible, and since it uses DisplayPort 1.4, assuming the limitation is bandwidth and not the panel, I'm betting ~130-140Hz is more likely. Plus it has the joystick controls like the ASUS ROG monitors, a matte back finish, and a stand with swivel and the rest; and it uses one of LG's higher-quality "S-IPS" panels instead of the "AH-IPS" panels that have all the backlight bleed issues.
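For what it's worth, the "bandwidth, not panel" bet can be sanity-checked with a rough calculation. All numbers here are approximations: DP 1.4's HBR3 mode is 32.4 Gbps raw, roughly 25.92 Gbps effective after 8b/10b encoding, and the ~8% blanking overhead is a reduced-blanking-style assumption that varies by timing standard.

```python
# Rough DP 1.4 (HBR3) headroom check for a 3440x1440 ultrawide.
# 32.4 Gbps raw link rate -> ~25.92 Gbps effective after 8b/10b encoding.
EFFECTIVE_BPS = 25.92e9

def max_refresh_hz(h, v, bits_per_pixel, blanking=1.08):
    """Max refresh rate the link can carry.

    blanking=1.08 assumes ~8% overhead for reduced-blanking timings
    (CVT-RB-style; the real figure depends on the exact timing used).
    """
    bits_per_frame = h * v * blanking * bits_per_pixel
    return EFFECTIVE_BPS / bits_per_frame

# 3440x1440 at 10-bit color (30 bits per pixel)
print(round(max_refresh_hz(3440, 1440, 30)))
```

Even assuming 10-bit color, the link tops out around 160Hz at 3440x1440, so if the X34P caps near 120Hz the limit would be the panel, not DP 1.4 bandwidth.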
On a side note... is ESO REALLY that demanding? Yes, it's online, so there's latency and CPU overhead that affect ALL online games, but I didn't think ESO's graphics were heavy enough to dip to 25fps! And yeah, BF1 is one of the main things making me want to try the X34. I'm not a big FPS player; I enjoy the occasional playthrough of Metro 2033/LL, Far Cry 3/4, the Fallout games, etc., but I'm never a competitive CoD/BF type. That's why I thought I might be able to get away with 60Hz in exchange for 4K, since I mostly play RPG and RTS games like Dark Souls, Dragon Age, Skyrim, The Witcher, Total War, etc. ...but idk...
Originally Posted by HyperMatrix
So I was investigating some FurMark fun. Here's what's happening:
- You can hit 120% TDP in FurMark even at only around 50-60% GPU load
- As the card heats up, more power is required for the same performance level
- This results in throttling, in order to stay below TDP
- This cycle continues for quite a while: the card keeps throttling down to stay within TDP, while the gradual heat build-up requires more and more power for the card to operate, which in turn leads to even more throttling.
I was running a game at 70% usage and everything was fine; clocks sat between 2050-2080MHz. The instant I changed a setting that pushed the GPU to 100% usage, it dropped to 1900-1980MHz.
Even on air, we're going to need a modified bios to take full advantage of these cards.
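The feedback loop HyperMatrix describes (heat soak raises power draw at a given clock, so the card has to throttle to stay under the power limit, and keeps throttling as it heats up) can be sketched as a toy simulation. Every constant here is invented for illustration; real leakage and power curves are nonlinear and card-specific.

```python
# Toy model of the TDP throttle cycle described above (all constants made up).
TDP = 250.0          # watts, the board power limit
AMBIENT = 30.0       # deg C
LEAKAGE_K = 0.002    # assumed: fraction more power needed per deg C above ambient

def power_draw(clock_mhz, temp_c):
    """Power scales roughly with clock; leakage grows with temperature."""
    base = 250.0 * (clock_mhz / 2000.0)  # watts at 2000 MHz when cool (assumed)
    return base * (1.0 + LEAKAGE_K * (temp_c - AMBIENT))

clock, temp = 2050.0, 40.0
history = []
for step in range(20):
    # Drop the clock in ~13 MHz bins until we're back under the power limit
    while power_draw(clock, temp) > TDP:
        clock -= 13.0
    history.append((step, clock, round(temp, 1)))
    temp = min(temp + 3.0, 89.0)  # heat soak toward ~89 C

print(history[0], history[-1])
```

Running it shows the pattern from the post: an immediate drop below the cool-state boost clock the moment the load pushes past TDP, then a slow further decline as the card heat-soaks, even though the workload never changed.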
Yeah, even in The Witcher 3 I was hitting 89C after only 2-3 minutes at 80% fan speed; I have to crank it to 90% or higher to keep it at 86C. I'm guessing that's because The Witcher 3 with Ultra settings + HairWorks was pushing 85-99% GPU usage. I have NO idea how people like JayzTwoCents are running at 83-84C max, unless it really IS just a bad factory TIM application on certain cards.