This seems like the best place for this, so here are a few recordings I have done, with lots more to come.
I have all settings exactly the same, and I also have "Prefer maximum quality" set in the Nvidia Control Panel to make it fair — except for BF4. (I didn't realise there was a quality difference there, so I also did a video comparing "Let the 3D application decide" against "Prefer maximum quality", which is this one:
The list will be added to as I do more games and everything will be apples to apples.
Gameplay on both was very smooth. Neither stood out as better than the other when actually gaming, and for the record, I am on a ROG Swift monitor with G-Sync disabled. I am ultra picky when it comes to stuttering, so I would notice if there were issues.
I could not see the fps - but that is beside the point. Just look and feel from both cards seem pretty close. Although to me, in the first video, the colors seemed more vibrant on the Fury side.
GTA V - close in fps, but color quality/image quality looks much better on the Fury X
Dirt Rally - similar fps (higher in some places for the Titan X), but more vibrant colors on the Fury X (or is it placebo for me... lol)
Project Cars - better fps on the Titan X (GameWorks... no wonder), but I could not tell the difference in color quality
BF4 - the Titan X takes a 7-12 fps hit when max quality is selected, but fps is comparable to the Fury X with similar IQ as long as max quality is selected in the NV control panel
I wonder if all the reviewers are leaving Titan x at default NV control panel setting which gives Titan X better benchmarks/review scores than Furyx, since image quality is ignored in all the reviews. Is that a fair statement?
No arguments from me, Provost, and I think professional reviewers need to state what they are setting things at. If, like me, they didn't realise that there is a difference, it can actually sway results quite a bit.
I don't see any review actually bothering with graphics quality nowadays; I guess part of the reason is that there is no standard for deciding better or worse.
My GK110s default to the Quality preset, but it looks like Maxwell's default Texture Filtering is set to "High Performance"? I tried taking a screenshot of the NV control panel, but it came out way too tiny, not sure why...
It's a filtering/quality comparison, not performance.
I just tried Performance/Quality/High Quality in Dirt Rally on my 960, and they all looked the same. They all looked kind of bad, though, so that's probably not a good comparison - I'm not sure the game has its final textures yet.
My 960 defaults to Quality as well.
It has an FPS counter, so I doubt it; people will look at the fps.