Originally Posted by Rmerwede
Accurate or not, nVidia has gotten exactly what they wanted.
I have no interest in buying a second 7970 now...
I have heard nothing but negative feedback whenever someone asks "should I get a second 7xxx?" It's either "sell it and upgrade to a single card" (usually a Titan) or "if you want multiple GPUs, go nVidia."
I don't care how many people have a smooth experience with CrossFire; I will not drop $400+ on a card that "might" work, or one I have to tweak for hours on end to reach an acceptable level of performance. It's a shame, too... I like the quality of nVidia (most of the time), but every time I read a press release or hear their reps speak, I vomit in my mouth a little. That also keeps me from wanting to hand them my money.
This is the problem this article is creating: people act as if Nvidia's SLI setups are just perfect. The truth is, no multi-GPU setup is. There are plenty of users around here running CFX/trifire/quadfire setups with no issues. Truth be told, when you're going multi-GPU I wouldn't expect everything to work out perfectly, IMO. I don't, and I will be picking up my second 7970, simply because I don't feel like having to sell the XFX card I have, which is crap, along with the Diamond I picked up.
Eric basically spells it out: it's most likely something most of us wouldn't be able to notice, yet it's being treated as a fatal error on AMD's part. I'm not saying people aren't experiencing stutter, and I'm not saying there isn't a problem, but to act as if CFX setups are plagued with nothing but problems, and that it would greatly benefit you to just go green because Nvidia's SLI setups have no problems, is false.
And if CFX doesn't work for me even with the "fix," I'll try some 670s, but not with the expectation that everything will just work without fault. From my understanding, multi-GPU solutions have never been "perfect."
JMO on the matter, nothing more.
Originally Posted by Majin SSJ Eric
I don't really care about conspiracy theories or anything like that. What bothers me (and probably other AMD users) is that PCPer, in this article, is basically telling us CF users that we are idiots. Believe me, if my CF 7970s had been performing at 20-25 FPS for the past year, I would not be sitting here defending them. I may not be very susceptible to microstutter, but I can absolutely tell the difference between 100 FPS and 25 FPS, and to have my intelligence insulted with the claim that I have only been experiencing 25 FPS all this time is mildly aggravating, not to mention absurd. There is obviously a frame-time issue with AMD's drivers, and they have acknowledged it (and are working on a fix), but this story is blown way out of proportion IMO.
I've been saying all along (since the original TR story) that this issue may be obvious on a graph, but as an actual impediment to the user experience it is not a big deal. Of course Nvidia and their legion of fanboys have jumped all over it and made it seem as though CF is simply unplayable, but as someone who has owned such a setup for over a year, I find that preposterous. Some people who are sensitive to microstutter may have issues with AMD's CF, but the vast majority of us have been enjoying these cards for over a year, and it's simply outrageous that PCPer is now coming along and claiming that we've all been duped by AMD and that we are really only getting 25 FPS in CF. Check my sig and you'll see that I've extensively tested CF against SLI for the past month (7970 vs Titan); after running both setups back to back, I noticed no massive improvement in smoothness in either one over the other (except in Crysis 3 and FC3, where the 7970s simply lack the horsepower to maintain high enough minimum frame rates).
I'm glad that AMD is working to fix whatever issues are causing the frame time irregularities but I put absolutely no credence in this PCPer story...
Excerpt from the conclusion of Guru3D's article; the bold is the title, not something I'm specifically pointing out.
Edited by Blackops_2 - 4/4/13 at 1:30pm
Originally Posted by Guru3d
Isn't Everybody Overreacting?
Stuttering, measuring anomalies... isn't everybody overreacting? Well, yes and no. Small rendering anomalies that you can experience on screen have always been part of a graphics card and your overall game experience. We have had them for years, and for years most of you have not been bothered by them, aside from a small group of enthusiast end-users and analysts. That is the primary context you need to keep in mind when it comes to FCAT measurements.
Average FPS is still (in my opinion) the most important metric for determining how fast a game can be rendered. That doesn't mean I am disqualifying frame-time, or what I like to call frame experience, measurements; on the contrary. In my view, frame experience measurements will serve as an extra tool and data set showing the relation between render performance and what you see on screen. Frame-time measurements are a tool to detect anomalies that we never really measured before. So it is more a question of which anomalies we can accept and which we cannot, because some people will totally freak out if they see a couple of latency spikes in a chart. Realistically, you'll be hard-pressed to notice them; heck, one big, scary spike in a chart could be something as simple as a game scene change. Frame-time / frame experience measurements are, however, becoming part of Guru3D's test and benchmark methodology. They will sit alongside what we have always shown you: average FPS, which we still consider the best measurement we can offer if the question is "how fast is my graphics card?" But an extra data set that can detect anomalies is obviously great to have and show.
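The point Guru3D is making, that average FPS and frame-time anomalies are complementary data sets, can be sketched in a few lines. This is a minimal illustration with made-up frame timestamps (not FCAT capture data), and the function names and the 33.3 ms spike threshold are my own assumptions, not anything from PCPer's or Guru3D's tooling:

```python
# Sketch: a healthy average FPS can coexist with large frame-time swings.
# Timestamps below are hypothetical, chosen to alternate fast and slow frames.

def frame_times_ms(timestamps):
    """Per-frame render times in ms, from a list of frame timestamps (s)."""
    return [(b - a) * 1000.0 for a, b in zip(timestamps, timestamps[1:])]

def average_fps(timestamps):
    """Frames delivered divided by elapsed time."""
    n_frames = len(timestamps) - 1
    return n_frames / (timestamps[-1] - timestamps[0])

def spikes(times_ms, threshold_ms=33.3):
    """Frames slower than the threshold (~30 FPS equivalent by default)."""
    return [t for t in times_ms if t > threshold_ms]

# Alternate 5 ms and 28 ms frames: the average looks fine, but every
# other frame takes more than five times as long as its neighbor.
stamps = [0.0]
for i in range(20):
    stamps.append(stamps[-1] + (0.005 if i % 2 == 0 else 0.028))

ft = frame_times_ms(stamps)
print(round(average_fps(stamps), 1))  # ~60 FPS average
print(round(min(ft), 1), round(max(ft), 1))  # large frame-to-frame swing
```

The average-FPS number alone would report a smooth-looking ~60 FPS here, while the frame-time list shows the alternating cadence that a viewer might perceive as stutter; that is the "extra data set" the article describes.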