Originally Posted by trendy
How do you figure? The average frame rate would be lower if the minimum frame rate is lower. I'd call foul if they posted only the maximum frame rate, because any CPU/GPU can pull off a few seconds of a high FPS and that would misrepresent the overall performance of the given hardware configuration.
I'll give you an example...
My old system was a 4.1GHz Phenom II X6 running CrossFire 5850s clocked at 850MHz core.
Running Crysis in DX10 Very High with 4xAA at 1080p resulted in an average of 55fps and a minimum of 38fps.
Overclocking the 5850s to 1GHz cores made no difference to the benchmark results at all, which told me the GPUs weren't the bottleneck.
Then I moved to a 2500K, same 5850s, same drivers, same clocks, same game settings, only this time the average was 60fps and the minimum jumped to 48fps.
Overclocking the 5850s to 1GHz improved things even more.
The 2500K only added 5fps to the average, which could be written off as margin of error, but a 10fps jump on the minimum? And believe me when I say you could feel the extra smoothness during firefights on the 2500K.
Minimum frame rate has always been a better indicator of CPU performance than the average, and it always will be.
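To put some rough numbers behind that, here's a quick Python sketch (the frame times are made up purely to illustrate the idea, not taken from my benchmarks): a short CPU-bound stutter barely dents the average because it's only a handful of frames out of hundreds, but it completely sets the minimum.

```python
# Hypothetical frame times in milliseconds -- made-up numbers, not from my runs.
# Mostly smooth ~16.7ms frames (60fps) with a short CPU-bound stutter mixed in.
frame_times_ms = [16.7] * 570 + [26.0] * 30  # 600 frames total

def average_fps(times_ms):
    # Average fps = total frames / total time covered by those frames
    total_seconds = sum(times_ms) / 1000.0
    return len(times_ms) / total_seconds

def minimum_fps(times_ms):
    # Simplest definition: instantaneous fps of the single slowest frame.
    # Benchmark tools often use a 1-second window or "1% lows" instead.
    return 1000.0 / max(times_ms)

print(f"average fps: {average_fps(frame_times_ms):.1f}")  # roughly 58
print(f"minimum fps: {minimum_fps(frame_times_ms):.1f}")  # roughly 38
```

Same frame-time log, two very different stories: the average smooths the stutter away while the minimum points straight at it, which is why the minimum tracks the CPU so much more closely.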