Originally Posted by thlnk3r
I don't know man, I can pretty much tell the difference. I watched Transformers (Blu-ray) on a 120Hz screen and it was beautiful. Maybe because it was filmed in a digital format? Who knows...but I can certainly say I can tell a difference (60Hz vs 120Hz).
To be honest Thinker, I think that is because of the way it was shot, not the refresh rate of the panel. As far as I understand it, 120Hz is just "digital trickery," with no real "substance" to the added frames: an image processor in the TV interpolates between the real frames and inserts additional "synthetic" frames where needed to make up the "120Hz."
But there is no true 120Hz panel out there, to my knowledge.
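Just to illustrate what I mean by "synthetic" frames, here's a toy sketch in Python. It's only a dumb cross-fade between two real frames (real TV processors do motion-compensated interpolation and are far smarter, and the function names here are just mine), but the point is the same: the extra frames are manufactured and were never in the source.

```
import numpy as np

def interpolate_frame(prev_frame, next_frame, t=0.5):
    """Make a synthetic in-between frame by blending two real ones.
    Actual processors track motion between frames instead of blending,
    but either way the inserted frame never existed in the source."""
    blended = (1.0 - t) * prev_frame.astype(np.float32) + t * next_frame.astype(np.float32)
    return blended.astype(prev_frame.dtype)

def upconvert_60_to_120(frames_60):
    """Turn a 60fps list of frames into '120fps' by slipping one
    synthetic frame between every pair of real frames."""
    out = []
    for prev_frame, next_frame in zip(frames_60, frames_60[1:]):
        out.append(prev_frame)
        out.append(interpolate_frame(prev_frame, next_frame))
    out.append(frames_60[-1])
    return out
```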
I know from experience. I have a 120Hz 40" Sony Bravia HDTV, and some films look better on it than on my bud's 60Hz 40" Samsung. Both panels have similar specs (his is supposed to have a dynamic contrast ratio that's higher by 500, but we can't tell the difference).
And either TV, playing the same movie from a PS3, looks the same. We did a side-by-side test at our previous LAN because we were curious about the "on paper" versus "reality" of it all.
The only real difference I found was that his PS3 ran cooler than mine (mine was 90nm/110nm, his was 65nm/110nm (CPU/GPU)).
Though, the "real" difference comes in when you are playing 3D games on an nVidia GPU, since the card renders double the frames (one set per eye) and overlays them. Even though my Bravia is not an "nVidia certified" monitor, it works just fine since it has the image processor in it.
You can actually play on a 60Hz panel, but it makes your games run at 30 FPS per eye, which is really hard on most of us gamers (really kills me when watching webcast stuff).
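The math behind that is simple: with shutter glasses each eye only gets every other frame, so whatever the panel refreshes at gets cut in half per eye. Quick sketch (same toy Python as above, names are just mine):

```
def per_eye_fps(panel_refresh_hz):
    """Active-shutter 3D alternates left/right eye frames, so each
    eye only sees half the panel's refresh rate."""
    return panel_refresh_hz / 2

print(per_eye_fps(120))  # 60.0 -> each eye still gets a smooth 60 FPS
print(per_eye_fps(60))   # 30.0 -> each eye drops to 30 FPS, which feels choppy
```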
Either way, hope this was somewhat helpful. I'm taking a small break from my new board + CPU (A8N32 & Opty 185).
Hopefully I'll get some runs going. And once I hit a brick wall...well, this thread here I come.