1080p = all 1080 lines drawn in every frame (progressive) --> EVERY second the whole picture is refreshed 50 times (Europe) / 60 times (USA) --> resulting in lots of data to be sent per second
1080i = only 540 (half of 1080) lines drawn per field (interlaced) --> field 1 = lines 1, 3, 5, ..., 1079 and field 2 = lines 2, 4, 6, ..., 1080. Field 3 = lines 1, 3, 5, ... and field 4 = lines 2, 4, 6, ... again. Etc... --> resulting in half the data to be sent per second
That's why most TV broadcasting is 1080i and not 1080p (it uses only half the bandwidth)
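To put rough numbers on it, here's a quick Python sketch. It assumes an uncompressed 50 Hz signal and ignores compression, blanking and colour encoding, so real broadcast bitrates will differ, but the halving is the point:

```python
# Back-of-the-envelope data rates for 1080p50 vs 1080i50
# (raw pixel counts only; no compression, blanking or colour encoding)
WIDTH = 1920
HEIGHT = 1080
RATE = 50  # refreshes per second (Europe)

# 1080p: every refresh carries all 1080 lines
progressive_pixels_per_sec = WIDTH * HEIGHT * RATE          # ~103.7 million pixels/s

# 1080i: every refresh (field) carries only half the lines (odd, then even)
interlaced_pixels_per_sec = WIDTH * (HEIGHT // 2) * RATE    # ~51.8 million pixels/s

print(f"1080p50: {progressive_pixels_per_sec:,} pixels/s")
print(f"1080i50: {interlaced_pixels_per_sec:,} pixels/s (half the raw data)")
```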
PCs can do 1080p because they only send from the PC straight to the TV/monitor over a cable and don't have to fit into the limited broadcast bandwidth
And to compensate for that "loss" they crank up the refresh rate of the TVs
It used to be 50 Hz (Europe), then they raised it to 100 Hz TVs, and now some (like Sony with its MotionFlow technology) boost it up to 200 Hz.
Plasmas can go up to 600 Hz...
(But don't think you'll be able to play at 100, 200 or 600 fps; that's not the same frequency. The panel refreshes that fast, but the source still only delivers its normal frame rate.)
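Rough sketch of why, assuming a 50 fps source. The exact mechanism differs per TV (and plasma's "600 Hz" is really sub-field driving, a different trick), but the point is the same: the extra refreshes are repeats or interpolated in-between frames, not extra frames from the source:

```python
# A 50 fps source shown on faster panels: the panel fills the extra refreshes
# with repeated or interpolated frames; the source frame rate doesn't change.
SOURCE_FPS = 50  # what the broadcast or console actually delivers

for panel_hz in (50, 100, 200, 600):
    refreshes_per_frame = panel_hz // SOURCE_FPS
    generated = refreshes_per_frame - 1  # repeated/interpolated in between
    print(f"{panel_hz} Hz panel: each source frame covers {refreshes_per_frame} "
          f"refreshes ({generated} repeated/interpolated)")
```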
It's just how you look at it... If you do the math, one could say that 720p (a full 720 lines sent per refresh) is better than 1080i (only 540 lines per refresh)
But because the fields switch so fast, the interlacing isn't noticeable to the eye (except perhaps on fast horizontal motion), so you'd better stay at 1080i or 1080p (p would be best, of course).
Because a higher resolution on the same screen size means more pixels on that screen, and thus a better image.
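Putting numbers on the 720p vs 1080i vs 1080p comparison (same kind of sketch, 50 Hz signals assumed):

```python
# Lines actually sent per second vs pixels on screen for common HD formats (50 Hz)
RATE = 50
formats = {
    "720p":  (1280, 720, 720),   # width, height, lines sent per refresh
    "1080i": (1920, 1080, 540),  # only half the lines per field
    "1080p": (1920, 1080, 1080),
}

for name, (width, height, lines_per_refresh) in formats.items():
    lines_per_sec = lines_per_refresh * RATE
    pixels_on_screen = width * height
    print(f"{name}: {lines_per_sec:,} lines/s sent, {pixels_on_screen:,} pixels on screen")
```

So 720p does push more lines per second than 1080i, but both 1080 formats put more pixels on the screen.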
Here's an example (in Dutch) of "how" you can "look" at it (understand it):