Is there a difference? I've just spent hours trying to figure out what I should be doing with my 1080i TV, whose native resolution is 1360 x 768.
So I've found the following pieces of "information," and they can't all be true at the same time. Can anyone tell me which of them is actually correct?
I understand that different connectors will play a part... but I'm just looking for some ground rules when thinking about resolution and how high I can force mine.
1920 x 1080 @ 30 Hz = 1080p
1920 x 1080 @ 60 Hz = 1080i
1440 x 900 = 1080i
1360 x 768 = 720p
I've also read that there is no such thing as interlaced when dealing with a signal from an HTPC, since the whole frame is sent at once (progressively).
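For what it's worth, here's the back-of-the-envelope pixel-rate math I've been using to compare those claims (my own numbers and my own assumption that 1080i sends one 540-line field per refresh, so take it with a grain of salt):

```python
# Rough pixel-throughput comparison (assumption: interlaced modes send half
# the lines per refresh, progressive modes send the full frame).

def pixels_per_second(width, height, refresh_hz, interlaced=False):
    lines_per_refresh = height // 2 if interlaced else height
    return width * lines_per_refresh * refresh_hz

print("1080i @ 60:", pixels_per_second(1920, 1080, 60, interlaced=True))  # 62,208,000
print("1080p @ 30:", pixels_per_second(1920, 1080, 30))                   # 62,208,000
print("720p  @ 60:", pixels_per_second(1280, 720, 60))                    # 55,296,000
```

If that arithmetic is right, 1080i at 60 Hz and 1080p at 30 Hz move the same number of pixels per second, which might be why the first two lines above keep getting equated.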
I've been reading about this so long I can't even think straight anymore.
So can someone tell me: is there any way of actually using all 1,080 lines of my 1080i TV from my computer?
Note: my display does not allow 1:1 pixel mapping, so HDMI currently looks like crap and I'm using VGA at 1360 x 768...
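For reference, here's the scaling arithmetic as I understand it (this assumes the TV's internal scaler simply resamples whatever input it receives to the 1360 x 768 panel, which is my guess about how it works, not something I've confirmed):

```python
# Quick check of how different input resolutions map onto the panel's
# assumed native 1360 x 768 grid (assumed scaler behavior: resample to fit).

native_w, native_h = 1360, 768

sources = {
    "1280 x 720 (720p)": (1280, 720),
    "1920 x 1080 (1080i/p)": (1920, 1080),
    "1360 x 768 (PC over VGA)": (1360, 768),
}

for name, (w, h) in sources.items():
    if (w, h) == (native_w, native_h):
        print(f"{name}: 1:1 pixel mapping")
    else:
        print(f"{name}: scaled by {native_w / w:.3f} x {native_h / h:.3f}")
```

If that math is right, nothing I send at 1080 can land pixel-for-pixel on this panel, which is part of what I'm trying to confirm.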
help.