
Is 1360x768 considered HD?

75K views 14 replies 10 participants last post by  drbaltazar  
#1 ·
And which "p" does it fall under? 360p, 1080p, something like that


thanks
 
#3 ·
These are the industry standards: 720p, 1080i and 1080p. 1080p means the resolution is 1920x1080, and the "p" means progressive scan, i.e. every line of the picture is drawn in a single pass on each refresh. The "i" stands for interlaced scan: each pass only draws half the lines, alternating between the odd-numbered and even-numbered lines, and the two fields combine into one full frame. For the same resolution, progressive is considered better because the whole picture is updated at once. The refresh rate is how many times per second the screen is redrawn, so a screen spec of 1080p, 60 Hz means it is 1920x1080, uses progressive scan and refreshes 60 times per second. As far as I know, interlaced mode was mainly used by old CRT screens; modern LCD screens are progressive.
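
To make the progressive/interlaced difference concrete, here's a rough sketch in Python (just my own illustration with a toy array, not anything from the actual standards): an interlaced signal sends the odd and even lines as two separate fields, and the display weaves them back into one full frame.

Code:
import numpy as np

# Toy 6-line "frame": each row's value is just its line number.
frame = np.arange(6).reshape(6, 1).repeat(4, axis=1)

# Progressive scan: the whole frame is transmitted and drawn in one pass.
progressive = frame

# Interlaced scan: only half the lines are sent per pass, as two fields.
top_field = frame[0::2]      # lines 0, 2, 4
bottom_field = frame[1::2]   # lines 1, 3, 5

# The display weaves the two fields back together into a full frame.
woven = np.empty_like(frame)
woven[0::2] = top_field
woven[1::2] = bottom_field

assert np.array_equal(woven, progressive)  # same picture, delivered in two halves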

1360x768 is considered HD, but not Full HD. Such monitors can display 720p but not true 1080p. That doesn't mean you can't watch 1080p movies, though; your monitor will just scale the video down to its native resolution.
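
As a quick back-of-the-envelope check (my own illustration, not from any spec), here is what fitting 720p and 1080p video onto a 1360x768 panel looks like in terms of scale factors:

Code:
# Scale factors when fitting common video resolutions to a 1360x768 panel.
panel_w, panel_h = 1360, 768

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    sx, sy = panel_w / w, panel_h / h
    print(f"{name}: horizontal x{sx:.3f}, vertical x{sy:.3f}")

# 720p : horizontal x1.062, vertical x1.067  -> slightly upscaled to fill the panel
# 1080p: horizontal x0.708, vertical x0.711  -> downscaled to fit the panel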
 
#5 ·
I have no clue why 768p was pushed as a standard for screen resolutions... it's completely nonstandard as far as video is concerned. 720p makes soooooo much more sense.
 
#6 ·
Quote:
Originally Posted by Polymerabbit View Post

I have no clue why 768p was pushed as a standard for screen resolutions... it's completely nonstandard as far as video is concerned. 720p makes soooooo much more sense.
768p was pushed because of those 15.6-inch laptop screens. For them, 1366x768 was the sweet spot: it offered the best picture quality at a reasonable price.
 
#8 ·
Quote:
Originally Posted by akafreak View Post

These are the industry standards: 720p, 1080i and 1080p. 1080p means the resolution is 1920x1080, and the "p" means progressive scan, i.e. every line of the picture is drawn in a single pass on each refresh. The "i" stands for interlaced scan: each pass only draws half the lines, alternating between the odd-numbered and even-numbered lines, and the two fields combine into one full frame. For the same resolution, progressive is considered better because the whole picture is updated at once. The refresh rate is how many times per second the screen is redrawn, so a screen spec of 1080p, 60 Hz means it is 1920x1080, uses progressive scan and refreshes 60 times per second. As far as I know, interlaced mode was mainly used by old CRT screens; modern LCD screens are progressive.
1360x768 is considered HD, but not Full HD. Such monitors can display 720p but not true 1080p. That doesn't mean you can't watch 1080p movies, though; your GPU will just scale the video down to its native resolution.
Fixed
 
#10 ·
1366x768 is marketed as 720p HD, but it's far from 720p; it's just 720p "capable". I would not personally call that HD, because it's an awful resolution to have: it forces just enough scaling to degrade the quality of a nicely rendered 1280x720 video or image.

I'd rather have a 1280x720 monitor because it is a standard resolution and 720p content won't be stretched on it.

This is a screenshot taken from Portal 2 at maximum settings, a true 720p (1280x720) image saved in a lossless format.

This is the same image scaled to 1366x768 in GIMP. Look at the difference in quality; it comes from scaling by a fractional factor instead of a whole number.

This is the same image scaled to 2560x1440 in GIMP. It did not lose any quality because it was scaled by exactly 2, a whole number: every original pixel simply becomes a 2x2 block of four identical pixels. It's a direct size increase with no pixels selectively added or removed. You can open it in MS Paint, resize it to 50% width and height, and get exactly the first image back. You cannot scale the 1366x768 one back down to the original, because pixels were interpolated to make up for it not being scaled by a whole number.
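
Here's a small sketch of the same idea in Python with NumPy (my own illustration, using a random array as a stand-in for the screenshot): an exact 2x nearest-neighbour enlargement can be undone perfectly, while a fractional resize like 1280x720 to 1366x768 cannot.

Code:
import numpy as np

# Stand-in for the 1280x720 screenshot: a random 8-bit grayscale image.
rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(720, 1280), dtype=np.uint8)

# Scale by exactly 2: every pixel becomes a 2x2 block of identical pixels.
doubled = original.repeat(2, axis=0).repeat(2, axis=1)   # 1440x2560

# Taking every second pixel gives the original back, bit for bit.
assert np.array_equal(doubled[::2, ::2], original)

# A fractional factor (1280x720 -> 1366x768 is ~1.067x) has no such clean
# pixel mapping: new pixels have to be interpolated, and the step can't be
# reversed to recover the original exactly.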

768p televisions are one of the reasons consoles look much worse than PCs graphically. Console games don't always render at a native 720p, so the output is scaled up to 1280x720; then, when it is sent to the TV, the TV scales it AGAIN from 1280x720 to 1366x768, doubling the degradation you can see in my pictures above. I'd strongly recommend against buying one if that is your reason for asking.
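
To put rough numbers on that double-scaling chain (the internal render resolution below is just a hypothetical example, not a figure for any particular console or game):

Code:
# Hypothetical chain: a console renders below 720p, the console scales the
# output to 1280x720, and then a 768p TV scales it again to 1366x768.
render_w, render_h = 1024, 600                     # hypothetical internal resolution
console_scale = (1280 / render_w, 720 / render_h)  # first interpolation pass
tv_scale = (1366 / 1280, 768 / 720)                # second interpolation pass

print(console_scale)  # (1.25, 1.2)
print(tv_scale)       # (~1.067, ~1.067) -- applied to already-interpolated pixels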

Quote:
Originally Posted by Kentthegamer View Post

Well, I see, but how would I know if my monitor is a progressive or an interlaced one?
thanks
It should be labeled on it somewhere. If you can't see it, then check for the model number and post it here or google it yourself.
 
#12 ·
Quote:
Originally Posted by Shadow11377 View Post

1366x768 is marketed as 720p HD, but it's far from 720p; it's just 720p "capable". I would not personally call that HD, because it's an awful resolution to have: it forces just enough scaling to degrade the quality of a nicely rendered 1280x720 video or image.
I'd rather have a 1280x720 monitor because it is a standard resolution and 720p content won't be stretched on it.
This is a screenshot taken from Portal 2 at maximum settings, a true 720p (1280x720) image saved in a lossless format.

This is the same image scaled to 1366x768 in GIMP. Look at the difference in quality; it comes from scaling by a fractional factor instead of a whole number.

This is the same image scaled to 2560x1440 in GIMP. It did not lose any quality because it was scaled by exactly 2, a whole number: every original pixel simply becomes a 2x2 block of four identical pixels. It's a direct size increase with no pixels selectively added or removed. You can open it in MS Paint, resize it to 50% width and height, and get exactly the first image back. You cannot scale the 1366x768 one back down to the original, because pixels were interpolated to make up for it not being scaled by a whole number.

768p televisions are one of the reasons consoles look much worse than PCs graphically. Console games don't always render at a native 720p, so the output is scaled up to 1280x720; then, when it is sent to the TV, the TV scales it AGAIN from 1280x720 to 1366x768, doubling the degradation you can see in my pictures above. I'd strongly recommend against buying one if that is your reason for asking.
It should be labeled on it somewhere. If you can't see it, then check for the model number and post it here or google it yourself.
I see, very detailed explanation, thanks...

two thumbs up
 
#13 ·
Quote:
Originally Posted by Shadow11377 View Post

1366x768 is marketed as 720p HD, but it's far from 720p; it's just 720p "capable". I would not personally call that HD, because it's an awful resolution to have: it forces just enough scaling to degrade the quality of a nicely rendered 1280x720 video or image.
I'd rather have a 1280x720 monitor because it is a standard resolution and 720p content won't be stretched on it.
This is a screenshot taken from Portal 2 at maximum settings, a true 720p (1280x720) image saved in a lossless format.
This is the same image scaled to 1366x768 in GIMP. Look at the difference in quality; it comes from scaling by a fractional factor instead of a whole number.
This is the same image scaled to 2560x1440 in GIMP. It did not lose any quality because it was scaled by exactly 2, a whole number: every original pixel simply becomes a 2x2 block of four identical pixels. It's a direct size increase with no pixels selectively added or removed. You can open it in MS Paint, resize it to 50% width and height, and get exactly the first image back. You cannot scale the 1366x768 one back down to the original, because pixels were interpolated to make up for it not being scaled by a whole number.
768p televisions are one of the reasons consoles look much worse than PCs graphically. Console games don't always render at a native 720p, so the output is scaled up to 1280x720; then, when it is sent to the TV, the TV scales it AGAIN from 1280x720 to 1366x768, doubling the degradation you can see in my pictures above. I'd strongly recommend against buying one if that is your reason for asking.
It should be labeled on it somewhere. If you can't see it, then check for the model number and post it here or google it yourself.
Yeah, but a PC will render the frames at whatever resolution you choose, so it doesn't really matter; there's no scaling (unlike consoles, which render at a set resolution). The OP should still buy a new monitor, though: 1360x768 is a horrid resolution to use for a desktop. Pick up a cheap 1080p display and have your mind blown by the extra desktop space and clarity.

I wouldn't really consider 720p HD anyway. Times have changed; 720p is soon to be the new SD. 1080p is what I would consider true HD, and 4K will be the future true HD.
 
#14 ·
Quote:
Originally Posted by Crazy9000 View Post

If it's a computer monitor it's 60 Hz, which means it's progressive; PCs don't typically support interlaced screens. If it was made in the past 5 years, it's pretty much guaranteed to be progressive. Interlaced was mostly seen on the first big DLP TVs, and on CRT ones.
Oh, I see.


Thanks, Mr. Moderator.