Originally Posted by DuckieHo
Flaws in your argument:
1) There is no 720i.
2) Computer monitors and video cards have always been progressive scan.
Flaws in your argument....
Computer monitors and video cards were at one point interlaced, just as standard NTSC is interlaced. This was very common until monitors larger than 14" CRTs started appearing on the market. At the time it took more VRAM and processing power for a video card to produce a progressive image at higher resolutions, i.e. anything above, say, 800x600, so in most cases a monitor would display 800x600 non-interlaced (as it was called back then) or 1024x768 interlaced. Mind you, this lasted all the way up through the mid-to-late '90s.
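To put some rough numbers on the VRAM point, here's a quick back-of-the-envelope sketch (the resolutions and color depths are just illustrative picks, and the figures are raw framebuffer sizes only, ignoring any overhead):

[code]
# Raw framebuffer sizes for a mid-'90s video card (back-of-the-envelope
# only; ignores overhead, CRT blanking intervals, etc.).

def framebuffer_kb(width, height, bits_per_pixel):
    """Kilobytes needed to store one full frame at the given depth."""
    return width * height * bits_per_pixel / 8 / 1024

for w, h, bpp in [(800, 600, 8), (1024, 768, 8), (1024, 768, 16)]:
    full = framebuffer_kb(w, h, bpp)
    per_pass = full / 2  # an interlaced pass scans out half the lines
    print(f"{w}x{h} @ {bpp} bpp: {full:7.1f} KB frame, "
          f"{per_pass:6.1f} KB per interlaced pass")
[/code]

At 8-bit color, 1024x768 already needs a 768 KB framebuffer, more than the 512 KB cards common at the time could hold, and interlacing halved how much data had to be scanned out to the monitor on each vertical refresh, which let the pixel clock run at roughly half the rate a progressive mode would need.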
Also, just to clarify a few other misstatements in this thread: any resolution above 1024x768 is considered HD, not 1366x768. That said, more often than not 1280x720 (720p) is considered the first true step into HD, while resolutions between 720x480 and 1280x720 are considered ED (enhanced definition) and anything 720x480 and below is considered SD.
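If it helps to see those cutoffs laid out, here's a tiny helper that buckets a resolution the way I described above (my own toy function, not anything from an official spec):

[code]
def definition_class(width, height):
    """Bucket a resolution as SD, ED, or HD per the cutoffs above."""
    if width >= 1280 and height >= 720:
        return "HD"  # 1280x720 (720p) and up
    if width > 720 or height > 480:
        return "ED"  # between 720x480 and 1280x720
    return "SD"      # 720x480 and below

for res in [(640, 480), (720, 480), (852, 480), (1280, 720), (1920, 1080)]:
    print(res, "->", definition_class(*res))
[/code]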
Also, someone commented on 1080p being 30 progressive frames per second; it's actually 60 progressive frames per second. Otherwise you'd still be seeing an interlaced image on any current consumer display, since all modern displays run at a fixed 60 Hz refresh, except for the few newer 120 Hz displays that are hitting the market.
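The fields-vs-frames arithmetic is easy to sanity-check (a minimal sketch, assuming a fixed 60 Hz panel, which is the whole point above):

[code]
REFRESH_HZ = 60  # assumed fixed panel refresh, per the point above

# An interlaced signal delivers a half-resolution field on each refresh,
# so two passes are needed for one complete frame; a progressive signal
# delivers a full frame on every pass.
for name, progressive in [("1080i", False), ("1080p", True)]:
    frames_per_sec = REFRESH_HZ if progressive else REFRESH_HZ // 2
    print(f"{name}: {REFRESH_HZ} passes/sec -> {frames_per_sec} full frames/sec")
[/code]

which comes out to 30 complete frames a second for 1080i and 60 for 1080p on the same 60 Hz panel.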