Originally Posted by Master Freez
Sooo many STUPID posts from silly people.... Sorry, but it's true.
1) 4k, 7k, 12k, 16k..... it's a MUCH shorter name for the new standard than 1080p or 2160p or 4320p.... Who cares what it's called when it's better and the name matches about 95% of the actual pixel count?
I agree, the naming isn't that hard to understand, and it is shorter/easier to say. Not sure why there's so much confusion over this, and I think it's better for it to just be '4k' rather than '3.8k.'
Originally Posted by Master Freez
2) Only true virgin nerds really care about the pixel density on a TV. Everyone else just uses it and is happy. The human eye CANNOT see the difference between 300 and 400 pixels per inch from 50 cm away, and barely catches a difference of +/-10-20 ppi at the same distance.
1920x1080 @ 50 inch is 44 ppi
3840x2160 @ 50 inch is 88 ppi
2560x1440 @ 27 inch is 108 ppi; the 30 inch version is 97 ppi.
1920x1080 @ 27 inch is 82 ppi!!!!
You are STILL looking at roughly the same pixel density as a 4k TV and you don't throw any crap at your monitors, but you throw TONS of it at a 4k TV!
Not really sure what your point is here. I think PPI/pixel density is important on any display, but as always it depends on size and your intended viewing distance... Seeing the difference between 300 and 400 ppi at 50 cm is probably pretty close to the limits of the human eye (at least close to mine). But unlike some other people, I believe there is a big benefit to 4k even for video at the 8-12 ft range on a 50 inch display. Some sites with their BS charts say you have to be 2.5 feet away to see the difference, but I know that is completely off: I can read 10 point font on a machine set to 75 DPI rendering on a 204 PPI display from 2 feet away, and at the same settings read it from 7 feet away on my 50 inch Seiki.
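For anyone who wants to sanity-check those ppi figures, here's a minimal Python sketch (assuming square pixels and the diagonal measured in inches; it's just an illustration, not anything from a spec sheet) that reproduces them:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display, assuming square pixels."""
    diagonal_px = math.hypot(width_px, height_px)  # pixel count along the diagonal
    return diagonal_px / diagonal_in

for w, h, d in [(1920, 1080, 50), (3840, 2160, 50),
                (2560, 1440, 27), (2560, 1440, 30),
                (1920, 1080, 27)]:
    print(f"{w}x{h} @ {d}\": {ppi(w, h, d):.0f} ppi")
# 1920x1080 @ 50": 44 ppi
# 3840x2160 @ 50": 88 ppi
# 2560x1440 @ 27": 109 ppi
# 2560x1440 @ 30": 98 ppi
# 1920x1080 @ 27": 82 ppi
```

(Rounding differs by a pixel per inch here and there, but these match the numbers quoted above.)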
Originally Posted by Master Freez
3) A 4k monitor can ALREADY be used at 4k@60p with an ATI Radeon and maybe a GeForce with Dual Screen Mode. Yes, DUAL screen mode inside the monitor or TV menu (firmware): the left half is rendered through one input and the right half through a second input. No science, just Eyefinity and nVidia Surround.
Dual Screen mode? For someone kinda ripping on people, you should at least get the naming correct. I guess you mean Extended desktop mode (called TwinView or DualView on nvidia). Until the Seiki and the few other cheap 4k TVs recently came out, pretty much every 4k display ran via multiple inputs, each driving a section (half/quadrant/slice) of the display.
One thing to keep in mind: on Windows Vista/7/8 specifically, you can currently only do this on an ATI card with Eyefinity or an nvidia Quadro card with Mosaic. Maybe nvidia will make their Surround support 2-display mode, but I wouldn't hold my breath.
Many cards can do this on Windows XP (including newer nvidia cards). Also, on Linux all nvidia cards can still do this. It's a Windows Vista/7/8 specific limitation for the most part.
Originally Posted by Master Freez
4) A 4k TV is ALREADY two times better than any 1080p. In a direct comparison on 50 inch, text is much sharper. What else do you want?
Not sure where you came up with this 'two times' number. If you're just going by increased resolution, then it would be four times the pixels (3840x2160 is 8,294,400 pixels versus 2,073,600 for 1920x1080). Text is sharper? Sure, if you use more pixels to render the fonts, but that will always be the case. Like me, I'm sure many others want the screen real-estate and won't be making text larger just to get 'sharper' text; it will just be smaller and still legible.
Originally Posted by Master Freez
5) Upscaling works as it should ONLY at exactly twice the size, not 1.5x, not 1.7x. Only TWICE fewer or TWICE more pixels on each axis. It doesn't make any sense to buy a 4k monitor and run it at 1440p with a BLURRED and much less detailed picture. With 4k you can use ONLY three resolutions for correct scaling: 1280x720, 1920x1080 and 3840x2160.
It's funny how many times you say 'twice.' It sounds like what you are saying is that you only want scaling when it can do perfect pixel width/height doubling, i.e. 1 pixel becomes 4 pixels (2x2), 9 pixels (3x3), or 16 pixels (4x4), so no interpolation is needed. Honestly I would have preferred that too, simply because it's simple, looks pretty much native, and is very easy to do in hardware; it could probably be done with zero input lag, whereas the current scaler appears to have at least 1 frame of input lag. From a purely visual perspective, though, especially for a TV, a more advanced scaler that does interpolation can actually produce a nicer-looking picture in some cases, so I understand why they didn't do it.
My IBM T221 only does scaling when it's pixel-perfect.
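To make the distinction concrete, here's a minimal Python/NumPy sketch (just an illustration, not the TV's actual scaler) of what integer pixel doubling does, and why a non-integer factor like 1440p to 2160p forces interpolation:

```python
import numpy as np

def integer_upscale(image, factor):
    """Nearest-neighbor upscale by an integer factor: each source pixel
    becomes a factor x factor block, so no interpolation is involved."""
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

# A 1080p frame doubled to 2160p: every pixel maps to a clean 2x2 block.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
doubled = integer_upscale(frame, 2)
print(doubled.shape)  # (2160, 3840, 3)

# 1440p -> 2160p is a 1.5x factor, so source pixels cannot map to whole
# blocks; the scaler has to blend neighboring pixels (interpolate), which
# is where the perceived blur comes from.
```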
Originally Posted by Master Freez
6) 120 Hz? ARE YOU KIDDING? Is your system 14x GTX Titan on one motherboard? No? Then HOW... yes, HOW are you going to get 120 frames per second in a game at 4k? With low settings? Stop it.
Not everyone runs the latest and greatest games. Also, 120Hz has uses outside of games as well (though a lower refresh rate is usually sufficient for non-gaming). For example, 120Hz would help me greatly in the two games I play most:
I can easily run both of these games at >120 FPS @ 3840x2160 even on dated hardware. It really depends on what you use it for. Actually, for these games I pretty much have to turn the resolution down to 1920x1080 in order to get the higher refresh rate. 120Hz @ 3840x2160 would actually be very nice, and had this display been DisplayPort driven with the right internals outputting to LVDS, it would have been possible.
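For a rough sense of why the current inputs can't carry 3840x2160 @ 120Hz, here's a back-of-the-envelope Python sketch of the raw pixel data rate (ignoring blanking intervals and protocol overhead, so real link requirements are somewhat higher):

```python
def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s, ignoring blanking and protocol overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"{pixel_rate_gbps(3840, 2160, 30):.1f} Gbit/s")   # ~6.0  - fits within HDMI 1.4
print(f"{pixel_rate_gbps(3840, 2160, 60):.1f} Gbit/s")   # ~11.9 - needs HDMI 2.0 or DisplayPort 1.2
print(f"{pixel_rate_gbps(3840, 2160, 120):.1f} Gbit/s")  # ~23.9 - beyond both at 8-bit RGB
```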
Originally Posted by Master Freez
7) For the FULL potential, texture resolutions in games should be 4k or more; without that, any difference is just upscaling (with some square pixels if you look closely). And to release that potential you need at least 4GB of VRAM.
Originally Posted by Master Freez
8) Nobody knows how HDMI v1.4 will work with HDMI 2.0. Probably nobody will get 60 FPS from that combo. Why are people manufacturing these TVs then? 30 FPS in progressive mode with V-Sync is enough for films or television.
It wasn't made for any usage....
Huh? I like how you're going from Hz to FPS here... It's obvious that HDMI 2.0 is going to require new internals and not work with existing hardware; I don't know why anyone would think it would be any other way. 30Hz is sufficient for *almost* everything except gaming, not just films or television.
Originally Posted by Master Freez
9) Upscaling at 4k resolution, even inside the web browser, requires A TON of CPU power to get flat, smooth scrolling on every website with any set of elements. The same problem causes performance decreases on all tablets and smartphones in mobile web browsers. Every time the resolution goes higher than the hardware can handle, new problems appear. So nobody can use this type of screen on just a small Intel Atom PC.
Not even sure what prompted this. Increasing the rendering size is something completely different from upscaling, and in reality it usually doesn't take that much CPU at all if done right. To make sure UI elements that are sized in pixels still work correctly, the easiest method is what Apple did: render everything at 4x the resolution (using higher-resolution images where available and rendering larger text), which does not really need much extra CPU, and then downscale the image back to whatever the native resolution of the display is. That downscale happens in hardware, so it puts some strain on the video card but is not CPU intensive.
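As a rough illustration of that approach (a generic sketch of the idea, not Apple's actual implementation; the function name and scale factor are just assumptions for the example), the UI is laid out in logical points, rasterized into a larger backing store, and the GPU then scales that buffer to the panel:

```python
def backing_store_size(logical_w, logical_h, scale_factor=2):
    """HiDPI-style rendering: lay out the UI in logical points, but rasterize
    into a backing store scale_factor times larger in each dimension. The GPU
    then scales that buffer to the panel's native resolution."""
    return logical_w * scale_factor, logical_h * scale_factor

# A "looks like 1920x1080" desktop rendered at 2x for a 4k-class panel:
print(backing_store_size(1920, 1080, 2))  # (3840, 2160), i.e. 4x the pixels

# Text and images keep the same physical size on screen; they are just drawn
# with more pixels each, which is where the extra sharpness comes from.
```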