
How far off is 4K gaming REALLY? Being totally honest and reasonable - Page 9

post #81 of 88
Quote:
Nobody in the world can differentiate between 60 fps and 120 fps when looking at a display.

This is an incredibly weird statement.

Hundreds of OCN members will say otherwise. I'd go so far as to say I'd be extremely surprised if I pulled somebody out of a crowd and they couldn't tell which video was running at 120fps when it was side by side with a 60fps video on a 120Hz monitor.

That's like saying nobody in the world can differentiate between 15 and 30fps when looking at a display - there's no cutoff point here where the benefits disappear. To put it as low as 60 is ludicrous to the point where it's hard to take it as a serious argument. Why don't you go find a 120Hz monitor and look for yourself?

Even if you ignore rendering latency, eye-tracking motion blur, and all of that, the difference in smoothness of motion is enormous - just like watching 60fps instead of 30 on a 60Hz monitor. There is no way anybody could fail to see it in a side-by-side video; in fact, anyone who has used a monitor like this doesn't need a side-by-side to call it straight out.
Quote:
it is easy to see that I am right

It's not about you being right or wrong, though it is concerning if you're not open to anything other than "winning" what is not a fight, but a discussion meant to enhance public knowledge.
Edited by Cyro999 - 3/30/14 at 6:40am
post #82 of 88
Quote:
Originally Posted by Thready View Post

[...] Since you want to think you know more than me, give me an academic link to anything you said about how fast the brain processes information, because I never said we see in frames. I said that the limit for distinguishing movement stops in the 30-ish fps range. How fast your eye sends messages to the occipital lobe is one small part of sight; your brain must distinguish movement, and that is what makes 120Hz monitors unnecessary. I may not use the term Hz correctly, so FORGIVE ME PLEASE, but I know more about how the brain encodes information than you do. I am sorry to put it bluntly like that, but I went to school four years to get my degree and I am starting my master's in August. Not everyone is different: each person's brain has the same threshold for activation, and if we were all different, then psychology wouldn't be a science. When it comes to auditory perception there is a limit; sure, a few outliers may exist, but for the most part they do not. And let me ask you, where did you get your degree in neurology or psychology? Because if you don't have one, then you look silly arguing with someone who does. Nobody in the world can differentiate between 60 fps and 120 fps when looking at a display. There is so much more to perceiving movement than the frame rate of your monitor. And learn a term called the placebo effect. [...]

I think what's happening here is that you learned things about how film works and how cameras record the real world. There's a fundamental difference between that and typical real-time computer graphics, and it basically comes down to motion blur. Please try the experiment I suggested here:

http://www.overclock.net/t/1477227/how-far-off-is-4k-gaming-really-being-totally-honest-and-reasonable/50#post_22021848

The difference between film and how your PC works should be obvious if you look at that and think about it, and it should be obvious that 60 fps+Hz and 120 fps+Hz are absolutely distinguishable to the human eye and brain, at least for the kind of computer graphics in that mouse pointer experiment.
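To put rough numbers on that mouse pointer test, here is a minimal back-of-the-envelope sketch (my own illustration, not from the thread): assuming a cursor moving at constant speed across a sample-and-hold display with no motion blur, the distance it jumps between refreshes is simply speed divided by refresh rate, so the 60Hz steps are twice as coarse as the 120Hz ones.

```python
# Rough sketch (illustrative): per-frame cursor displacement on a
# sample-and-hold display for a cursor moving at constant speed.
def step_per_frame(speed_px_per_s: float, refresh_hz: float) -> float:
    """Distance the cursor jumps between consecutive refreshes, in pixels."""
    return speed_px_per_s / refresh_hz

for hz in (60, 120):
    print(f"{hz:>3} Hz: {step_per_frame(1000, hz):.1f} px per frame")
# 60 Hz: 16.7 px jumps; 120 Hz: 8.3 px jumps. The 120 Hz trail shows twice as
# many cursor positions spaced half as far apart, which is easy to spot by eye.
```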
post #83 of 88
2 years minimum.

1) You need 4K @ 60Hz
2) You need 4K IPS
3) You need price to be low 1K
4) You need universal connection
5) GPU that supports that connection
6) Single GPU that can get close to 60 fps at 4K.

Current best GPUs do about 20-30fps in 4K.
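For a rough sense of why single-GPU 4K is still out of reach, here's a minimal scaling sketch (an assumption on my part, not a benchmark; it treats the game as purely pixel-bound, which real games only approximate): 4K has four times the pixels of 1080p, so a card pushing 80-120fps at 1080p lands right around that 20-30fps figure at 4K.

```python
# Hypothetical scaling estimate: assumes frame rate scales with the inverse
# of pixel count (purely pixel-bound rendering), which real games only approximate.
RES = {"1080p": 1920 * 1080, "4K": 3840 * 2160}

def estimate_fps_at_4k(fps_1080p: float) -> float:
    """Scale the 1080p frame rate by the 1080p:4K pixel-count ratio."""
    return fps_1080p * RES["1080p"] / RES["4K"]

for fps in (80, 120):
    print(f"{fps} fps at 1080p -> ~{estimate_fps_at_4k(fps):.0f} fps at 4K")
# 80 fps -> ~20 fps, 120 fps -> ~30 fps, in line with the 20-30 fps quoted above.
```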
post #84 of 88
Quote:
Originally Posted by deepor View Post

Tablets and phones already have great pixel densities, and fonts look really good on their screens. Text on the desktop looks seriously bad in comparison.

The sharp text even works on the days when I use my contact lenses instead of glasses. My eyesight is quite a bit blurry like that, but strangely the tablet screen's resolution still seems to help with reading.

I don't know about Windows, but on Linux I have the option of high-quality AA on font edges, so effectively the font edges on a 1080p screen look like 1440p or above. It's quite simple to alter fonts and tinker on Linux, though, and I have found some fonts look truly awful purely because of the panel type and sub-pixel layout of the display. I do agree that overall, 23"-24" 1080p monitors and above pale in clarity compared with a smartphone. It's probably why tablets and phones are so successful: they look as flat and as clear as those fake computer screens you see in movies.

Something else I wondered: presumably, if the content is made for 4K, a full scene has access to 8 million colours vs 2 million? On a properly calibrated monitor, or even a TV/projector, is that going to offer some nicer, subtler-looking shades and more natural colour fields, or is this not the case?
Edited by Pip Boy - 3/30/14 at 7:01am
post #85 of 88
Quote:
Originally Posted by phill1978 View Post

I don't know about Windows, but on Linux I have the option of high-quality AA on font edges, so effectively the font edges on a 1080p screen look like 1440p or above. It's quite simple to alter fonts and tinker on Linux, though, and I have found some fonts look truly awful purely because of the panel type and sub-pixel layout of the display. I do agree that overall, 23"-24" 1080p monitors and above pale in clarity compared with a smartphone. It's probably why tablets and phones are so successful: they look as flat and as clear as those fake computer screens you see in movies.

Something else I wondered: presumably, if the content is made for 4K, a full scene has access to 8 million colours vs 2 million? On a properly calibrated monitor, or even a TV/projector, is that going to offer some nicer, subtler-looking shades and more natural colour fields, or is this not the case?

Windows has had smooth fonts for ages.

As for colors: 4K simply means more pixels, not more colors, unless the panels move from 6-bit to 8-bit.
Current 1080p screens using 6-bit panels aren't displaying the maximum theoretical color range, and that will remain mostly unchanged unless 4K panels are 8-bit at minimum.

Check this out.
http://www.tftcentral.co.uk/articles/content/6bit_8bit.htm
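To put numbers on the bit-depth point (my own arithmetic as an illustration; the linked article covers the panel details): a 6-bit panel has 2^6 = 64 native levels per channel while an 8-bit panel has 256, so the total palette jumps from roughly 262 thousand to 16.7 million colors. It's the bit depth, not the resolution, that controls this.

```python
# Simple arithmetic sketch: native colors a panel can display at a given bit depth.
def panel_colors(bits_per_channel: int) -> int:
    """Total displayable colors = (levels per channel) ** 3, for R, G and B."""
    return (2 ** bits_per_channel) ** 3

for bits in (6, 8, 10):
    print(f"{bits}-bit panel: {panel_colors(bits):,} colors")
# 6-bit: 262,144    8-bit: 16,777,216    10-bit: 1,073,741,824
# (6-bit panels typically approximate the missing shades with FRC/dithering.)
```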
post #86 of 88
Thread Starter 
Quote:
Originally Posted by Shadow11377 View Post

Windows has had smooth fonts for ages.

As for colors: 4K simply means more pixels, not more colors, unless the panels move from 6-bit to 8-bit.
Current 1080p screens using 6-bit panels aren't displaying the maximum theoretical color range, and that will remain mostly unchanged unless 4K panels are 8-bit at minimum.

Check this out.
http://www.tftcentral.co.uk/articles/content/6bit_8bit.htm

What is more important, better pixel density or better colors? I think better lighting and color depth. See, the general masses understand terms like 1080p and 4K. But if a TV maker came along and said, "Our 1080p TV has 8-bit color, which is 133% of 6-bit color depth," people would be like, "GET THE F*** OUT OF HERE! NOBODY WANTS THAT!"

The same can be said for OLED. I saw an OLED display once at an electronics show here in St. Louis. The screen was a bit curved too, but I didn't notice that as much. The person demoing it said that it didn't follow normal pixel density and wasn't exactly 4K, but it was close. A 1080p OLED display would be a huge step up from a regular LED-lit 4K TV. But no, people want pixel density, and they couldn't care less if the brightness of the LEDs in the back washes out the colors. That is why I keep my LED TV at about half brightness; it looks more like a projected image you would see in a theater. As for OLEDs, their production costs might be a bit prohibitive at this point, and since 4K is easily marketable and cheaper to make, Sony, LG, and Samsung decided that is what they will sell to the consumer. But I will hold out for an OLED display. I see they are on sale for $8,000, and when I am ready to upgrade I will have the funds from my money market mutual fund manager (because I will have to learn how to play the market to get that kind of cash). By then they might be $100, down from $200, on sale at Best Buy. A man can dream.

Lighting and color depth are far more important to me than pixel density.
post #87 of 88
Quote:
Originally Posted by Thready View Post

What is more important, better pixel density or better colors? I think better lighting and color depth. See, the general masses understand terms like 1080p and 4K. But if a TV maker came along and said, "Our 1080p TV has 8-bit color, which is 133% of 6-bit color depth," people would be like, "GET THE F*** OUT OF HERE! NOBODY WANTS THAT!"

The same can be said for OLED. I saw an OLED display once at an electronics show here in St. Louis. The screen was a bit curved too, but I didn't notice that as much. The person demoing it said that it didn't follow normal pixel density and wasn't exactly 4K, but it was close. A 1080p OLED display would be a huge step up from a regular LED-lit 4K TV. But no, people want pixel density, and they couldn't care less if the brightness of the LEDs in the back washes out the colors. That is why I keep my LED TV at about half brightness; it looks more like a projected image you would see in a theater. As for OLEDs, their production costs might be a bit prohibitive at this point, and since 4K is easily marketable and cheaper to make, Sony, LG, and Samsung decided that is what they will sell to the consumer. But I will hold out for an OLED display. I see they are on sale for $8,000, and when I am ready to upgrade I will have the funds from my money market mutual fund manager (because I will have to learn how to play the market to get that kind of cash). By then they might be $100, down from $200, on sale at Best Buy. A man can dream.

Lighting and color depth are far more important to me than pixel density.

I believe the answer to that question is entirely up to the individual; they're both very important.
Generally, the higher resolution will impress more people, because it lets them see more rather than the same image just a little better.

For web browsing and general use, 2160p will beat 1080p any day regardless of the quality of the panel.
For gaming it varies quite a bit: some games will benefit nicely from a better panel while others won't, yet both usually benefit from the higher resolution.

If you're doing basic photo editing with lossy JPEGs you might find it nicer to get more on screen rather than better colors, but for professional work I bet most would prefer accurate colors.

As for movie playback, it depends on the source. Lots of DVDs look better than streamed HD movies, so resolution obviously isn't the only thing that matters; other factors play a role. If you've got a collection of Blu-ray movies, a 1080p monitor with full colors would win. If you have a lot of downloaded 720p/1080p movies, a 2160p monitor would win: none of those are likely to be limited by the colors of a 6-bit panel, 720p scales better to 2160p than it does to 1080p, and 1080p scales perfectly to 2160p (in theory), so the resolutions match better, resulting in a better viewing experience.
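The scaling point is easy to check with a bit of arithmetic (my own illustration of the claim above): 720p and 1080p both divide evenly into 2160p, so each source pixel maps to an exact block of screen pixels, whereas 720p to 1080p is a non-integer 1.5x factor.

```python
# Scaling-factor sketch: an integer ratio means each source pixel maps onto an
# exact NxN block of display pixels, so the upscaled image stays sharp.
HEIGHTS = {"720p": 720, "1080p": 1080, "2160p": 2160}

for src, dst in (("720p", "1080p"), ("720p", "2160p"), ("1080p", "2160p")):
    ratio = HEIGHTS[dst] / HEIGHTS[src]
    kind = "integer (clean)" if ratio.is_integer() else "non-integer (soft/blurry)"
    print(f"{src} -> {dst}: {ratio:g}x  {kind}")
# 720p -> 1080p: 1.5x non-integer; 720p -> 2160p: 3x; 1080p -> 2160p: 2x.
```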

For gaming, it depends on the games you play. Most modern games would probably benefit from better colors, but gaming isn't all about pretty graphics, and the extra pixels could let you see things in the distance more clearly, which could help. At the same time, the higher resolution could totally screw up the UI, so it's really hit or miss depending on the game.

I know from experience that sniping at a higher resolution is nice, so I would likely choose the higher resolution for gaming. Hopefully by the time I upgrade my monitor I will have the choice of both, so I won't have to pick between the two; I want them both almost equally.
post #88 of 88
Thread Starter 
Quote:
Originally Posted by Shadow11377 View Post

I believe the answer to that question is entirely up to the individual; they're both very important.
Generally, the higher resolution will impress more people, because it lets them see more rather than the same image just a little better.

For web browsing and general use, 2160p will beat 1080p any day regardless of the quality of the panel.
For gaming it varies quite a bit: some games will benefit nicely from a better panel while others won't, yet both usually benefit from the higher resolution.

If you're doing basic photo editing with lossy JPEGs you might find it nicer to get more on screen rather than better colors, but for professional work I bet most would prefer accurate colors.

As for movie playback, it depends on the source. Lots of DVDs look better than streamed HD movies, so resolution obviously isn't the only thing that matters; other factors play a role. If you've got a collection of Blu-ray movies, a 1080p monitor with full colors would win. If you have a lot of downloaded 720p/1080p movies, a 2160p monitor would win: none of those are likely to be limited by the colors of a 6-bit panel, 720p scales better to 2160p than it does to 1080p, and 1080p scales perfectly to 2160p (in theory), so the resolutions match better, resulting in a better viewing experience.

For gaming, it depends on the games you play. Most modern games would probably benefit from better colors, but gaming isn't all about pretty graphics, and the extra pixels could let you see things in the distance more clearly, which could help. At the same time, the higher resolution could totally screw up the UI, so it's really hit or miss depending on the game.

I know from experience that sniping at a higher resolution is nice, so I would likely choose the higher resolution for gaming. Hopefully by the time I upgrade my monitor I will have the choice of both, so I won't have to pick between the two; I want them both almost equally.
I guess, but that OLED looked mighty nice when I stole it... No, but anyway, it seems like the focus is too much on resolution rather than other things. Resolution is easily testable, too: just count the pixels. Not by hand, but you know what I mean, right? It's just simple math. Contrast isn't testable to the same degree, though; there is no standardization when it comes to measuring contrast, which is why you can't trust the advertisements.
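The contrast complaint is easy to illustrate with a bit of arithmetic (the numbers below are made up for illustration, not measurements): a contrast ratio is just white luminance divided by black luminance, and because vendors are free to measure the two under different conditions (e.g. "dynamic" contrast with the backlight dimmed for the black reading), the advertised figure can balloon without the panel itself changing.

```python
# Illustrative sketch of a contrast-ratio spec: white luminance / black
# luminance, both in cd/m^2. The numbers here are invented for illustration.
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    return white_nits / black_nits

# Static contrast: both patches measured at the same backlight setting.
print(f"static:  {contrast_ratio(250, 0.25):,.0f}:1")    # 1,000:1

# "Dynamic" contrast: white measured with the backlight maxed, black with the
# backlight dimmed for a dark scene -- a much bigger number from the same panel.
print(f"dynamic: {contrast_ratio(400, 0.004):,.0f}:1")   # 100,000:1
```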