
Yamakasi Catleap Monitor Club - Page 284

post #2831 of 11297
Hey hypermatrix, one thing I have noticed in the few years of having a 2560x1440 screen is the lack of understanding from the general gaming community (not the rare elites) of what more pixels actually translate to.

This is in Australia; perhaps it's true in other countries as well.

To elaborate, a large number of gamers with sub-1920x1080 monitors don't really see the big deal in getting a higher resolution, and it's rarely money holding them back. I understand the sentiment, as I had a 1920x1200 monitor for years prior to this upgrade.

Console devotees are just a more extreme version of this. And they own many-thousand-dollar TVs running on hardware that's about as capable as a 200 dollar PC conjured up from spare parts in some deranged factory with purple heroin needles on the ground and stray mutant children running around mumbling about silk stockings and magic toads.

The end result kind of leaves me understanding why Microsoft and Sony have comfortably held back on a newer console, why higher resolution output has not really evolved, and why new methods of transferring high resolution signals at high refresh rates have not been expanded on in the span of a teenager's life.

If someone is happy with something, then throw in the occasional improvement/bone (The Witcher 2 on console looks better than a first-gen Xbox game, the GTX 580 is incrementally better than the GTX 280, etc.). But don't ever give them something that would actually change the very nature of the scales.

Obviously, I am not a hardware engineer, so there might be something I am ignorantly omitting. But that's how it sometimes seems from my standpoint.
post #2832 of 11297
Quote:
Originally Posted by Whitespider999 View Post

Hey hypermatrix, one thing I have noticed in the few years of having a 2560x1440 screen is the lack of understanding from the general gaming community (not the rare elites) of what more pixels actually translate to.
This is in Australia; perhaps it's true in other countries as well.
To elaborate, a large number of gamers with sub-1920x1080 monitors don't really see the big deal in getting a higher resolution, and it's rarely money holding them back. I understand the sentiment, as I had a 1920x1200 monitor for years prior to this upgrade.
Console devotees are just a more extreme version of this. And they own many-thousand-dollar TVs running on hardware that's about as capable as a 200 dollar PC conjured up from spare parts in some deranged factory with purple heroin needles on the ground and stray mutant children running around mumbling about silk stockings and magic toads.
The end result kind of leaves me understanding why Microsoft and Sony have comfortably held back on a newer console, why higher resolution output has not really evolved, and why new methods of transferring high resolution signals at high refresh rates have not been expanded on in the span of a teenager's life.
If someone is happy with something, then throw in the occasional improvement/bone (The Witcher 2 on console looks better than a first-gen Xbox game, the GTX 580 is incrementally better than the GTX 280, etc.). But don't ever give them something that would actually change the very nature of the scales.
Obviously, I am not a hardware engineer, so there might be something I am ignorantly omitting. But that's how it sometimes seems from my standpoint.

A few things. The most important of which (imo) is regarding consoles. Consoles actually do quite well: they provide better graphics per MHz, or better graphics per "performance mark." The reason is that every console is identical, so games can be optimized. Someone making a PS3 game knows how every single PS3 will perform, and can build the game around it accordingly. A PC developer, on the other hand, has to cover 5+ years of hardware and make sure the game is compatible across all of it, along with two large video card manufacturers and their differing systems (i.e. PhysX and TXAA from Nvidia), plus onboard graphics like Intel's chips.

Then take a look at a game like, say...Modern Warfare 3. Blockbuster hit. Not the prettiest graphics, but a $250 PS3 can play it better than a $500 PC can. So you have to hand it to the PS3. At the same time, the Xbox/PS3 have held back the quality of PC games as well, because developers have toned down their system requirements so that games port easily over to the consoles (i.e. Crysis 2, as opposed to Crysis 1). But despite all this...most console games don't even run at 720p (which is just a quarter of the resolution of these monitors), which explains that performance. Now, when it comes to resolution...it's actually a lot tougher than you might think. Apple is pushing hard to expand resolution with all their talk of "retina" displays. The iPad's 2048x1536 resolution, at 3.15 million pixels, is about 50% more pixels than your 1080p TV, for example. But it's hard, because if you make a product that people don't know/think they want, you lose money. So only someone like Apple is ballsy enough to start a trend like this.
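To put rough numbers on those comparisons, here's a quick back-of-the-envelope check (just a Python sketch; the resolutions are the ones mentioned above):

```python
# Pixel counts for the resolutions discussed above, relative to a 1080p TV.
resolutions = {
    "720p (common console target)": (1280, 720),
    "1080p TV": (1920, 1080),
    "iPad 'retina'": (2048, 1536),
    "Catleap 2560x1440": (2560, 1440),
    "4K (cinema)": (4096, 2160),
}

base = 1920 * 1080  # 1080p as the reference point
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:30s} {pixels / 1e6:5.2f} Mpix  ({pixels / base:4.2f}x 1080p)")
```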

Now let's talk about bandwidth for a second. A 2560x1440 picture is 3.7 million pixels. At a standard 60 frames per second, that means 221 million individual pixel signals per second. So whatever the interface is, it has to be able to dictate a specific colour to a specific pixel at a rate of 221 million per second. Let's put that in a size format we understand. With uncompressed video, where a 30fps video is treated as 30 individual pictures per second (same as your monitor), that requires around 300 MB per second. Or, at the 120Hz some of us are running, about 1.25 gigaBYTES (not bits) per second. Meaning a terabyte hard drive's worth of video signal wouldn't be enough for 15 minutes. Compare that to a 1080p Blu-ray encoded video doing 2 hours in under 20GB. So what ends up being the issue is that it takes a lot of bandwidth to push out this type of resolution. And because the DVI standard is designed around copper cables, the bandwidth requirement of higher resolutions/refresh rates exceeds the capacity of what the copper itself can handle.
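If you want to sanity-check those bandwidth figures, the arithmetic is simple (a rough Python sketch, assuming 24-bit colour and ignoring compression and blanking overhead, so the exact numbers land slightly off the rounded ones above):

```python
# Uncompressed video bandwidth at the Catleap's 2560x1440 resolution.
width, height = 2560, 1440
bytes_per_pixel = 3                      # 24-bit colour, no alpha

pixels_per_frame = width * height        # ~3.69 million pixels
frame_bytes = pixels_per_frame * bytes_per_pixel

for hz in (30, 60, 120):
    rate = frame_bytes * hz              # bytes per second of raw video
    minutes_per_tb = 1e12 / rate / 60    # how long 1 TB of raw frames lasts
    print(f"{hz:3d} Hz: {pixels_per_frame * hz / 1e6:6.0f} M pixel updates/s, "
          f"{rate / 1e6:6.0f} MB/s, 1 TB lasts ~{minutes_per_tb:.0f} min")
```

At 120Hz that works out to roughly 1.3 GB/s of raw pixel data, which is where the "a terabyte in under 15 minutes" figure comes from.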

So...why not make a new/better port? Well...each revision of HDMI, for example, has worked on that. DisplayPort has increased the bandwidth and will do so even further. But you have to remember that things have to come down to a consumer-friendly price point before anyone makes them. And that's where it gets expensive. Because as far as the technical side goes...yeah, make your DVI cables out of pure gold. That'll help up that resolution and refresh rate. =D

And then on the other end of it are the LCDs themselves. Miniaturization takes time. To make a higher resolution, you have to be able to create a panel with, for example, 8.8 million pixels a la 4K resolution. That's no easy task. And let's say it could be done. Games that required SLI to run at full settings at 1080p (2 million pixels) would now have to process 4.4x more data. And as you mentioned yourself...graphics processing power never goes up that much. wink.gif

Which brings us to video cards...other than optimizations and new features/etc., the biggest jumps happen with...again...miniaturization. Nvidia actually did a very good job this round. If you compare their old card to their new card, you're going to say...ok, it's only 40% more power...but what you're not realizing is that it's 40% more power on a much smaller die. The old 580, for example, had a die roughly 68% larger than the current 680's! So the new one is substantially smaller, and still up to 40% more performance. Nvidia realized this and intentionally held the power back here. tongue.gif They had too big of a jump and trumped AMD pretty hard. But even if that extra 68% translated into 68% more performance (it doesn't really work that way), it still wouldn't be enough to power these higher-end displays. So Nvidia doesn't bother trying to.

So at the end of the day...it's a long cycle of displays, video cards, software, interface systems, etc., all waiting to see who makes the jump first, whether it's profitable, and whether they can even keep up with it before they try.

As for Microsoft and Sony, yeah. They'll take their time. The longer they wait, the better performance they can put in their new consoles. If they were to build a system today...with a $250 price point, or even a $400 price point...honestly, they'd be hard pressed to get more than maybe 50% increased performance over their old product. And even that's being optimistic, and mostly relying on having some decent RAM for once. They'd rather wait a little until they can build a system that can put out full 1080p properly, so at least they'll cover the nearly 100% of households that own 1080p TVs now.

Why would I spend 6k on my computer system, though, for example? Why do people SLI? And why is a 120Hz 2560x1440 monitor beneficial? Well...in FPSes, that's a huge edge. A higher refresh rate lets you track people better. A larger screen + higher resolution means you can spot that pesky sniper way in the back of the map, instead of it being a giant blur. There are tons of advantages. But there simply isn't the processing power to handle it. And I've given up convincing people of the differences in resolution after I had people tell me they can't tell the difference between DVD and Blu-ray. I've given up on humanity in that regard. But...here I am, enjoying my monitor. Beating 1080p'ers with their 60Hz monitors in FPSes. Never been happier. And I could never go back to 60Hz. tongue.gif And yes, I honestly feel like I have a huge edge, and my scores in MW3 reflect it.

That...was long. Ok. I'm done. tongue.gif

p.s. Regarding The Witcher 2: that is a terrible game when it comes to graphics. It's so inefficient, and looks terrible! I mean, I'd show it off as a masterpiece of DX9...but DX11 games blow it out of the water.

Also...the difference between 1080p and 1440p? It's easy to show why it's better. Tell them this (the screenshots are meant to be viewed full size, though sadly the forum shrinks the larger one):

This is what you see:
[1080p screenshot]

And this is what I see:
[2560x1440 screenshot]

Which one would you rather be looking at if you were playing online right now?
Edited by HyperMatrix - 4/16/12 at 7:57am
post #2833 of 11297
Quote:
Originally Posted by sonicBlue View Post

For anyone looking to try out some pretty graphics @ 2560x1440, there is the free World of Warcraft trial.

Even though it's an old game, I find the level of detail just stunning.

That's partly because they revamped the entire continent of Kalimdor and part of the Eastern Kingdoms for the Cataclysm release. BTW, eww, gross Alliance . . .


Quote:
Originally Posted by sonicBlue View Post

I don't have a very powerful computer (Athlon 64 X2 4200+, Asus 6770, 2GB RAM), but I managed to get it running smoothly with the following settings:

Shadow quality to "Fair"
Multisampling to 1x (don't need it at 2560x1440!)
Anisotropic 2x
Ground clutter high
Everything else at "good"
DirectX 11
64-bit client

If you're around a lot of water, you may have to drop the "liquid detail" setting to Fair, and if you start to get FPS dips out in the world, lower your "sunshafts" setting. "Shadow quality", "liquid detail", "sunshafts", and "view distance" will have the largest impact on older cards. Anything that's a 6950 or higher should be able to max everything out at 2560x1440, but only if you have a fairly decent CPU. WoW is far more taxing on the CPU than the GPU, but obviously the GPU can make a substantial difference.
    
post #2834 of 11297
What impressed me about The Witcher 2 was the combination of the technical and the artistic. I'd also argue that it looks pretty incredible @ 2560x1440. It has immense depth and detail that many, many, many DX11 games have not come even remotely close to matching for me. Sure, it had grainy shadows and pretty extreme pop-up at Ultra, but I can't help but disagree on that count.

Honestly, The Witcher 2 and Crysis 2 on Ultra are some of the most impressive visual examples I can think of. Crysis 2 on DX11 Ultra = incredible. DX9 = horrid. The reason is something to do with the HDR lighting: it's blurry in DX9 and well defined in DX11. And once you get that parallax/tessellation kicking in, as well as Ultra shading (which offers world reflections and improved SSAO/HDAO from what I can visually tell), it's hard not to frown at Skyrim. Although I love the **** out of that game. Here's hoping the Skyrim graphics extender releases in a human timeframe. (If you actively dislike the game, you might have just lost interest by this point.)

In regard to your actual reply - without going into a quote-by-quote response, which takes a bunch of time - I agree with pretty much everything you said.

I actually plan on getting the Catleap, assuming the overclockable models are reintroduced in the overclock request thread, although I don't think the competitive edge is what I'll ultimately be interested in. I believe that when paired with a GTX 680 or two, the general performance will be a lot more open ended. What I mean by that is that I have found certain vsync glitches and issues in games that come from having a pure 60Hz display - it's a long post in itself. Allowing the framerate to veer as far into the refresh as it wants will give me better performance overall. I can just find the general framerate level and, depending on the type of game and engine, get the optimum refresh and vsync/no-vsync setting for that game, and get far smoother results to boot.

I think the general smoothness of a game is the next big thing I want to perfect in my build. Multi-AMD-GPU setups tend to bring a bunch of frame pacing/perception issues with them; combine that with a 60Hz monitor (although I have at least been enjoying 2560x1440 for the past few years) and there are definitely some things I can do to change my entire gaming enjoyment.

Also, I can definitely see framerates above 60fps on the 120Hz screens I have messed around with (but never purchased, due to being unable to go back to 1920x1080 TN/LED). I'm not exactly sure how far into the 120fps range I can actually perceive, but it's past 80fps.

That's another reason I think this screen will be beneficial. Smoooooottttthhhhh.


Anyway, just wanted to mention I like your YouTube videos as well. I would like a few more rage spouts at the dumb people who say you are not getting 120Hz, but other than that, they served as a good hype tool for me in going ahead with the upgrade.
post #2835 of 11297
Quote:
Originally Posted by HyperMatrix View Post

Took me forever but I finally have a guide up on how you can record games and do general screen capture in high quality at our monitor's 1440p resolution. Check it out. http://www.youtube.com/watch?v=fvfPXn5VQ0w
3-part video. File links included in video description. Took me a lot longer than I thought it would to do it all and publish them/etc...So you better enjoy it! wink.gif
p.s. And yes. The guide was most definitely recorded and uploaded in 1440p as well. Haha. I'm a High-Res freak!

Watched all the screen capture vids already smile.gif ... Great stuff - especially for people who are new to it ...
A couple of questions if you don't mind ...
1. What do you think about the program called Dxtory? I have a friend who likes it and says it works better than Fraps, although it does cost a little.
2. Does using the capture program slow down your gameplay? Is it a detriment to gameplay?
3. When you compress, what about the x264 codec ... would this be better? I don't know much about codecs, I just often see things encoded with it. And if a program like Dxtory does some compression on the fly, is that not advised because of the slowdown to your computer? (A rough example of the kind of x264 re-encode I mean is sketched below.)
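For reference, a common way to apply the x264 encoder to an already-captured lossless file is through ffmpeg with libx264. This is only a rough sketch (the file names are placeholders, and it assumes an ffmpeg build with libx264 is available), not anything taken from HyperMatrix's guide:

```python
# Re-encode a lossless Fraps/Dxtory capture with the x264 software encoder via ffmpeg.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "capture_2560x1440.avi",   # raw/lossless capture from Fraps or Dxtory
    "-c:v", "libx264",               # the x264 encoder
    "-preset", "slow",               # slower preset = better compression at the same quality
    "-crf", "18",                    # quality target; ~18 is visually near-lossless
    "-c:a", "copy",                  # keep the original audio track untouched
    "output_1440p.mkv",
])
```

Doing the compression as an offline re-encode like this avoids the in-game slowdown that real-time compression can cause.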

thanks for the vids!
    
post #2836 of 11297
Is it worth it to get the Perfect Pixel version? What are the odds of the normal ones having dead pixels/bleed?
post #2837 of 11297
Quote:
Originally Posted by Lgros max View Post

RivaTuner shows 98 to 101 fps in game with vsync, but it still looks choppy compared to 60Hz/60fps. Weird, even 61Hz looks choppy, though less so than 100Hz.

Like others have said, this is very likely frame dropping; you can test it by using the refresh rate multitool and seeing if you're getting any black squares in the middle of your strings of lit squares. If so, you're dropping frames and you won't be able to benefit from higher refresh rates with that monitor.
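If you don't have that tool handy, the idea behind the test is simple enough to sketch yourself. Here's a rough pygame version (my own illustration, not the actual Refresh Rate Multitool): one square is lit per frame, and you photograph the screen with an exposure a few frames long; any dark square inside the lit run means a frame never made it to the panel.

```python
# frame_skip_test.py -- a rough sketch of the "lit squares" frame-skipping test.
# One square is lit per frame, stepping across the top of the screen. Photograph
# the monitor with a multi-frame exposure: a dark square in the middle of the
# lit run means that frame was dropped.
import pygame

pygame.init()
screen = pygame.display.set_mode((0, 0), pygame.FULLSCREEN)
clock = pygame.time.Clock()

size = 40
cols = screen.get_width() // size        # number of squares that fit across the screen
i = 0
running = True
while running:
    for event in pygame.event.get():
        if event.type in (pygame.KEYDOWN, pygame.QUIT):
            running = False
    screen.fill((0, 0, 0))
    x = (i % cols) * size                # light exactly one square this frame
    pygame.draw.rect(screen, (255, 255, 255), (x, 0, size, size))
    pygame.display.flip()                # waits on vsync if the driver enforces it
    i += 1
    clock.tick(120)                      # cap near the refresh rate being tested
pygame.quit()
```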
post #2838 of 11297
Quote:
Originally Posted by JumboShrimp View Post

Is it worth it to get the Perfect Pixel version? What are the odds of the normal ones having dead pixels/bleed?

Not at all...it's a waste of money, because Perfect Pixel doesn't guarantee no dead pixels, only that the panel falls within the Korean regulation (up to 5 dead pixels is okay). The chances of getting a display with no dead pixels are very good, as the majority of people have gotten them. I've had 3 displays with no dead pixels. As for backlight bleed...that seems to be very common but very minimal. You won't see it except slightly on a black screen from certain angles.
post #2839 of 11297
Quote:
Originally Posted by HyperMatrix View Post

I think it is as alamone said. The PCB in the monitor will accept the 100Hz input, and tells the video driver that, but when the signal is sent from the PCB over to the panel, it goes through the scaler, which is unable to cope with displaying that refresh rate.

Oh, I found my problem this morning. I disabled RivaTuner (from MSI Afterburner) on startup and use Fraps instead to see FPS in game. Games are running butter smooth at 100Hz. I'm really happy now. thumb.gif
post #2840 of 11297
awesome thanks