Need help. I was doing another b to c display transplant to get my b display back onto b hardware. Then I noticed half of it was washed out, so I TOOK THE WHOLE THING APART!
You'll notice the right half of the LEDs has twice as many lit, the same ones that are not lit on the left half, as if the signal is inverted and going to the wrong LEDs.
There are two LED strips, and I swapped them around to see if one was faulty, but it wasn't; the signal going to the right strip was wrong for some reason. I'm not good with electrical stuff like this, so I'm not sure how to solve the problem. PLEASE HELP!!!!
...3D Vision 2 w/ Trine 2 HyperMatrix w/ Catleap @ 120Hz?
Will it work?
I don't think this would be a particularly good experience. The panel takes about 8-11 ms to switch frames (tested by a Korean reviewer), and at 120Hz you have an 8.3 ms frame time; the monitor would spend the entire frame transitioning between the two states (each eye's perspective has a different pixel state), and the end result would be a lot of ghosting and bleed-over between the eyes. The extra refresh rate is useful for producing smoother frames, but when you try to alternate rapidly between two disparate images, things will probably go downhill fairly quickly.
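The arithmetic behind that answer can be sketched in a few lines. This is just an illustration: the 8-11 ms response range is the figure quoted above, and everything else is simple division.

```python
# Sketch: compare the per-frame time budget at a given refresh rate against
# the panel's measured response time (8-11 ms, per the review cited above).

def frame_time_ms(refresh_hz: float) -> float:
    """Time available to display one frame, in milliseconds."""
    return 1000.0 / refresh_hz

worst_response_ms = 11.0  # upper end of the measured transition time

for hz in (60, 120):
    budget = frame_time_ms(hz)
    share = min(100.0, 100.0 * worst_response_ms / budget)
    print(f"{hz:>3} Hz: {budget:.2f} ms per frame; "
          f"worst-case transition eats {share:.0f}% of it")
```

At 60Hz the transition fits inside the 16.67 ms frame with room to spare; at 120Hz the 8.33 ms budget is smaller than the worst-case transition, which is exactly the ghosting scenario described above.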
Input Lag/Video Playback/Overclocking Part 1 of 2 (Click here for Part 2)
Looking for updates on the 100Hz Catleap, check this out.
Apr-24-2012: Fixed link to 100 Hz Catleap info.
Originally Posted by Talfrey
So I have been meaning to ask, How noticeable is the extra HZ? Given that the panel isn't intended to go that high, I wonder how much of a difference it really makes.
Depending on your use (first-person shooters vs. office/internet) and what your eyes are used to (competitive gaming or AV expertise), you will notice a bigger or smaller difference. I asked more from a gaming input-lag perspective. From a smoothness perspective, the effects are well known to folks who have used CRTs in the past; see the "60Hz vs 85Hz pictures" for an idea of what this looks like. Refresh rates that are not multiples of 30 (unlike 60 or 120) can cause video judder and tearing; see "Video Playback / Judder".
The lack of a scaler and on-screen display makes the monitor cheaper and reduces input lag. This means your external controls are limited to basic brightness buttons, and you're limited to 720p and the native 1440p resolution. You can display other resolutions and adjust other color parameters, but you will need to use your graphics card to do it (e.g. turn on GPU resolution scaling).
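The judder effect mentioned above comes from each video frame being held for a whole number of refresh cycles. A small sketch (illustrative only; the function and its numbers are not from any specific player or driver) shows the repeat pattern for 24 fps content:

```python
# Why non-multiple refresh rates judder: each source frame is shown for a
# whole number of refresh cycles, so when refresh/fps is not an integer,
# frame durations become uneven.

def pulldown_pattern(source_fps: int, refresh_hz: int, n_frames: int = 24):
    """Refresh cycles each of the first n_frames source frames is shown for."""
    pattern, shown = [], 0
    for i in range(1, n_frames + 1):
        # total refreshes that should have elapsed after source frame i
        target = round(i * refresh_hz / source_fps)
        pattern.append(target - shown)
        shown = target
    return pattern

print(pulldown_pattern(24, 120, 8))  # every frame held 5 cycles: perfectly even
print(pulldown_pattern(24, 60, 8))   # mix of 2s and 3s: classic pulldown judder
print(pulldown_pattern(24, 97, 24))  # mostly 4s with an occasional 5: periodic hitch
```

At 120Hz every 24 fps frame gets exactly 5 refreshes, which is why multiples of the source rate play back smoothly while rates like 97Hz introduce a periodic hitch.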
Originally Posted by sfsilicon
I posted this a while back, but I didn't get a response. Could someone help me with the following questions?
1) Is input lag affected by the overclocked refresh rate?
2) If it does affect input lag, could someone help quantify what the improvement would be at 65Hz, 75Hz, 85Hz, and 97Hz vs. the default 60Hz?
I play first-person shooters and was intrigued by the option to OC the Catleap. I read the posts with the pictures where fast motion results in smoother rendering (more frames). I'm just unclear whether you notice any difference in first-person shooters running 60Hz vs. 97Hz besides smoother gameplay. In my experience, input lag has a bigger impact on getting kills than a smooth frame rate does.
Originally Posted by HyperMatrix
Unless you're running a terrible, terrible monitor, network lag is probably 10x worse than your input lag. Smoother motion makes it easier to see things and focus on them, so it does make a huge difference in fast-action FPS games. Input lag for me is indiscernible on most monitors, including this one: from the moment I push a button until the action appears on the monitor, it feels instantaneous. I had a 240Hz Samsung TV before that had about 60-70 ms of input lag. THAT I noticed quite easily, and I returned the unit immediately.
I'm not really directly answering your question, because it doesn't really have "another effect" you'd notice, other than your eyes being able to see and keep up with everything happening on screen. I play Planetside, which is a 12-year-old MMOFPS, and going from 60Hz to 82Hz I already notice much better performance and ease of play. Can't wait till I get an Nvidia card and hit 100Hz. For FPS games, don't discount the extra Hz. They're awesome.
Originally Posted by silberx
Firstly, assuming you get one of these monitors without a scaler (DVI-only input), there is pretty much nothing between the video card's output and the panel that would add much input lag, so the displays already have correspondingly low input lag. As for the increased refresh rate, it does two things. First, it improves your framerate, meaning you get new visual updates from the game more often; this obviously helps and is well understood. Second, at higher refresh rates the maximum delay before your monitor gets updated with a new frame is reduced as well. For example, with VSYNC on at 60Hz, a frame update happens every 16.67 ms; this means the image on screen may be delayed by as much as 16.67 ms, assuming nothing else is delaying extra frames (buffering or render delays in the game engine). The same logic applies even with VSYNC off, but it relates to parts of frames instead of whole frames, so it's less clear to explain that way. If your monitor is running at 100Hz, that 16.67 ms delay (which again is a best case, assuming no other delays are stacking up) drops to 10 ms.
The ~7 ms difference may not seem like much, but throw a couple of extra frames of render delay in there, from either game pipeline limitations or poor VSYNC implementations (basically anything DirectX), and suddenly you're looking at a 20 ms difference between drawing at 60Hz and 100Hz. Combine this with the overall faster update rate (smoother, more fluid animation) and it can be a pretty big difference to a high-level player.
Short version: increases in refresh rate *will* improve your input lag (but not by a massive amount); depending on your sensitivity it may or may not be a big difference, since input lag is already low on these displays due to the lack of a scaler or overdrive.
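The delay arithmetic in that answer is easy to reproduce. A minimal sketch (the two-frame render queue below is an illustrative assumption, not a measured figure):

```python
# With VSYNC on, a finished frame can wait up to one full refresh interval
# before it is scanned out; queued frames multiply that penalty.

def max_vsync_wait_ms(refresh_hz: float) -> float:
    """Worst-case wait for the next scan-out, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 100, 120):
    print(f"{hz:>3} Hz: up to {max_vsync_wait_ms(hz):.2f} ms before the next scan-out")

queued = 2  # e.g. a driver pre-render queue (illustrative assumption)
for hz in (60, 100):
    total = (1 + queued) * max_vsync_wait_ms(hz)
    print(f"{hz} Hz with {queued} queued frames: up to ~{total:.1f} ms total")
```

The single-frame gap between 60Hz and 100Hz is 16.67 - 10.00 ≈ 6.7 ms (the "~7 ms" above); with two extra queued frames the gap between the two rates scales to roughly 50 vs. 30 ms, the ~20 ms difference described in the post.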
Originally Posted by sfsilicon
Thanks for the detailed reply. I got the SE version without a scaler, so I think I'm good for now and will just OC the screen to 65Hz. Still trying to decide if I should pay for the board upgrade if it becomes available.
Originally Posted by HyperMatrix
115% yes. =) Well worth it.
Currently only 2B serial-model Catleaps are known to OC over 65Hz, due to the PCB boards used in the 2Bs (the serial code refers to the manufacturing month). 2Bs are generally not available anymore, but some lucky folks get older stock from newer sellers, directly from Korea, or when buying less popular (more expensive) models. There are two efforts going on to get a >65Hz overclockable monitor: a) ScribbyDaGreat is trying to build replacement 2B boards; see the Catleap OP for more info. b) bQvle was working on a limited-production Catleap run with one of the Catleap sellers. That thread was locked, but hopefully by mid-May 2012 the monitor will be available for sale.
>85Hz OC'ing will depend on your graphics card and your 2B monitor. HyperMatrix has hit 105Hz on a 6970. Only the GTX 680 is known to go past the magical 120Hz mark, with 125Hz being the max stable refresh rate.
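A rough pixel-clock estimate shows why the refresh rate caps where it does. The horizontal/vertical totals below are CVT reduced-blanking-style assumptions for 2560x1440, not the exact timings the 2B board uses, so treat the numbers as ballpark figures:

```python
# Rough pixel-clock estimate for 2560x1440 at various refresh rates.
# 2720 x 1481 (active resolution plus reduced blanking) is an assumed
# timing total; the real 2B timings may differ slightly.
H_TOTAL, V_TOTAL = 2720, 1481

def pixel_clock_mhz(refresh_hz: float) -> float:
    """Approximate pixel clock required, in MHz."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

for hz in (60, 100, 120, 125):
    print(f"{hz:>3} Hz -> ~{pixel_clock_mhz(hz):.0f} MHz")
```

Under these assumptions, 60Hz needs only ~242 MHz (within the 330 MHz dual-link DVI spec), while 100Hz lands right around 400 MHz, which lines up with the ~400 pixel-clock ceiling reported with SLI later in the thread; 120Hz+ pushes well past it and explains why only certain cards get there.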
I finally got SLI to run 100Hz at 32 mil colours, but I've not been able to get it to do 120Hz. As for single cards, almost all games worked fine at 120Hz, 32 mil colours. The only game that wouldn't work was Planetside, which is 12 years old. Dirt 3, WoW, and BF3 all worked fine, automatically picking up the 120Hz from the desktop resolution setting.
Originally Posted by HyperMatrix
After more testing, I couldn't even get to 104Hz without running into issues. I think pixel clock boosting on SLI just doesn't work: you can run it up to 400 and it works fine, but beyond that, no way. So far, anyway. Maybe it will be resolved in a future driver, but currently SLI = 100Hz (or 101Hz?) cap.
Single-carders though...honestly, the GTX 680 is phenomenal.
Hmm, have you experienced any stability issues like some others have reported? Some say they sometimes wake their PC up to a monitor displaying only half the image haha
Maybe we should ask nvidia why it doesn't let it run past 400 in SLI