
Home-made Arduino scanning LED backlight to simulate 480Hz or 960Hz in a 120Hz LCD?

post #1 of 21
Thread Starter 
[Crossposted from HardFORUM, I'm told I should post here too]

Goal: Eliminate motion blur on an LCD, and allow LCD to approach CRT quality for fast-motion.

Scanning backlights are used in some high-end HDTV's (google "Sony XR 960" or "Samsung CMR 960"). These high-end HDTV's simulate 960 Hz using various techniques, including scanning backlights (sometimes also called "black frame insertion"). The goal is to greatly reduce motion blur by pulsing (flickering) the backlight in a scanning pattern, much like a CRT scans its phosphor. These home theater HDTV's are expensive, and scanning backlights are not yet really taken advantage of in desktop computer monitors. Although there are diminishing returns beyond 120Hz, it is worth noting that 120Hz eliminates only 50% of motion blur versus 60Hz, whereas 480Hz eliminates 87.5% of motion blur versus 60Hz. Scanning backlights can simulate the motion-blur reduction of 480Hz without further added input lag, and without needing to increase the actual refresh rate beyond the panel's native refresh rate (e.g. 120Hz). Your graphics card would not need to work harder.

I have an idea for a home-made scanning backlight, using an Arduino project, some white LED strips, and a modified monitor (putting Arduino-driven white LED strips behind the LCD glass).

Most LCD's are vertically refreshed, from top to bottom.
The idea is to use a homemade scanning backlight, by putting the LCD glass in front of a custom backlight driven by an Arduino project:

Parts:
1. Horizontal white LED strip segments, put behind the LCD glass. The brighter, the better! 4 or 8 strips.
2. Arduino controller (to control LED strip segments).
3. 4 or 8 pins on the Arduino, each connected to a transistor that switches one LED strip segment.
4. 1 pin connected to the vertical sync signal (could be software, such as a DirectX program that relays the vertical sync state, or hardware that detects the vertical sync state on the DVI/HDMI cable). The vsync signal ideally needs to be precise; this might still be possible to do over USB if you can achieve sub-millisecond precision (or use timecoding on the signal to compensate for USB timing fluctuations). If done using software signalling over USB, you can eliminate this pin.

The Arduino controller would be programmed to flash the LED strip on/off, in a scanning sequence, top to bottom. If you're using a 4-segment scanning backlight, you've got 4 vertically stacked rectangles of backlight panel (LED strips), and you flash each segment for 1/4th of a refresh. So, for a 120Hz refresh, you'd flash one segment at a time for 1/480th of a second.
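As a rough illustration of this sequence, here is a minimal Arduino-style sketch (free-running, not yet synchronized to VSYNC; the pin numbers and constants are arbitrary placeholders, not a tested design):

```cpp
// Minimal free-running 4-segment scanning sequence (illustration only).
// Each segment's LED strip is switched by a transistor driven from one pin.
const int segmentPins[4] = {2, 3, 4, 5};              // hypothetical pin assignment
const int numSegments = 4;
const unsigned long refreshMicros = 1000000UL / 120;  // one 120Hz refresh ~= 8333 us
const unsigned long segmentMicros = refreshMicros / numSegments;  // ~2083 us = ~1/480 sec

void setup() {
  for (int i = 0; i < numSegments; i++) {
    pinMode(segmentPins[i], OUTPUT);
    digitalWrite(segmentPins[i], LOW);
  }
}

void loop() {
  // Flash each segment once per refresh, top to bottom.
  for (int i = 0; i < numSegments; i++) {
    digitalWrite(segmentPins[i], HIGH);
    delayMicroseconds(segmentMicros);
    digitalWrite(segmentPins[i], LOW);
  }
}
```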

The Arduino would need to be adjustable, to adapt to the monitor's specific refresh rate and input lag (a configuration sketch follows this list):
- Refresh rate automatically configured over the USB
- Configurable on/off setting, to stop the flicker when you aren't doing fast-motion stuff (FPS gaming, video camera playback, etc.)
- Upon detecting a signal on the vsync pin, the Arduino would begin the flashing sequence at the first segment. This permits synchronization of the scanning backlight to the actual output.
- An adjustment would be needed to compensate for input lag (either via a configurable delay or via configuring the flash sequence on a different segment than the first segment.)
- Configurable pulse length, to optimize image quality with the LCD.
- Configurable panel flash latency and speed (to sync to the LCD display's refresh speed within a refresh) -- this would require one-time manual calibration, via testing for elimination of tearing/artifacts. For example, a specific LCD display might only take 1/140th of a second to repaint a single 120Hz frame, so this adjustment allows compensation for this fact.
- Configurable number of segments to illuminate -- e.g. illuminate more segments at a time for a brighter image at a trade-off (e.g. simulating 240Hz with a double-bright image by lighting up two segments of the scanning backlight at once, rather than one segment for 480Hz)
- If calibrated properly, no extra input lag should be observable (at most, approximately 1-2ms extra, simply to wait for pixels to fully refresh before re-illuminating backlight).
- No modifications of the computer monitor electronics are necessary; you're just replacing the backlight with your own, and using the Arduino to control the backlight.
- Calibration should be easy; a tiny computer app would be created -- just a simple moving test pattern and two or three software sliders -- adjust until motion looks best.
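As a sketch of how these adjustments might be grouped on the Arduino side (all names and default values below are hypothetical, for illustration only):

```cpp
// Hypothetical settings block for the adjustments listed above; the
// calibration app would update these over the USB serial link.
struct BacklightConfig {
  bool enabled;                   // on/off switch for the scanning flicker
  float refreshHz;                // refresh rate reported over USB (e.g. 120.0)
  unsigned long inputLagMicros;   // delay after VSYNC before starting the scan
  unsigned long pulseMicros;      // flash length per segment (e.g. ~1042 us for "960Hz")
  unsigned long panelScanMicros;  // time the panel takes to paint one frame (e.g. ~1/140 sec)
  byte segmentsLit;               // segments lit simultaneously (brightness trade-off)
};

BacklightConfig cfg = {
  true,   // enabled
  120.0,  // refreshHz
  3000,   // inputLagMicros (placeholder; found by one-time calibration)
  1042,   // pulseMicros    (~1/960 sec)
  7143,   // panelScanMicros (~1/140 sec)
  1       // segmentsLit
};
```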

Total cost: ~$100-$150. Examples of parts:
- $35.00 (RadioShack) -- Arduino Uno Rev 3. You will need an Arduino with at least 4 or 8 output pins and 1 input pin. (e.g. most Arduino)
- $44.40 (DealExtreme) -- LED tape -- white 6500K daylight LED's, 50 watts worth (5-meter roll of 600x 3528 SMD LED's, 6500K).
- Plus other appropriate components as needed: power supply for LED's, wire, solder, transistors for connecting Arduino pins to the LED strips, resistors or current regulators or ultra-high-frequency PWM for limiting power to the LED's, etc.

LED tape is designed to be cut into segments (most LED tape can be cut in 2-inch increments). Google or eBay "White LED tape". A 5-meter roll of white LED tape has 600 LED's at a total of 50 watts, which is more than bright enough to illuminate a 24" panel in 4 segments, or it can be doubled-up. This LED tape is now pretty cheap off eBay, sometimes as low as under $20 for Chinese-made rolls, but I'd advise 6500K full-spectrum daylight white LED's with reasonably high CRI, or color quality will suffer. Newer LED tape designed for accent lighting applications would be quite suitable, though you want daylight white rather than warm white or cold white -- to match the color of a typical computer monitor backlight. For testing purposes, cheap LED tape will do. You need extra brightness to compensate for the dark time: a 4-segment backlight that's dark 75% of the time would ideally need to be 4 times brighter than the average preferred brightness setting of an always-on backlight. For even lighting, a diffuser (e.g. translucent plastic panel, wax paper, etc.) is probably needed between the LED's and the LCD glass.

This project would work best with 120Hz LCD panels on displays with fast pixel response, rather than 60Hz LCD panels, since there would not be annoying flicker at 120Hz (each segment of the scanning backlight would flicker at 120Hz instead of 60Hz), and since the pixel transitions need to be quick enough to be virtually complete before the backlight segment behind them is flashed.

Scanning backlights are a technology that already exists in high-end home theater LCD HDTV's (960Hz simulation in top-model Sony and Samsung HDTV's -- google "Sony XR 960" or "Samsung CMR 960"), and most of those combine motion interpolation and local dimming (turning off LED's behind dark areas of the screen) with the scanning backlight. We don't want input lag, so we skip the motion interpolation. Local dimming is complex to do cheaply. However, a scanning backlight is rather simple -- and achievable via this Arduino project idea. It would be a cheap way to simulate 480Hz (or even 960Hz) via flicker in a 120Hz display, by hacking open an existing computer monitor.

Anybody interested in attempting such a project?

[EDIT: I've now greenlighted proceeding with this project, and purchased the initial parts for experimentation]
[EDIT2: This is an old post from 2012, now archived for historical reasons -- Arduino Scanning Backlight on Blur Busters Forums. The project is now described at Electronics Hacking: Creating a Strobe Backlight]
Edited by mdrejhon - 12/18/13 at 7:22pm
post #2 of 21
Thread Starter 
Crossposting some of my replies from HardForum:
Quote:
Originally Posted by Mark Rejhon 
Fast motion on a CRT at 120Hz is still much sharper than on an LCD at 120Hz. Phosphor on a CRT decays in a millisecond, while LCD pixels at 120Hz are generally continuously displayed for the full 1/120th second. Not everyone cares, but some of us do -- that's why some of us love CRT (including those threads of users who love the Sony 24" widescreen CRT), and even though we enjoy 120Hz LCD, some of us miss the motion clarity of CRT.

Consider motion blur. An example of a fast panning scene is one moving across the screen at 1 inch every 1/60th of a second. Let's say your eye is tracking a sharp object during the screen pan. So, strictly by the numbers, for fast-panning motion moving at 1 inch every 1/60 second:
At 60Hz, the motion blur is 1" thick (entry level HDTV's, regular monitors) ...
At 120Hz (or CMR 120), the motion blur is 0.5" thick (120Hz computer monitors) ...
At 240Hz (or CMR 240), the motion blur is 0.25" thick ...
At 480Hz (or CMR 480), the motion blur is 0.125" thick ...
At 960Hz (or CMR 960), the motion blur is 0.0625" thick (CRT style, high end HDTV's) ...

The same effect is simulated at a lower refresh rate, when using flicker. A 120Hz display with a backlight briefly pulsed at 1/960th of a second, will have the same motion blur elimination as 960Hz. A 60Hz CRT with 1ms phosphor decay (essentially pulsed 1/1000sec), would be similar.
For a more detailed discussion about motion blur, see this thread. However, it appears this is something that's possible to do ourselves with homebrew parts, without a display manufacturer's involvement.
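As a quick check of the numbers above, here is a small illustrative helper (not part of the project) that computes eye-tracking motion blur width from panning speed and per-frame illumination time:

```cpp
#include <cstdio>

// Blur width seen by a tracking eye ~= panning speed * illumination persistence per frame.
double blurWidthInches(double inchesPerSecond, double persistenceSeconds) {
  return inchesPerSecond * persistenceSeconds;
}

int main() {
  const double panSpeed = 60.0;  // 1 inch per 1/60 sec = 60 inches/second
  printf("60Hz sample-and-hold:  %.4f inch\n", blurWidthInches(panSpeed, 1.0 / 60));   // 1.0
  printf("120Hz sample-and-hold: %.4f inch\n", blurWidthInches(panSpeed, 1.0 / 120));  // 0.5
  printf("1/960 sec strobe:      %.4f inch\n", blurWidthInches(panSpeed, 1.0 / 960));  // 0.0625
  return 0;
}
```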
Quote:
Older scanning backlight technologies had a long bright cycle with only a short dark cycle.
[Image: ama_diagram.jpeg (Original: TFT Central)]
However, scanning backlights have been achieving 75%:25% dark:bright cycles in some motion-enhancement technologies on current HDTV displays (Sony XR 960, Samsung CMR 960), though those sets also combine it with unwanted motion interpolation. It should be possible to do the same (or go even better, an 87.5%:12.5% dark:bright cycle) for a scanning backlight in a computer monitor, and skip the motion interpolation.

The response curve of a pixel is much flatter nowadays right before the next refresh: LCD display makers have been able to achieve this in recent years. High-speed camera snapshots of an LCD panel (1/1000th sec) now show a usable LCD image void of time-based artifacts (e.g. temporal dithering), although there are some gamma differences. LCD makers have had to do this to minimize ghosting as much as possible for 3D with active-shutter glasses.
Quote:
To clarify, I'm talking about only 1 flash per frame (brief 1/960th second), per segment of scanning backlight. That means a single 1/960th second flash per refresh behind already-refreshed LCD pixels.

A scanning backlight, under a high-speed camera, would look almost the same as a CRT, like this:

(YouTube of a high-speed camera on CRT scanning)

This is a 60Hz CRT, and you notice that the phosphor decay is about 2 milliseconds (2ms out of a 16ms refresh, or 1/8th of the screen height of phosphor still illuminated brightly). An eight-segment high-speed scanning backlight (2ms flash per segment), illuminating in sequence from top to bottom, would look similar to a CRT in this very same high-speed video. Even better would be 120Hz (instead of 60Hz) with a 1ms flash per segment (instead of 2ms). You'll need a very bright backlight to compensate for the long dark time (dark 7/8ths of the time, bright 1/8th of the time). You can make the scanning backlight adaptive to the current refresh, so that it would work at any refresh rate all the way from 60Hz through 120+Hz, though at the cost of more motion blur at lower refreshes (e.g. a 1/480sec flash for 60Hz, gradually reducing to a 1/960sec flash for 120Hz), to achieve the same image brightness.

The shorter the illumination (for a given area of LCD), the less motion blur, thanks to the persistence of vision. If the LCD pixel is dark 7/8ths of the time (thanks to backlight being turned off 7/8ths of the time, invisibly waiting for LCD pixels to refresh), you reduce 7/8ths of the motion blur -- 87.5% reduction in motion blur.

Again, you do a high-speed illumination only once per frame. You could flash the whole backlight all at once, but that doesn't take into account how the LCD refreshes. You really want to flash the backlight only behind an already-refreshed portion of the LCD, while waiting for a different part of the LCD to finish refreshing. That is why a scanning backlight is better than a whole-backlight strobe.

Edited by mdrejhon - 9/13/12 at 9:20pm
post #3 of 21
Thread Starter 
Appending yet more of my replies from the other forum, for further relevant detail:
Quote:
Motion blur is directly proportional to illumination period.
A backlight that's dark 50% of the time, will reduce motion blur by 50%.
A backlight that's dark 90% of the time, will reduce motion blur by 90%.

As long as LCD pixels are fast enough to mostly finish refreshing before the next frame, it can be done. Today's modern 3D LCD's, 120Hz displays, and active shutter glasses have recently made sure of this fact -- it's practically a prerequisite, or 3D wouldn't have been possible with today's LCD. This is wonderful news for someone who wants to do a scanning backlight. For a 120Hz refresh, 1/120th second equals 8.333 milliseconds. For a 2 millisecond LCD, you've got an 8 millisecond time "budget" to do your "deed", as an example:

A single 8ms refresh (1/120th second) can be enhanced with a scanning/strobed backlight:
2ms -- wait for the LCD pixel to finish refreshing (while in the dark)
5ms -- wait a little longer for most of the ghosting to disappear (while in the dark)
1ms -- flash the backlight quickly. (1/960th second or 1/1000th second)
(All these values could be adjustable in the Arduino scanning backlight project, to reduce input lag, etc)

Presto -- you've eliminated the motion blur through the backlight strobing. Perfect CRT-quality motion in an LCD!
It can also be done simply by strobing the whole backlight (during the VSYNC interval), but for maximum image quality (minimum ghosting/minimum crosstalk), you want to illuminate a practically fully-refreshed part of the LCD while waiting for a different part of the LCD to finish refreshing (pixels changing gradually from one color to the next, taking a few milliseconds to do so). It's also simpler on a power supply to illuminate only about 15 watts at a time out of a 100 watt LED panel (example only), so a scanning backlight simplifies power supply requirements, since the same-size section of a scanning backlight is illuminated at any given time (constant power). Therefore, a scanning backlight (CRT-style) is better, and you'll get exactly the same "feel" as a CRT, given sufficiently fast illumination (LED illumination time similar to the phosphor decay time of a CRT).

Since the backlight is turned off during the bad part of the LCD pixel response, you've eliminated/masked the pixel response speed from being the limiting factor in LCD motion blur. Thus, modern LCD panels (those good enough for alternate-frame 3D) are also fast enough for high-speed scanning backlights that are dark almost 90% of the time.
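For illustration, the per-refresh budget described above could be written down as a handful of constants (values assumed, not measured):

```cpp
// One 120Hz refresh split into dark settling time plus a brief strobe.
// All values are assumptions for illustration and would be tunable.
const unsigned long REFRESH_US      = 8333;  // 1/120 sec
const unsigned long PIXEL_SETTLE_US = 2000;  // wait for the pixel transition (in the dark)
const unsigned long GHOST_WAIT_US   = 5000;  // extra dark time for residual ghosting to fade
const unsigned long STROBE_US       = 1000;  // ~1/1000 sec backlight flash
// The three phases must fit inside one refresh (8000 us <= 8333 us here).
static_assert(PIXEL_SETTLE_US + GHOST_WAIT_US + STROBE_US <= REFRESH_US,
              "strobe schedule does not fit in one refresh");
```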
Quote:
Summary:
1. LCD's are now fast enough to finish refreshing before the next frame (requirement of 3D LCD's)
2. LED's are now bright and cheap enough (requirement of extra brightness needed in ultra-short flashes in scanning/strobed backlight)
3. Today's 120Hz LCD's, means that flicker of a scanning-backlight, will not bother most people. (3D LCD's brought us 120Hz LCD's)
4. Controllers for scanning backlights are now cheap (it can be done with an Arduino)
5. Scanning backlights make it possible for LCD blur to be *better* than the LCD's own response speed.

Some argue that it is impossible to overcome LCD blur, but they're completely wrong. You simply strobe the backlight at a different point within the refresh, keeping the backlight turned off while the LCD pixel is refreshing. The only prerequisite is that the LCD finishes refreshing before the next frame; this provides you with a window to keep the backlight turned off, and you simply strobe the backlight once, quickly, during the practically idle part of the LCD refresh. This is already achieved -- 3D LCD's made it critically necessary. The strobe can be faster than the LCD's own refresh (e.g. give the LCD 7ms to finish refreshing, then do the strobe during the final 1ms of the single 8ms cycle of a 1/120th second frame). So, you see, an LCD *can* technically be better than a CRT, if the backlight flashes only once per refresh and the flashes are shorter than the CRT's phosphor decay -- i.e. 1/1920th second flashes of a sufficiently bright backlight.

It hasn't been done in computer monitors yet, but it's already being done in $3000 high-end LCD HDTV's (although at only 75%:25% dark:bright cycles, for a 75% reduction in motion blur on framerate-interpolated 240Hz). Again, google "Sony XR 960" or "Samsung CMR 960" -- several of these HDTV's are the $3000-and-up models -- but we don't need the extra motion interpolation "garbage" or input-laggy enhancements (undesirable for gamers).

If we make the electronics lean, mean, and aimed specifically at motion blur elimination, all we need is a simple scanning backlight that is dark 90% of the time (which requires a sufficiently short and super-bright illumination, which is no longer too expensive to do with LED's). The technology is here today, it's already been done, and it's time to bring it to computer monitors (and speed up the illumination pulses to 1/960th second or even faster -- an 87.5%:12.5% dark:bright cycle in a high-speed scanning backlight that reduces motion blur by 87.5%). [Edit: This is for an 8-segment scanning backlight with 1 segment illuminated at a time.] A monitor manufacturer that adds a scanning backlight needs to go "FTW" -- For The Win -- and do a scanning backlight that reduces motion blur dramatically, e.g. by 87.5%. A dramatic jump is necessary for a dramatic reduction in motion blur, enough to at least equal (or beat) a CRT -- in order to catch rave reviews from CRT aficionados and feed continued sales of a high-end monitor with a high-speed scanning backlight.
Let's bring this technology to computer monitors (even if we have to do a homebrew scanning backlight add-on). There's no reason to keep this cool tech limited to high-end $3000 HDTV's, since it can be done as an Arduino project using only ~$100 in parts! Many people (like me) would pay over double that (over $200 extra) for an LCD computer monitor that has LESS motion blur than a CRT!
Edited by mdrejhon - 9/13/12 at 9:10pm
post #4 of 21
While I really, really like this idea, I have a few questions.

1 - When going from 0V to full voltage (whatever the spec is for the LEDs you use), how long does it take the LED to reach full brightness?
2 - When going from on to off, how long does it take the LED light to decay?
3 - Do you realize how many strips of LEDs you're going to have to make?
4 - When the LEDs are dark 90% of the time, it means that the overall image will end up looking 90% darker than when fully lit. Are there LEDs available on the market that are bright enough to make up for the time that they are off (and also small enough for this application)?
5 - Do these LEDs output the proper wavelengths of light to allow the monitor to properly display sRGB (or even Adobe98)?
post #5 of 21
Thread Starter 
Quote:
Originally Posted by Manyak View Post

While I really, really like this idea, I have a few questions.
1 - When going from 0V to full voltage (whatever the spec is for the LEDs you use), how long does it take the LED to reach full brightness?
Nanoseconds. LED's can change state virtually instantaneously. It's even used in LED projectors with single chip DLP's -- so LED's are already changing color in those, about a thousand times a second. That's the great thing: It won't be the limiting factor. LED's are very PWM-friendly, too.
Quote:
2 - When going from on to off, how long does it take the LED light to decay?
Nanoseconds for pure LED's (ones without phosphor). White LED's use a phosphor, similar to CCFL, which has a decay time; some decay in less than a millisecond. This could potentially become a limiting factor (high-speed camera footage is needed), so experimentation is required. It can also be eliminated by using R/G/B LED's instead of white LED's. In that case, the LED speed is not the limiting factor -- the power supply switching will be slower than the LED's. Power transistors can switch power in less than a microsecond. The rise/fall can be eliminated as a factor.
Quote:
3 - Do you realize how many strips of LEDs you're going to have to make?
It's simpler than you think now. In the last 2 years alone, there have been amazing price drops in LED strips. A common 5-meter strip of LED (50 watts, $11 on eBay, 600 LED's) would be used for simplicity. I'd only need to make less than 100 solder connections for 1,200 LED's, because they're already prewired into cuttable strips! They're often used for accent lighting. These are used for house/museum/bar/luxury accent lighting applications, so the good ones already have high color quality (Color Rendering Index -- CRI -- of better than 80), which is suitable for backlight use. If you cheap out, these LED's are available on eBay for $11 (search "white 600 LED strip"), excluding power supply, and they require your own wiring. Perfect for homebrew projects. For the sake of argument, high-quality 6500K high-CRI strips would cost about $30-$40 in single-unit quantities, more expensive than the eBay stuff.
Reference: Searches "white 600 LED strip", "white 600 LED tape", and refining searches with color temperatures ("6500K", "daylight white", etc).

A typical 24" monitor LED backlight consumes about 15 watts at brightness levels adjusted for comfort on a typical home computer at night. For an 8-segment scanning backlight that's dark 7/8ths of the time, we need roughly 8 times that brightness, so we need approximately 100 watts of LED's. Fortunately, you can get 100 watts of LED's with just two 5-meter (16-foot) strips costing less than $40 each, and cut them into 24-inch long strips (long enough to be put behind 27"-diagonal glass, giving me flexibility to switch LCD's in the future for testing). I'd get 8 strip segments out of 50 watts (1 reel), or 16 strip segments out of 100 watts (2 reels). The cuttable strips have solder connections every two inches (dotted lines mark where you're permitted to cut the LED ribbon with scissors). Each strip requires 3 solder connections, so I would need 48 solder connections if I'm using 50 watts of LED's, or 96 solder connections if I'm using 100 watts. The LED's are already connected in parallel, so most of my soldering is adding jumper wires between the cut strips, since for a 32-strip backlight and 8-segment scanning, I'd be using 4 strips per segment. Presto!
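The wattage estimate above, as a quick back-of-envelope calculation (the 15-watt and 8-segment figures are this post's estimates, not measurements):

```cpp
#include <cstdio>

int main() {
  const double steadyBacklightWatts = 15.0;  // typical 24" LED backlight at comfortable brightness
  const int segments    = 8;                 // 8-segment scanning backlight
  const int segmentsLit = 1;                 // only one segment lit at any instant
  // Dark 7/8ths of the time, so peak brightness must be ~8x higher
  // to keep the same average brightness on screen.
  const double peakWatts = steadyBacklightWatts * segments / segmentsLit;
  printf("Peak LED power needed: ~%.0f W\n", peakWatts);  // ~120 W, i.e. roughly two 50 W reels
  return 0;
}
```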

Common LED strips
- Comes in reels of 5-meter (16-foot), typical cost ~$10 to $50+ depending on where you purchase.
- LED strips can be cut in 2 inch increments.
- LED strips are flexible (ribbons) with surface-mounted #5050 LED's and surface-mounted current-limiting resistors.
- These cuttable strips have solder pads once every two inches
- These strips have a dotted line that indicates where you cut them. (Others require other inch increments, such as 4 inch increments).
- The most densely-packed strips have 600 LED's on them, a total of 50 watts of LED's, which is more LED power than the backlight in existing monitors.

So just a simple transistor is needed for power switching: the Arduino output drives the gate of the transistor, and the other two transistor pins go in series with one of the eight LED segments. I only need to cut a 16-foot strip into approximately 16 strips (2 strips per scanning backlight segment for an 8-segment backlight). Relatively few solder connections are required for a homemade scanning backlight, including the solder connections to the transistors (one transistor per scanning backlight segment); and if I'm just using a breadboard for prototyping, I only need solder connections on the LED strips themselves!

If I need to use RGB instead of white (so that I gain nanosecond-speed switching), RGB strips are also available, though diffusing them would be a little more challenging. RGB strips are often used for color-changing lighting too, using a power supply that PWM's to dim the R/G/B. I've purchased these strips for house accent lighting before, and man -- they are plenty bright and compact (some only a few millimeters wide). Dozens of strips can be laid side by side, so it's theoretically possible to cram 200-500 watts of LED's into the space of a 24" LCD panel. Heat dissipation becomes a concern, though, if you're lighting all of that in a small space at once! (But that's not a problem if we're lighting up only 15 watts worth at any one time, which LED backlights are already doing.)
Quote:
4 - When the LEDs are dark 90% of the time, it means that the overall image will end up looking 90% darker than when fully lit. Are there LEDs available on the market that are bright enough to make up for the time that they are off (and also small enough for this application)?
A 5-meter strip of LED is 50 watts worth of LED's. Many computer monitors use only 15 watts. LED's are already more than bright enough -- they are even used in some projectors. And you see those advertising Jumbotrons (billboards, New York Times Square, Las Vegas)? They have to run blindingly bright just to be visible in daylight. (Cities even force them to dim at night, by bylaw!) Brightness is not the limiting factor: just add more LED's, more densely packed.
Quote:
5 - Do these LEDs output the proper wavelengths of light to allow the monitor to properly display sRGB (or even Adobe98)?
Today's LED's allow you to greatly exceed the sRGB color space -- you see high-end LED computer monitors claiming to exceed 100% of it (especially the ones that also include R/G/B LED's). That said, if I'm homebrewing, I may just settle for something that's merely superior to a CCFL backlight (good LED strips are, thanks to their use in accent lighting -- casinos, museums, high-end bars, high-end establishments, custom home renovations, etc.), even if it's not Adobe98, just to save cost and for simplicity -- at least for a prototype Arduino project (once I have the time to start it, unless someone else does). Mind you, there are lots of crappy LED's, and the cheap $11 Chinese strips may fit the bill; but fortunately, prices are low enough that one can buy multiple strips and light up magazines, etc. (or even use a Spyder colorimeter) to find the best one to use.
Edited by mdrejhon - 9/14/12 at 7:44am
post #6 of 21
Alright, you've convinced me smile.gif

Count me in.
post #7 of 21
Thread Starter 
Endorsement from John Carmack on Twitter. Sounds like I am on the right track!!!

Mark Rejhon @mdrejhon
@ID_AA_Carmack I'm researching home-made Arduino scanning backlight (90%:10% dark:bright) using 100-200 watts of LED's. 120hz.net/showthread.php…

John Carmack @ID_AA_Carmack
@mdrejhon Good project. You definitely want to find the hardware vsync, don't try to communicate it from the host.
Edited by mdrejhon - 9/15/12 at 9:04pm
post #8 of 21
Thread Starter 
(Warning, geek stuff below. Requires programming and basic electronics knowledge)
(Crossposted to equivalent concurrent thread in hardforum, overclock)

After hearing back from John Carmack of id Software saying that this is a good project, I'm proceeding with some preliminary 'build' research, i.e. creating a small-scale breadboard trailblazer for this project. I've created electronics before, and I have programmed for more than 20 years, but this will be my first Arduino project. I've been researching Arduinos to determine the best way to program one for a scanning backlight experiment.

Goals For Scanning backlight:

- At least 8 segments.
- Reduce motion blur by 90%. (Ability to be dark 90% of the time)
- Tunable in software. (1/240, 1/480, 1/960, and provisionally, 1/1920)
- Manual input lag and timing adjustment.
___

1. Decide a method of VSYNC detection.

Many methods possible. Will likely choose one of:
....(software) Signalling VSYNC from computer, using DirectX API RasterStatus.InVBlank() and RasterStatus.ScanLine .... (prone to CPU and USB timing variances)
....(hardware) Splicing video cable and use a VSYNC-detection circuit (easier with VGA, harder with HDMI/DP, not practical with HDCP)
....(hardware) Listen to 3D shutter glasses signal. It's conveniently synchronized with VSYNC. (however, this may only work during 3D mode)
....(hardware) Last resort: Use oscilloscope to find a "VSYNC signal" in my monitor's circuit. (very monitor-specific)

Note: Signalling the VSYNC from the host is not recommended (John Carmack said so!), likely due to variances in timing (e.g. CPU, USB, etc). Variances would add jitter, but the software approach gives maximum flexibility for switching monitors in the future, and makes it monitor-independent. I could stamp microsecond timecodes on the signal to compensate (RasterStatus.ScanLine may play a role in 'compensating'). In this situation, an LCD monitor's natural 'input lag' plays in my favour: it gives me time to compensate for delays caused by timing fluctuation (wait shorter/longer until 'exactly' the known input lag). I can also do averaging algorithms over the last X refreshes (e.g. 5 refreshes) to keep things even more accurate. The problem is that Windows is not a real-time operating system, and there's no interrupt/event on the PC to catch InVBlank behavior. Another idea is semi-randomly reading "ScanLine" and semi-randomly transmitting it to the Arduino (with a USB-timing-fluctuation-compensation timecode), and letting the Arduino calculate the timings needed. This is far more complex software-wise, but far simpler and more flexible hardware-wise, especially if I want to be able to test multiple different LCD's with the same home-made scanning backlight.
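For the software route, a rough host-side sketch (Windows, native Direct3D 9, using IDirect3DDevice9::GetRasterStatus, the unmanaged equivalent of the RasterStatus properties mentioned above) might poll for the vertical-blank edge and send a marker byte to the Arduino over serial. Device and COM-port setup are omitted; this only illustrates the idea and is not the recommended hardware-sync route:

```cpp
#include <windows.h>
#include <d3d9.h>

// Poll the raster status on an already-created Direct3D 9 device and report
// each VBlank onset to the Arduino via an already-opened serial handle.
// (A real version would append a microsecond timecode to compensate for
// USB timing jitter, as discussed above.)
void reportVBlanks(IDirect3DDevice9* device, HANDLE serialPort) {
  bool wasInVBlank = false;
  for (;;) {
    D3DRASTER_STATUS rs;
    if (SUCCEEDED(device->GetRasterStatus(0, &rs))) {
      if (rs.InVBlank && !wasInVBlank) {
        const char marker = 'V';   // rising edge of vertical blanking
        DWORD written = 0;
        WriteFile(serialPort, &marker, 1, &written, nullptr);
      }
      wasInVBlank = (rs.InVBlank != FALSE);
    }
  }
}
```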
___

2. Verify the precision requirements that I need.

- What are the precision requirements for length of flashes (amount of time that backlight segment is turned on)
- What are the precision requirements for sequencing (lighting up the next segment in a scanning backlight)
- What are the precision requirements for VSYNC (beginning the scanning sequence)

Milliseconds, microseconds? Experimentation will be needed. People who are familiar with PWM dimming already know that microseconds matter a great deal here. Scanning backlights need to be run very precisely; sub-millisecond-level jitter _can_ be visually noticeable, because a 1.0 millisecond versus 1.1 millisecond variance means the light is 10% brighter! That 0.1 millisecond makes a mammoth difference. We don't want annoying random flicker in a backlight! It's the same principle as PWM dimming -- if the pulses are even just 10% longer, the light is 10% brighter -- even if the pulses in PWM dimming are tiny (1ms versus 1.1ms pulses). Even though we're talking about timescales normally not noticeable to the human eye, precision plays an important role here, because the many repeated pulses over a second _add_ up to a very noticeably brighter or darker picture. (120 flashes of 1.0 millisecond equals 120 milliseconds, but 120 flashes of 1.1 milliseconds equals 132 milliseconds.) So we must be precise here; pulses must not vary from refresh to refresh. However, we're not too concerned with the starting brightness of the backlight -- if the backlight is 10% too dim or too bright, we can deal with it -- it's the consistency between flashes that is more important. The length of the flash is directly related to the reduction in motion blur: the shorter the flash, the less motion blur, and since we're aiming for a 1/960th second flash (with a hoped-for 1/1920th second capability), that's approximately 1 millisecond.
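Spelled out as a tiny illustration of the arithmetic above:

```cpp
#include <cstdio>

int main() {
  const int flashesPerSecond   = 120;
  const double nominalPulseMs  = 1.0;
  const double jitteredPulseMs = 1.1;  // just 0.1 ms of extra pulse width
  const double nominalLightMs  = flashesPerSecond * nominalPulseMs;   // 120 ms of light per second
  const double jitteredLightMs = flashesPerSecond * jitteredPulseMs;  // 132 ms of light per second
  printf("Brightness error: %.0f%%\n", 100.0 * (jitteredLightMs / nominalLightMs - 1.0));  // 10%
  return 0;
}
```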

As long as the average brightness remains the same over approximately a flicker fusion threshold (e.g. ~1/60sec), variances in the flicker timing (VSYNC, sequencing) aren't going to be as important as the precision of the flashes, as long as the flashes get done within the flicker fusion threshold. There may be other human vision sensitivities and behaviors I have not taken into account, so experimentation is needed.

Estimated precision requirements:
Precision for length of flashes: +/- 0.5 millisecond
Precision for consistency of length of flashes: +/- one microsecond
Precision for sequencing: +/- somewhere less than 1/2 the time of a refresh (e.g. (1/120)/2 = 4 milliseconds)
Precision for VSYNC timing: +/- somewhere less than 1/2 the time of a refresh (e.g. (1/120)/2 = 4 milliseconds)

The goal is to better these requirements by an order of magnitude, as a safety margin for more sensitive humans and for errors. That means the length of flashes would be consistent to within 0.1 microseconds.
This appears doable with an Arduino. Arduinos are already very precise and very synchronously predictable; Arduino projects include TV signal generators -- THAT requires sub-microsecond precision for good-looking vertical lines in a horizontally-scanned signal.
Example: http://www.javiervalcarce.eu/wiki/TV_Video_Signal_Generator_with_Arduino
___

3. Arduino synchronization to VSYNC

...(preferred) Arduino Interrupt method. attachInterrupt() on an input pin connected to VSYNC. However, at 120Hz the VSYNC pulse is less than a millisecond long, so I'll need to verify whether I can detect such short pulses via attachInterrupt() on the Arduino. Worst comes to worst, I can add a simple toggle circuit inline with the VSYNC signal, so that the signal changes state only 120 times a second (e.g. on for even refreshes, off for odd refreshes), which is a frequency low enough to be detectable by the Arduino. attachInterrupt() can interrupt any in-progress delays, which is convenient, as long as I don't noticeably lengthen the delay beyond my precision requirements.
...(alternate) Arduino Poll method. This may complicate precise input lag compensation since I essentially need to do 2 things at the same time precisely (one for precise VSYNC polling and input lag compensation, the other for precise scanning backlight timing). I could use two Arduinos running concurrently, side by side -- or run an Arduino along with helper chips such as an ATtiny chip -- to keep my precision requirements for my 2 precise tasks.

I anticipate being able to use the Interrupt method; but will keep the poll method as a backup plan.
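A rough sketch of the preferred interrupt approach (pin choice and variable names are assumptions; on an Arduino Uno, external interrupt 0 is on pin 2). The interrupt routine is kept extremely short -- it just records a timestamp and sets a flag for the main loop:

```cpp
// VSYNC capture via external interrupt (illustration only).
const int VSYNC_PIN = 2;                 // assumed wiring: VSYNC signal into pin 2
volatile unsigned long vsyncMicros = 0;  // timestamp of the most recent VSYNC edge
volatile bool vsyncSeen = false;

void onVsync() {
  // Keep the ISR tiny so it barely disturbs the backlight timing.
  vsyncMicros = micros();
  vsyncSeen = true;
}

void setup() {
  pinMode(VSYNC_PIN, INPUT);
  attachInterrupt(0, onVsync, RISING);   // interrupt 0 corresponds to pin 2 on an Uno
}

void loop() {
  if (vsyncSeen) {
    vsyncSeen = false;
    // ...schedule the next scanning sequence from here (see input lag compensation below)...
  }
}
```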
___

4. Dimming ability for scanning backlight

...(preferred) Voltage method. A voltage-adjustable power supply to the backlight segments. (Note: A tight voltage range can dim LED's from 0% through 100%)
...(alternate) PWM method. Dimming only during the time a backlight segment is considered 'on'. e.g. a 1/960th second flash would use microsecond delays to PWM-flicker the light over the 1/960th second flash, for a dimmed flash. A tight PWM loop on an Arduino is capable of microsecond PWM (it can do it -- Arduino software is already used as a direct video signal generator).

The dimming of the backlight shouldn't interfere with its scanning operation. Thus, the simplest non-interfering method is to use a voltage-controlled power supply that can dim the LED's simply using voltage. Adding PWM to a scanning backlight is far more complicated (especially if I write it as an Arduino program), since I can only PWM during the intended flash cycle, or I lose the motion-blur-eliminating ability.
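If the PWM route were ever needed, a rough sketch of dimming only inside the flash window (values assumed; this chops the single flash into short on/off slices so the one-flash-per-refresh behavior is preserved):

```cpp
// Dim one backlight flash by PWM-ing only within the flash window itself.
void dimmedFlash(int pin, unsigned long flashMicros, int dutyPercent) {
  const unsigned long sliceMicros = 50;  // 20 kHz slices, far above flicker fusion
  const unsigned long onMicros  = sliceMicros * dutyPercent / 100;
  const unsigned long offMicros = sliceMicros - onMicros;
  for (unsigned long elapsed = 0; elapsed < flashMicros; elapsed += sliceMicros) {
    if (onMicros  > 0) { digitalWrite(pin, HIGH); delayMicroseconds(onMicros);  }
    if (offMicros > 0) { digitalWrite(pin, LOW);  delayMicroseconds(offMicros); }
  }
  digitalWrite(pin, LOW);  // make sure the segment ends up off
}
```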
___

5. Adjustable Input lag compensation

...(preferred) Use the Arduino micros() function to start a scanning sequence exactly X microseconds after the VSYNC signal.

Hopefully this can be done on the same Arduino, as I have to keep completing the previous scanning backlight refresh sequence (1/120th second) while receiving a VSYNC signal. Worst comes to worst, I can use two separate Arduinos, or an Arduino running alongside an ATtiny (one for precisely listening to VSYNC and doing input lag compensation, the other to do precise backlight sequencing). If I use attachInterrupt() for the VSYNC interrupt on the Arduino, I can capture the current micros() value and save it to a variable, wait for the current scanning-backlight sequence to finish, and then start watching micros() to time the next scanning backlight refresh sequence.
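Roughly, combining the interrupt timestamp with micros() for lag compensation could look like this (a sketch only; vsyncSeen/vsyncMicros come from the interrupt example earlier, runScanSequence() would flash the segments in order, and the 3000 us figure is a placeholder found by calibration):

```cpp
// Wait a calibrated delay after the captured VSYNC timestamp, then run one scan.
extern volatile bool vsyncSeen;              // set by the VSYNC interrupt routine
extern volatile unsigned long vsyncMicros;   // timestamped in the VSYNC interrupt routine
void runScanSequence();                      // flashes the 8 segments in order

const unsigned long inputLagMicros = 3000;   // placeholder; found by one-time calibration

void loop() {
  if (vsyncSeen) {
    vsyncSeen = false;
    // Spin until the calibrated point within this refresh is reached.
    // The signed subtraction keeps this correct across micros() rollover.
    while ((long)(micros() - (vsyncMicros + inputLagMicros)) < 0) {
      // busy-wait
    }
    runScanSequence();
  }
}
```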

___

6. Precise sequencing of backlight segments.

...(preferred) Tiny delays are done on Arduino with delayMicroseconds(). Perfect for sequencing the scanning light segments. Turn one backlight segment on, delay, turn off, repeat for next backlight segment.
...(alternate) Use the PWM outputs (six of them) of an Arduino, or use a companion component to do the pulsing/sequencing for me. These PWM outputs can be configured to pulse in sequence. However, these outputs won't give me the precision needed for a highly-adjustable scanning backlight capable of simulating "1920Hz"

The tiny delays on the Arduino are currently my plan. I also need to do input lag compensation, so I have to start sequencing the backlight at the correct time delay after a VSYNC. I am also aware that interrupt routines (attachInterrupt()) will lengthen the delays, but I plan to keep my interrupt very short (less than 0.5 microsecond execution time, see precision requirements at top) to make this a non-issue.

Even though my goal is "960Hz" equivalence, I want to be able to play with "1920Hz" equivalence just for experimentation and overkill's sake, and to simply, literally "pwn" the "My LCD is better than CRT" prize, even though it will probably require a 200-watt backlight to do so without a dim picture.
___

Likely Steps

-- The next step is to download an electronics schematic creator program and create the schematic diagram. Virtual Breadboard (http://www.virtualbreadboard.com/) has an electronics circuit simulator including an Arduino emulator. It would work well for testing: it can run in slow-motion mode for visual verification of behavior, and although it won't be timing-precise, it would at least allow me to visually test the code in slow motion even before I buy the parts.
-- After that, the subsequent step is to breadboard a desktop prototype with 8 simple LED's -- more like a blinky toy -- that can run at low speed (human visible speeds) and/or high speed (scanning backlight).
-- Finally, choose the first computer monitor to hack apart. Decide if I want to try taking apart my old Samsung 245BW (72Hz limit) or buy a good high-speed panel (3D 120Hz panel). My Samsung is very easy to take apart, and it is disposable (I want to replace it with a Catleap/Overlord 1440p 120Hz or similar within two or three months), so it is a safe 'first platform' to test on. Even though its older technology means its response speed will cause more ghost after-images than today's 3D 120Hz panels, it will at least allow a large amount of testing before I risk a higher-end LCD.
-- Create a high-power backlight (200 watts). This will be the fun part of the project: buying 20 meters of 6500K LED tape and cramming all 2,400 LED's into a 2-foot-wide 16:9 rectangle (suitable for 24"-27" panels). This might be massive overkill, but I want to eventually nail the "1920Hz"-equivalence "My LCD is better than CRT" prize. Only 10-20 watts of LED's would be lit up at a time, anyway. Add an appropriate power supply, switching transistors for each segment (25+ watt capable), etc. Attach it to the Arduino outputs, put the LCD glass in front, and tweak away.
___

Although I do not expect many people here are familiar with Arduino programming, I'd love comments from anybody familiar with an Arduino, to tell me if there's any technical Arduino gotchas I should be aware of.
post #9 of 21
Thread Starter 
Update. I've designed a draft schematic. There may be errors, and there's no protection (e.g. overcurrent, overvoltage, etc.), but it shows how relatively simple an Arduino scanning backlight really is. Most of the complexity is in the timing and synchronization -- still relatively simple Arduino programming.

[Schematic image: ArduinoScanningBacklight_schem960.png]

Full size version: LINK
Edited by mdrejhon - 9/15/12 at 7:59pm
post #10 of 21
This is pretty awesome man, it's a great read!