
[Blur Busters] World's First Input Lag Test of G-SYNC - Page 6

post #51 of 218
I only got as far as the method, but the method is flawed: it tests the entire lag, from the press of a button to the update on the screen. Also -- it's a primitive method, and just from that point I don't like it smile.gif
post #52 of 218
Quote:
Originally Posted by bencher View Post

Yes that 1ms counts.

Another useless tech IMO person who doesn't understand the tech

Fix'd thumb.gif
post #53 of 218
Quote:
Originally Posted by KyadCK View Post

Quote:
Originally Posted by DuckieHo View Post

Quote:
Originally Posted by Mand12 View Post

The idea has been proven worthy, yes, and nVidia deserves credit for developing it, manufacturing the hardware, and bringing it to market well before anybody else. FreeSync is a complete hoax of a marketing ploy at this point, designed to undercut G-Sync's rollout during CES. It still requires a new hardware development effort, which means it's not something everyone will be getting for free with DP 1.3. It's a complete falsehood.

Just to add... NVIDIA did the same with CUDA. NVIDIA evangelized and built the toolsets for GPGPU. They saw the need and pushed the market... so they reaped the first-mover rewards.

That's not saying proprietary implementations are good. It's saying that niche markets sometimes require a proprietary implementation to demonstrate the need or to adapt quickly to the market.


http://hgpu.org/?page_id=3529

Exactly. nVidia's proprietariness may be annoying, but damn if it doesn't get results when they really want it.

I must be rare, because I don't care about G-Sync or FreeSync right now. I have UltraSharps, and I'm not trading them in for anything. I care that it eventually becomes a standard feature, improving the industry overall. Which is how most things are done anyway; very little in the tech world happens overnight.

I am with you. I have a Korean IPS, and the color quality and viewing angles make up for the little bit of blur and input delay I have.

With DP 1.3 around the corner, G-Sync is nothing but an early-adopter tax, but this time it's a pretty huge one.

The biggest problem with G-Sync is that I could see Nvidia not implementing the DisplayPort standards and forcing people to use G-Sync over the less proprietary solution. I kind of expect Nvidia to do this; they did something similar by deprecating the Ageia PhysX add-in cards and forcing PhysX to work only on Nvidia GPUs.

It really would be a shame if Nvidia dropped the features of DP 1.3 to force people into G-Sync, because it would divide everything up: instead of everyone getting this technology, monitor vendors would have to decide whether they want to support G-Sync or all of DP 1.3.

I haven't heard any proof that Nvidia would drop the DP 1.3 features that would enable sync without G-Sync, but given Nvidia's history it seems very likely. They have always been rather fond of making things proprietary and locking out other solutions. Look how well the open-source Bullet Physics runs via OpenCL on Nvidia and compare that to PhysX. Same sort of deal.

Given the high price of G-Sync monitors and the module, I have a feeling that 2014 is going to be a year where you choose between 1440p monitors with G-Sync or 4K monitors with DP 1.3 at the same price. I might sound a little tin-foil hat here, but by allowing G-Sync only on specific monitors, Nvidia can cover for its entire line-up's issues at higher resolutions: it can essentially block G-Sync from higher-resolution monitors to keep them from looking bad, then put reviewers in a position where they have to choose between G-Sync and higher resolution, with no clear winner. It would be one of those "Well, the 4K looks very nice, but there's input lag, so it might just be worth it to go G-Sync on a 1440p monitor!" situations.
post #54 of 218
Quote:
Originally Posted by sdlvx View Post

Given the high price of G-Sync monitors and the module, I have a feeling that 2014 is going to be a year where you choose between 1440p monitors with G-Sync or 4K monitors with DP 1.3 at the same price. I might sound a little tin-foil hat here, but by allowing G-Sync only on specific monitors, Nvidia can cover for its entire line-up's issues at higher resolutions: it can essentially block G-Sync from higher-resolution monitors to keep them from looking bad, then put reviewers in a position where they have to choose between G-Sync and higher resolution, with no clear winner. It would be one of those "Well, the 4K looks very nice, but there's input lag, so it might just be worth it to go G-Sync on a 1440p monitor!" situations.

Why in the world would they block G-Sync on 4K monitors? At CES they actually showed the Asus 32-inch 4K PQ321Q with a retrofitted G-Sync module, just to prove that the technology works at 4K just as well as at 1080p. It's the manufacturers who don't want to push 4K and G-Sync at the same time. I'm waiting to pick up the first 4K G-Sync monitor that comes out, nothing short of that, and I expect them to arrive in Q3 or Q4.
post #55 of 218
Quote:
Originally Posted by sdlvx View Post

I am with you. I have a Korean IPS, and the color quality and viewing angles make up for the little bit of blur and input delay I have.

With DP 1.3 around the corner, G-Sync is nothing but an early-adopter tax, but this time it's a pretty huge one.

The biggest problem with G-Sync is that I could see Nvidia not implementing the DisplayPort standards and forcing people to use G-Sync over the less proprietary solution. I kind of expect Nvidia to do this; they did something similar by deprecating the Ageia PhysX add-in cards and forcing PhysX to work only on Nvidia GPUs.

It really would be a shame if Nvidia dropped the features of DP 1.3 to force people into G-Sync, because it would divide everything up: instead of everyone getting this technology, monitor vendors would have to decide whether they want to support G-Sync or all of DP 1.3.

I haven't heard any proof that Nvidia would drop the DP 1.3 features that would enable sync without G-Sync, but given Nvidia's history it seems very likely. They have always been rather fond of making things proprietary and locking out other solutions. Look how well the open-source Bullet Physics runs via OpenCL on Nvidia and compare that to PhysX. Same sort of deal.

Given the high price of G-Sync monitors and the module, I have a feeling that 2014 is going to be a year where you choose between 1440p monitors with G-Sync or 4K monitors with DP 1.3 at the same price. I might sound a little tin-foil hat here, but by allowing G-Sync only on specific monitors, Nvidia can cover for its entire line-up's issues at higher resolutions: it can essentially block G-Sync from higher-resolution monitors to keep them from looking bad, then put reviewers in a position where they have to choose between G-Sync and higher resolution, with no clear winner. It would be one of those "Well, the 4K looks very nice, but there's input lag, so it might just be worth it to go G-Sync on a 1440p monitor!" situations.

another crystal ball having wizard. pls, tell me, how saggy are my moobs in 2025.
post #56 of 218
Quote:
Originally Posted by liskawc View Post

I only got as far as the method, but the method is flawed: it tests the entire lag, from the press of a button to the update on the screen.
That's correct, but the method is not flawed.

I actually wrote about this -- further down in the article, I explain why this test method is used and why it's more honest than other lag tests. It measures the lag of the entire human chain, so it's easy to mathematically calculate the differential between VSYNC OFF latency and GSYNC latency via the honest, whole-chain method.

It's simple, it's deterministic, it's honest, and it uses real-world games:
1. Run a full-chain test with VSYNC OFF.
2. Run a full-chain test with GSYNC.
3. Compute the difference.
Simple!
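A minimal sketch of that differential computation (this is not Blur Busters' actual tooling, and the per-trial numbers are hypothetical):

Code:
from statistics import mean

def gsync_latency_delta_ms(vsync_off_ms, gsync_ms):
    # Both passes measure the identical chain (mouse -> game -> driver ->
    # display -> camera), so everything except the sync mode cancels out
    # when the averaged button-to-pixel times are subtracted.
    return mean(gsync_ms) - mean(vsync_off_ms)

vsync_off = [72, 69, 75, 71, 74]  # hypothetical button-to-pixel trials, in ms
gsync = [77, 73, 76, 74, 78]      # hypothetical button-to-pixel trials, in ms
print(f"GSYNC adds ~{gsync_latency_delta_ms(vsync_off, gsync):.1f} ms on average")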

It was necessary to do the whole chain:
- to include whatever the display is doing (hardware-based GSYNC latency)
- to include whatever the driver is doing (software-based GSYNC latencies)
- to include however the game reacts to GSYNC (software-based GSYNC latencies)
This makes the full-chain method more honest and less flawed.



I do not trust most sites' hard numbers on input lag without additional data on what part of the chain was measured, and with what. A lot of sites have not always been specific about which part of the input lag chain they measure.

- Most input lag tests use test equipment rather than real-world games.
- Lag relative to scanout? (Remember, the top edge of the screen has less lag than the bottom edge.)
- Average lag?
- Whole button-to-pixels lag?
- Display-specific lag?
- CRT measurement method?
- SMTT measurement method?
- Leo Bodnar measurement method? (A veritable black box: Does it include HDMI transceiver delay too? Does it account for the ~0.5ms-1.0ms latency of the Vertical Back Porch in the signal timings? Does it measure to the 50% midpoint of the pixel transition, or to "first light" of a pixel?)
- Are you measuring lag from signal input to pixel illumination?
- Including or excluding cable lag?
- Lag from signal input to pixel illumination?
- Lag to first faint visibility of the LCD pixel (early in the pixel transition cycle)? Pixels don't react instantly, y'know.
- Lag to final full visibility of the LCD pixel (late in the pixel transition cycle)? (A small sketch after this list illustrates the first-light-versus-midpoint difference.)
- And even CRTs have input lag if you're measuring bottom-edge input lag with a Leo Bodnar, because of the scanout delay (for framebuffered architectures).
- Sometimes it's not easily possible to measure certain parts of the chain.
- I also have a photodiode oscilloscope (same as prad and TFTCentral); however, it's not suitable for real-world game testing.
- Most of the above lag tests do not measure real-world gaming.

They actually end up measuring different parts of the input lag chain, and differential measurements are often out of sync with non-differential measurements.
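To illustrate the "first light" versus "50% midpoint" question above, here is a minimal sketch against a hypothetical photodiode trace (the samples, sample rate, and levels are made up for illustration; this is not any site's actual measurement code):

Code:
def crossing_time_ms(trace, sample_rate_hz, black_level, white_level, fraction):
    # Time at which the trace first crosses the given fraction of the
    # black-to-white transition (0.1 is roughly "first light", 0.5 is the midpoint).
    threshold = black_level + fraction * (white_level - black_level)
    for i, value in enumerate(trace):
        if value >= threshold:
            return i * 1000.0 / sample_rate_hz
    return None  # transition never completed within the capture

trace = [0, 0, 1, 3, 8, 20, 45, 70, 88, 96, 100, 100]  # hypothetical samples
print(crossing_time_ms(trace, 1000, 0, 100, 0.1))  # "first light" -> 5.0 ms
print(crossing_time_ms(trace, 1000, 0, 100, 0.5))  # 50% midpoint  -> 7.0 ms

The two thresholds report different lag numbers for the same pixel transition, which is exactly why the threshold choice needs to be disclosed.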

Also -- there is nothing primitive about high-speed cameras with 1ms accuracy.
Sometimes simple, primitive & deterministic is the most honest and scientific:
- Deterministic button press (human input from hands), with <1ms error margin
- Deterministic screen reaction (output to human eyes), with <1ms error margin
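For the camera arithmetic itself, a minimal sketch (the 1000 fps figure and the frame numbers are assumptions for illustration, not a statement of the exact equipment used):

Code:
def frames_to_latency_ms(press_frame, reaction_frame, camera_fps=1000):
    # Each frame of high-speed footage spans 1000/camera_fps milliseconds, so
    # counting frames between the button press and the first on-screen reaction
    # gives button-to-pixel lag with roughly a one-frame error margin.
    return (reaction_frame - press_frame) * 1000.0 / camera_fps

print(frames_to_latency_ms(10412, 10487))  # hypothetical frames -> 75.0 ms (+/- ~1 ms)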

Therefore, the method is not flawed, at least when compared to a lot of other lag measurement methodologies.
Or, if you prefer: all lag test methods ever done in human history are flawed to varying degrees -- even mine -- but mine is less flawed than the others for this specific purpose: real-world, real-game input lag tests. It's as simple as (1) run a full-chain test with VSYNC OFF, (2) run a full-chain test with GSYNC, and (3) compute the difference. It doesn't invalidate other lag test methods, and they are all useful for different reasons, but this is one of the most useful lag tests because it's real-world. Future tests will include LightBoost, ULMB, VSYNC ON, and others.

I do personally believe that this is one of the best and most honest input lag tests (real-world human, real-world game, "where it matters") that any review site on the Internet can do to reliably test the input lag of any brand-new, exciting "miracle voodoo technology" (such as GSYNC), because it includes whatever the display is doing (hardware-based input latency), whatever the driver is doing (software-based GSYNC latencies), and however the game reacts to GSYNC (software-based GSYNC latencies). Thus, the test doesn't miss anything in the chain that GSYNC might be influencing one way or another. Therefore, the whole-chain method is one of the best, most honest ways of measuring input lag from a human perspective.
Edited by mdrejhon - 1/16/14 at 1:10pm
post #57 of 218
Quote:
Originally Posted by DuckieHo View Post

Look at the BF4 or Crysis 3 graphs?

Um, just like I was saying, there is only (G-Sync vs. V-Sync OFF) on those graphs.

But I'm left guessing what the difference would be between (G-Sync vs. V-Sync ON) or (V-Sync ON vs. V-Sync OFF).

I know there is a huge difference between (V-Sync ON vs. V-Sync OFF) because I can feel it; I just always wondered what the measurable difference was.

Edit: I just realized the comments say there will be more testing later covering what I was pointing out.

Jabbadab says:
Quote:
More interesting would be to see how lag differs with normal VSYNC ON vs. G-Sync, and maybe even forced TB + VSYNC ON from Nvidia Inspector, and how lag differs between G-Sync and adaptive VSYNC.
Interesting test nonetheless, thanks for that!

Chief Blur Buster says:
Quote:
Agreed. Eventually, we’ll do these tests in additional articles.

These tests were time-consuming, so I ran out of time to do VSYNC ON passes -- but we already know VSYNC ON almost never has less input lag than VSYNC OFF. However, we definitely want to do more input lag tests, in additional situations where we are interested in seeing lag results (LightBoost, ULMB, new fps_max values, VSYNC ON, other games, etc.).

Edited by UNOE - 1/16/14 at 12:48pm
 
post #58 of 218
Quote:
Originally Posted by BigMack70 View Post

Fix'd thumb.gif
Quote:
With VSYNC OFF averages of 72ms and 74ms, this is very similar to the G-SYNC averages of 77ms and 74ms respectively. The differences between the averages appear to fall well below the noise floor of Battlefield 4's high variability, so Blur Busters considers the differences in averages statistically insignificant. During gameplay, we were unable to feel an input lag difference between VSYNC OFF and G-SYNC.
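As an aside, a rough sketch of the kind of check behind "statistically insignificant": compare the difference of the averages against the run-to-run noise. The per-trial lists below are hypothetical stand-ins, not the article's raw data.

Code:
from math import sqrt
from statistics import mean, stdev

def difference_vs_noise(a_ms, b_ms):
    # Mean difference between two lag runs, plus the combined standard error of
    # that difference; a difference well below the error is "in the noise".
    diff = mean(b_ms) - mean(a_ms)
    noise = sqrt(stdev(a_ms) ** 2 / len(a_ms) + stdev(b_ms) ** 2 / len(b_ms))
    return diff, noise

vsync_off = [72, 58, 90, 66, 84]  # hypothetical high-variance BF4-style trials, ms
gsync = [77, 61, 93, 70, 84]      # hypothetical high-variance BF4-style trials, ms
diff, noise = difference_vs_noise(vsync_off, gsync)
print(f"difference {diff:.1f} ms vs noise ~{noise:.1f} ms")  # difference below noise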


I am pretty sure I understood it.
post #59 of 218
Quote:
Originally Posted by bencher View Post

I am pretty sure I understood it.
In that case, I apologize -- my earlier reply was not necessary thumb.gif

Next time, quote context (like you just did).

I get lots of naysayers telling me that 1ms-2ms of persistence (LightBoost/ULMB) doesn't matter, so context becomes important here.
Edited by mdrejhon - 1/16/14 at 12:56pm
post #60 of 218
Quote:
Originally Posted by jimlaheysadrunk View Post


another crystal ball having wizard. pls, tell me, how saggy are my moobs in 2025.

 

2025 here. They are quite saggy. Might need a push-up brassiere.
OT though, I'm curious which HDD/SSD was used in these tests, because game code is fallible. Access times to assets can vary by milliseconds, all of which can contribute to skewing the actual results.


Edited by Kinaesthetic - 1/16/14 at 1:01pm