
[ASUS] RoG Swift PG278Q Discussion Thread - Page 255  

post #2541 of 8206
Quote:
Originally Posted by PostalTwinkie View Post

Do you honestly believe the production model is going to be WORSE than the model they allowed to be seen in public?

The answer to that would be "No."

The panel observed in the wild is the panel that is getting used; the only unknown spec was which coating they were going to use, AG vs. non-AG. The rest of the specs have been known, so to imply that there is going to be some drastic change that somehow makes this panel worse is just insane.

Overall, no, but there could be a regression in image quality with G-Sync or something like that; haven't they been working on the G-Sync implementation to get it stable and so on? Maybe additional display latency is needed to make it work here, for example. Of course the underlying factors are quite different, so it's not a great analogy, but recall how LightBoost degraded colors, brightness, contrast, and so on, especially in earlier models.

If this weren't Asus, I'd be worried that none of the pixel overdrive settings are any good, as I would with pretty much every monitor, unless there is proof to the contrary.

The VG248QE's default gamma curve is so far off that I wonder whether it was intentional, to make certain things easier to see in games. If so, other gaming-oriented models could have similar characteristics. Maybe they're not paying much attention to panel uniformity on shipping units, and the models people saw were relatively good.

Overall, my expectation (the most likely scenario I'd predict) is that it's a good monitor for gaming, bringing one of the best experiences available (edit: which could well be generous... I mean, usually things are close to the mean, duh). But we'll see. No need to be so certain about anything yet.
Edited by mikeaj - 4/15/14 at 11:35am
post #2542 of 8206
Quote:
Originally Posted by mikeaj View Post

Overall, no, but there could be a regression in image quality with G-Sync or something like that; haven't they been working on the G-Sync implementation to get it stable and so on? Maybe additional display latency is needed to make it work here, for example.

Existing test results of G-Sync demonstrate otherwise. Unless you can come up with a reason why it would suddenly all go to crap, and why Asus would still move forward even after that, we have no reason not to believe them.
post #2543 of 8206
Quote:
Originally Posted by Mand12 View Post

Existing test results of G-Sync demonstrate otherwise. Unless you can come up with a reason why it would suddenly all go to crap, and why Asus would still move forward even after that, we have no reason not to believe them.

Except that the existing DIY kits are all defective, according to Nvidia's latest action in delaying all monitors with integrated G-Sync.
post #2544 of 8206
Quote:
Originally Posted by mikeaj View Post

Overall, no, but there could be a regression in image quality with G-Sync or something like that; haven't they been working on the G-Sync implementation to get it stable and so on? Maybe additional display latency is needed to make it work here, for example. Of course the underlying factors are quite different, so it's not a great analogy, but recall how LightBoost degraded colors, brightness, contrast, and so on, especially in earlier models.

If this weren't Asus, I'd be worried that none of the pixel overdrive settings are any good, as I would with pretty much every monitor, unless there is proof to the contrary.

The VG248QE's default gamma curve is so far off that I wonder whether it was intentional, to make certain things easier to see in games. If so, other gaming-oriented models could have similar characteristics. Maybe they're not paying much attention to panel uniformity on shipping units, and the models people saw were relatively good.

Overall, my expectation (the most likely scenario I'd predict) is that it's a good monitor for gaming, bringing one of the best experiences available (edit: which could well be generous... I mean, usually things are close to the mean, duh). But we'll see. No need to be so certain about anything yet.

The G-Sync "issue" is getting the timing set: each panel model has to have the G-Sync module calibrated to it. It wasn't as easy as just building a G-Sync module and dropping it in; they ended up having to take additional steps in the production process to match the modules to the panels.

As of right now there is no indication that G-Sync degrades picture quality either; if that were the case we would be hearing about it, since G-Sync has been out in use for a few months now. G-Sync != ULMB; they are two different feature sets.

I guess I just find it interesting that people on this forum are so quick to crap on this display and say so many negative things about it, when they have never seen it themselves. Especially in the face of people who have seen the display and have reported that it looks nothing like your average TN panel.

Which makes complete sense, since this is a new TN panel using new production methods and better materials. It is a purpose-built 8-bit panel, compared to the average 6-bit panel that every other TN on the market uses.

As far as I am aware this is the first TRULY 8-bit TN panel seen in the consumer market space, as opposed to the standard 6-bit TN with dithering. To give an idea of what that means: 6-bit without dithering produces just over 262,000 colors, while a true 8-bit panel produces over 16.7 million colors!

The standard TN panel is 6-bit with FRC (frame rate control), a form of dithering, which allows it to artificially approximate just over 16 million colors. The problem with dithering is that it does impact image quality, especially color reproduction.
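To put rough numbers behind the bit-depth claim, here is a quick illustrative sketch (my own arithmetic, not taken from any review) of how per-channel bit depth maps to total displayable colors:

```python
# Rough arithmetic behind per-channel bit depth (illustrative only).
def color_count(bits_per_channel: int) -> int:
    """Total colors an RGB panel can natively display at the given bit depth."""
    return (2 ** bits_per_channel) ** 3

print(f"6-bit native: {color_count(6):,} colors")  # 262,144
print(f"8-bit native: {color_count(8):,} colors")  # 16,777,216
# A 6-bit + FRC panel rapidly alternates each subpixel between adjacent levels
# across frames to approximate in-between shades, so its ~16.7M figure is
# simulated rather than native.
```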
Quote:
Originally Posted by littledonny View Post

Except that the existing DIY kits are all defective, according to Nvidia's latest action in delaying all monitors with integrated G-Sync.

Really? So the kits that all those people are running right now suddenly don't work? They broke?

The last comment I heard from Nvidia is that they are adjusting the timing on each module for its specific panel, beyond the Asus panels that have already been done. So unless you have a legitimate source that states all kits are defective, keep it to yourself.

EDIT:

Yup, just looked; I can't find a single source stating the blatant lie you just put up here.
Edited by PostalTwinkie - 4/15/14 at 12:03pm
    
post #2545 of 8206
Quote:
Originally Posted by NoobasaurusWrex View Post

Thanks for the extra info.
I am using the 337.50 Beta drivers. Maybe that is why?
Could be the reason.
Quote:
Originally Posted by mikeaj View Post

Overall no, but there could be regression in image quality during G-Sync or something like that; haven't they been working on the G-Sync implementation to get it stable and so on? Maybe there needs to be an addition in display latency to make it work here, for example. Of course the underlying factors are way different so this example isn't so good, but recall how LightBoost screwed with colors, brightness, contrast, and so on, especially in earlier models.
Not to be rude, but since you seem pretty interested in G-Sync, why not take the time to inform yourself about it before making guesses?
I say that because the G-Sync kits have been out for a while now, and we already know that none of what you wrote above is happening.

- no image quality loss in G-sync mode
- No image quality loss in ULMB mode (Except of course the inevitable loss in average brightness output)
- No added input lag in G-sync mode

You can find all that info on the net, mainly on the BlurBusters site.
post #2546 of 8206
To get back to my earlier post about testing a single 780 Ti vs. SLI with G-Sync: I tested both Metro: Last Light and Battlefield 4, and I'm happy to report there was no difference in stutter or anything else performance related (other than framerate, of course) when using one card or two with G-Sync.

Maybe it's the specific cards I'm using, or maybe the beta driver fixes the issues, but I don't have a single problem with G-Sync in my setup. I would advise anyone who has issues to grab the beta and see if your problems go away.
post #2547 of 8206
I have no interest in G-Sync monitors for myself (to me it's more a passing curiosity), but you're right in that I probably ought to see and read more of it before speculating on it. If you can point to the testing and data that I don't know about with respect to G-Sync, this monitor, or anything else, go ahead, please.

But I was referring to this monitor's implementation, which may not necessarily be the same as what's currently available. For one, the higher resolution here means much more data needs to be sent and processed than on existing G-Sync monitors. Existing G-Sync test results are good, but some details may differ from model to model. I mean, the technology is defined and operates a certain way, but still.

Early on (before anybody had products available), my impression was that there were significant challenges in keeping the image consistent across unequal frame durations, and that this was one of the factors behind the minimum refresh rate allowed. For example, the R, G, and B subpixels do not transition and behave identically. If one frame's duration differs from another's, a different percentage of that frame is spent in the pixel transition rather than in the steady state, if you will. This should have some minor impact on color consistency unless it is compensated for or the transition times are negligible, and it's not necessarily something that would show up in general bench testing of colors and so on. Also, isn't G-Sync only activated in games? How do you even test for slight color deviations like this?
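To make that concrete, here's a rough sketch of the idea (my own illustration, assuming a hypothetical fixed 4 ms grey-to-grey transition time, which is not a measured figure for this panel):

```python
# Fraction of each frame spent in pixel transition, for a hypothetical fixed
# 4 ms grey-to-grey response time. Purely illustrative numbers.
TRANSITION_MS = 4.0  # assumed value, not a spec of the PG278Q

for refresh_hz in (144, 85, 40):
    frame_ms = 1000.0 / refresh_hz
    fraction = TRANSITION_MS / frame_ms
    print(f"{refresh_hz:>3} Hz frame ({frame_ms:5.1f} ms): "
          f"{fraction:.0%} of the frame is spent transitioning")
# With G-Sync, consecutive frames can have very different durations, so the
# time-averaged light output of a subpixel (and hence the perceived color)
# could differ slightly between a short frame and a long one.
```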

By the way, I thought I remembered that ULMB reduces contrast (which I would interpret as lower image quality in general) in addition to brightness, and Blur Busters says as much, for example. You only listed brightness above. I didn't see anything saying by how much, or whether it would be different on this monitor model.

Anyhow, I probably jumped in too late in a very long thread, apologies.
post #2548 of 8206
Quote:
Originally Posted by benlavigne11 View Post

I don't think DP 1.3 would really help for this monitor. 4K is where the bandwidth becomes a concern.

I wasn't talking about a DP 1.3 version of this monitor; sorry, I was unclear. I was talking about monitors in general that support DP 1.3 (like a 4K 120 Hz display).
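For a rough sense of why 4K is where the bandwidth gets tight, here's a back-of-the-envelope sketch (my own numbers; it ignores blanking intervals and uses approximate usable link rates, so the real requirements are somewhat higher):

```python
# Back-of-the-envelope uncompressed video bandwidth, ignoring blanking intervals.
def gbit_per_s(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP12_USABLE = 17.28  # Gbit/s, HBR2 x4 lanes after 8b/10b encoding (approximate)
DP13_USABLE = 25.92  # Gbit/s, HBR3 x4 lanes (approximate)

print(f"2560x1440 @ 144 Hz: ~{gbit_per_s(2560, 1440, 144):.1f} Gbit/s "
      f"(fits within DP 1.2's ~{DP12_USABLE} Gbit/s)")
print(f"3840x2160 @ 120 Hz: ~{gbit_per_s(3840, 2160, 120):.1f} Gbit/s "
      f"(needs roughly DP 1.3's ~{DP13_USABLE} Gbit/s)")
```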
post #2549 of 8206
Quote:
Originally Posted by littledonny View Post

Except that the existing DIY kits are all defective, according to Nvidia's latest action in delaying all monitors with integrated G-Sync.

Source, please?
post #2550 of 8206
Quote:
Originally Posted by littledonny View Post

Except that the existing DIY kits are all defective, according to Nvidia's latest action in delaying all monitors with integrated G-Sync.

That was a rumor that was debunked a few weeks ago...

Also, we have members here with the kit installed who have had stellar results.