Overclock.net › Forums › Components › Monitors and Displays › [ASUS] RoG Swift PG278Q Discussion Thread

[ASUS] RoG Swift PG278Q Discussion Thread - Page 233  

post #2321 of 8206
Quote:
Originally Posted by Hasty View Post

Better one GPU than SLI (frame pacing, scaling, compatibility issues, ...). But then again, for 1440p you need at least 780s in SLI. Yes, even if you fiddle for hours with graphical settings, including advanced tweaking.
I'm assuming here you intend to play modern AAA games. (If you plan on playing games like Counter-Strike or Quake 3, then even a single 770 will do the trick.)

I hope I'm not repeating myself too much. I'm just worried you wouldn't get a satisfactory result if you don't pair such a demanding display with high-end hardware.
And since the display will cost 800 dollars, that would be hard to swallow.

You mention future-proofing, and I think that's where your decision can be made. It all depends on your future plans for hardware upgrades.

I'd like to mention that other brands have announced G-Sync monitors, so keep an eye on them. They will likely be 1080p and cheaper, which may be more reasonable.
It's an exciting time for gaming displays, with things moving faster and new technologies being implemented. Next year there will be other monitors announced with further refinements.

I love people like you, always exaggerating the truth. Fact is that my 670 plays games like BF4 and Metro just fine. SLI 780s... turn down the 8x MSAA a bit and enjoy.
post #2322 of 8206
Quote:
Originally Posted by Amperial View Post

Never implied, I know.
I need to type a bit more clearly. I just think most people have a fear of going below 60 FPS. Anyone with a decent card wants to push games at least over 60, and G-Sync is just right for that range.
Of course going higher than 60 is nice, though many people treat it as some barrier to break through without knowing it's more or less tied to the monitor's Hz.

...and no, G-Sync is capped at 177 afaik.

The biggest benefit of G-Sync is eliminating the fear of going below 60 FPS. In normal monitors, there's actually legitimate reason to fear it - significant tearing and stuttering issues, particularly if you're using vsync. The difference between 60.1 FPS and 59.9 FPS with vsync on is more than you would think based on such a small framerate change. But it's become this magic number solely because nearly all displays are 60 Hz now. I remember when they used to not be - my last CRT could go to 85, I think.
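The 60.1 vs 59.9 FPS point comes from vsync snapping each frame's display time up to a whole refresh interval. A rough Python sketch of that effect (my own illustration using an idealized double-buffered model, not from any measurement):

```python
import math

# Sketch: with double-buffered vsync, a frame is held until the next
# refresh tick, so its displayed interval snaps up to a multiple of
# the refresh period. Missing 60 FPS by a hair drops you to 30.
REFRESH_HZ = 60.0
TICK = 1.0 / REFRESH_HZ  # ~16.67 ms per refresh

def displayed_interval(render_time_s: float) -> float:
    """Frame-to-frame interval the viewer actually sees with vsync on."""
    ticks = math.ceil(render_time_s / TICK)  # wait for the next refresh
    return max(ticks, 1) * TICK

for fps in (60.1, 59.9):
    shown = 1.0 / displayed_interval(1.0 / fps)
    print(f"render {fps:5.1f} FPS -> displayed {shown:4.1f} FPS")
```

Rendering at 60.1 FPS still makes every refresh, but at 59.9 FPS each frame just misses its tick and waits for the next one, so the displayed cadence collapses to half rate.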

And while G-Sync is capped now, that's not an inherent limit. It's the limit that Nvidia can get the module to run at, but there's nothing fundamental to G-Sync that would prevent it from working at even higher max refresh rates. It'd just take a better, faster, stronger G-Sync module to do it, one that isn't worth developing at this point due to the lack of displays that would benefit from it.
post #2323 of 8206
Quote:
Originally Posted by Hasty View Post

You preach G-sync as if it was the best experience one can get on a gaming monitor. And that is wrong.

- First off there is the incompatibility between G-sync mode and the low persistence ULMB mode.

- Secondly, G-sync at max refresh rate and above has virtually no benefit over V-sync at max refresh rate.


Don't get me wrong. G-sync is awesome for what it does. And I'm already sold on it. I bought my GPU from Nvidia for G-sync and I'll probably jump on the Rog Swift as soon as it's available to me.

But let's be clear about G-sync mode:

Using G-sync mode isn't using the monitor to its full potential and getting the best experience it can provide.
G-sync is there to remove the tearing and the micro stuttering from games whose frame rates aren't stable above the monitor's max refresh rate.

That's what G-sync does. It makes low fluctuating frame rates significantly more "playable". Nothing more.

You forgot to mention that while GSync is enabled, there is almost 0 input lag. Something that keeps many people from using Vsync.
post #2324 of 8206
Quote:
Originally Posted by kingduqc View Post

I love people like you, always exaggerating the truth. Fact is that my 670 plays games like BF4 and Metro just fine. SLI 780s... turn down the 8x MSAA a bit and enjoy.

You are running your 670 at 1440p and 120Hz or higher?

What is the point of getting an $800 monitor unless you can actually push it to at least a reasonable level?


In fact why get a 120hz 1440P monitor and then hobble it with a $200 video card?

Just fine has very different connotations for different players.

If you think MSAA is going to make that much of a difference on a 1440p monitor, especially at the higher frame rates this monitor is designed for, then you might be a wee bit confused.
Edited by Robilar - 3/27/14 at 2:03pm
post #2325 of 8206
Quote:
Originally Posted by Robilar View Post

You are running your 670 at 1440P and 120hz or higher?

What is the point of getting an $800 monitor unless you can actually push it at least a reasonable level?

Because it's the only 1440p G-Sync monitor available, mostly. At 40 FPS, this will still be a way, way better experience than just about any other monitor out there. The only other one even close will be the other ASUS that was able to take the G-Sync mod kit.

Now, could you take your current rig running at 40 FPS and get better performance by dumping that 800 bucks into a new GPU or two? Sure. But it's at least reasonable to take advantage of the rather significant upgrade this monitor provides even to weaker systems, knowing that you will have a LOT of headroom to grow up into as you improve your GPU over time.
post #2326 of 8206
Quote:
Originally Posted by LaBestiaHumana View Post

You forgot to mention that while GSync is enabled, there is almost 0 input lag. Something that keeps many people from using Vsync.
V-sync and G-sync have the same input lag at 144Hz.

Also, nothing has "almost 0 input lag" with current monitor tech.
post #2327 of 8206
Quote:
Originally Posted by Hasty View Post

V-sync and G-sync have the same input lag at 144Hz.

Also, nothing has "almost 0 input lag" with current monitor tech.

This isn't true. Tests have revealed that G-Sync has roughly the same input lag as having vsync off. Vsync by its very nature causes input lag, especially if you triple-buffer.
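The triple-buffering point can be put in rough numbers. A back-of-envelope Python sketch (my own arithmetic with assumed buffer counts, not measured data): each completed frame waiting in the swap chain adds roughly one refresh period of input-to-photon delay.

```python
# Sketch: with vsync on, completed frames queue up in the swap chain,
# and each queued buffer delays the newest input by about one refresh
# period before it reaches the screen.
REFRESH_PERIOD_MS = 1000.0 / 144.0  # ~6.94 ms at 144 Hz

def added_lag_ms(queued_buffers: int) -> float:
    """Extra display latency from frames waiting in the swap chain."""
    return queued_buffers * REFRESH_PERIOD_MS

print(f"double buffer: +{added_lag_ms(1):.1f} ms")
print(f"triple buffer: +{added_lag_ms(2):.1f} ms")
```

At 60 Hz the same queue depths cost 16.7 and 33.3 ms, which is why the vsync lag penalty feels much worse on a 60 Hz panel than at 144 Hz.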
post #2328 of 8206
Quote:
Originally Posted by Mand12 View Post

Because it's the only 1440p G-Sync monitor available, mostly. At 40 FPS, this will still be a way, way better experience than just about any other monitor out there. The only other one even close will be the other ASUS that was able to take the G-Sync mod kit.

Now, could you take your current rig running at 40 FPS and get better performance by dumping that 800 bucks into a new GPU or two? Sure. But it's at least reasonable to take advantage of the rather significant upgrade this monitor provides even to weaker systems, knowing that you will have a LOT of headroom to grow up into as you improve your GPU over time.

That's the beauty of this screen: it can be used with a 660 Ti just fine while you wait for next-gen Maxwell.
post #2329 of 8206
Quote:
Originally Posted by Mand12 View Post

Because it's the only 1440p G-Sync monitor available, mostly. At 40 FPS, this will still be a way, way better experience than just about any other monitor out there. The only other one even close will be the other ASUS that was able to take the G-Sync mod kit.

Now, could you take your current rig running at 40 FPS and get better performance by dumping that 800 bucks into a new GPU or two? Sure. But it's at least reasonable to take advantage of the rather significant upgrade this monitor provides even to weaker systems, knowing that you will have a LOT of headroom to grow up into as you improve your GPU over time.

Honestly, after checking recent stuff, who knows what's future-proof.
Just look at Nvidia, or smartphones (for example, Sony plans to release a new flagship phone every 6 months, lol).
post #2330 of 8206
Quote:
Originally Posted by Mand12 View Post

This isn't true. Tests have revealed that G-Sync has roughly the same input lag as having vsync off. Vsync by its very nature causes input lag, especially if you triple-buffer.
Actually it is true. And it has been tested.

http://www.infinite.cz/blog/herni-monitory-input-lag-gsync

From his tests:

- V-sync 144Hz => 39ms

- G-sync 144Hz => 39ms

Confirmed by Blur Busters:
Quote:
In terms of input lag GSYNC will begin to behave as VSYNC ON when frame rates caps out to maximum.
http://forums.blurbusters.com/viewtopic.php?f=5&t=389#p3611

Note: there is a workaround by using an in-game frame cap. (See my post about Counter Strike.)
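The frame-cap workaround makes sense because G-Sync only reverts to vsync-like behavior once frames arrive faster than the panel's max refresh. A small Python sketch of the two regimes (the 160 FPS and 138 FPS figures are my own assumed numbers, not from the linked tests):

```python
# Sketch: capping the frame rate just under the panel's max refresh
# keeps every frame interval longer than the refresh period, so G-Sync
# stays in variable-refresh mode instead of queueing frames like vsync.
MAX_REFRESH_HZ = 144.0
REFRESH_PERIOD_MS = 1000.0 / MAX_REFRESH_HZ  # ~6.94 ms

def gsync_mode(frame_time_ms: float) -> str:
    """Which regime a given frame time lands in on a 144 Hz G-Sync panel."""
    if frame_time_ms >= REFRESH_PERIOD_MS:
        return "variable refresh (low lag)"
    return "vsync-like (frames queue at max refresh)"

uncapped = 1000.0 / 160.0  # GPU could render 160 FPS -> hits the cap
capped = 1000.0 / 138.0    # in-game cap at 138 FPS stays under 144 Hz
print("uncapped:", gsync_mode(uncapped))
print("capped:  ", gsync_mode(capped))
```

With the cap, the game never outruns the monitor, so the vsync-on lag behavior Blur Busters describes never kicks in.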
Edited by Hasty - 3/27/14 at 2:34pm