Overclock.net › Forums › Components › Monitors and Displays › [ASUS] RoG Swift PG278Q Discussion Thread

[ASUS] RoG Swift PG278Q Discussion Thread - Page 234  

post #2331 of 8206
Quote:
Originally Posted by Amperial View Post

Honestly, after checking recent stuff... who knows what's future proof.
Just look at nVidia or smartphones (for example, Sony plans to release a new flagship phone every 6 months, lol).

Nothing is future proof; new stuff always gets pitched at us containing new features that last year's stuff didn't have.

So while last year's model may be fully functional, it won't have that new feature everyone is bragging about.

Sure, this monitor looks to be future proof, but that won't stop some company from releasing a 4K G-Sync 120Hz monitor alongside new GPUs that can get decent frame rates.
post #2332 of 8206
Quote:
Originally Posted by Amperial View Post

I also got a 780 with "only" 3GB.
Yet monitors like that one are more future proof, which makes them a good investment. I'm personally the guy who looks for a good price/performance ratio with longevity.

Though you guys need to actually rethink the whole gaming situation.
Do you notice how demanding things are atm? We're at the point where you have to pull out a lot of money and still can't get good framerates at high resolutions. Kinda hate how it's turning... nVidia releasing a monster 3k card doesn't make it better, after all.

Badly optimized games that take more VRAM than needed.
nVidia pushing prices higher and higher.
While you don't get the frames you expect for that kind of money.

I'll personally keep my 780 for quite some time, until we get GPUs that can push the current stuff with ease.

It's a symptom of increasing resolution in games, which increases immersion and fidelity. 1080p is a cakewalk for even mid-range GPUs. 10 years ago, 1600x1200 was high resolution gaming, with a lot of people still on resolutions like 1280x1024. 1440p is about 78% more pixels than 1080p. If you were to step back down to a 17" 1280x1024 monitor, it would be obvious why current gaming setups require more expensive hardware to run at the same frame rates.
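The pixel arithmetic behind that comparison is easy to sanity-check. Here's a quick sketch (the resolution list and the 1080p baseline are just illustrative choices, not from the post):

```python
# Compare total pixels per frame across common gaming resolutions.
RESOLUTIONS = {
    "1280x1024": (1280, 1024),
    "1600x1200": (1600, 1200),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
}

def pixel_count(width: int, height: int) -> int:
    """Total pixels the GPU has to shade each frame."""
    return width * height

baseline = pixel_count(*RESOLUTIONS["1080p"])
for name, (w, h) in RESOLUTIONS.items():
    px = pixel_count(w, h)
    print(f"{name}: {px:,} pixels ({px / baseline:.0%} of 1080p)")
# 1440p works out to 3,686,400 pixels, exactly 16/9 (~1.78x) of 1080p's
# 2,073,600 -- i.e. roughly 78% more pixels per frame.
```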
Overkill (19 items)
CPU: i7 5820k @ 4.625GHz | Motherboard: ASUS Rampage V Extreme | Graphics: GTX 1080 Ti | RAM: 32GB G.Skill RIPJAWS 3200MHz CL14
Storage: Intel 750 400GB | Samsung 850 Pro 512GB | Samsung 850 EVO 500GB | WD Black 4TB 7200RPM | Hitachi 2TB 7200RPM
Cooling: Corsair H80i GT | OS: Windows 10 Professional | Monitor: Acer Predator X34 3440x1440 100Hz IPS
Keyboard: Razer BlackWidow Chroma | PSU: Corsair RM1000i | Case: SilverStone RV02 | Mouse: Razer DeathAdder 3G
Audio: TEAC HA-501 | HRT Music Streamer II+ DAC | Sennheiser HD650
post #2333 of 8206
Quote:
Originally Posted by littledonny View Post

which increases immersion and fidelity. 1080p is a cakewalk for even mid-range GPUs
Not even. My 780 Ti can't get a perfectly stable 120fps V-synced at 1280x960 in Crysis 3. It dips below the 120fps threshold frequently.

I agree with Amperial on the current situation. Getting high frame rates on modern games requires ridiculously expensive investments in high end hardware.

And I don't see that changing when most games are developed primarily for consoles with a target of 60 or 30 fps.
On top of that, what sells is the beautiful trailers on YouTube at 30fps.
post #2334 of 8206
Quote:
You are running your 670 at 1440P and 120hz or higher?

It wasn't implied that it had to be running at 120 fps. But yes, most of the games I play sit over 200fps all the time, maxed out at 1440p.
Quote:
What is the point of getting an $800 monitor unless you can actually push it at least a reasonable level?

-No tearing
-No added input lag
-A monitor can also go through 2-4 GPU upgrade cycles before being replaced, so the GPU configuration you have now and the one you'll have in 2-3 years will be totally different. How long was 1080p the highest resolution you could get on a 120Hz monitor? A freaking long time, if you ask me. It basically keeps its value over time.
-Even AAA games are just fine and look stunning on high settings while maintaining 50-70 fps on a single 670, and with G-Sync that makes for a perfectly playable experience.

Quote:
In fact why get a 120hz 1440P monitor and then hobble it with a $200 video card?

Here are some of the most popular games out there:

Counter-Strike: GO
Dota 2
League of Legends
Team Fortress 2
Garry's Mod

All of those can be maxed out at 1440p 120hz with a potato and are widely played.

The others are single-player games, where 40-50 fps with G-Sync is a fine experience.


Quote:
If you think MSAA is going to make that much of a difference on a 1440P monitor, especially at the higher frame rates this monitor is designed for then you might be a wee bit confused.

and this is why I said "turn down msaa8x a bit and enjoy."

You don't need SLI 780s to enjoy 1440p, that is all. I know that card X won't play game X at 1440p maxed out with 4x MSAA, and that by that logic you can't possibly buy a 1440p monitor, but frankly that is just missing the point, because no one actually plays benchmarks. :rolleyes:
post #2335 of 8206
I have 670 SLI and I'm getting this monitor as soon as I can. But if I have to turn down some settings in games to get above 100 FPS at 1440p, then so be it.

I'd rather play 1080p on low than 720p on high, just like I'd rather play 1440p on low than 1080p on high.
Enthoo Luxe (17 items)
CPU: i7 5820k 4.5GHz | Motherboard: MSI X99A SLI KRAIT EDITION | Graphics: 1080 Ti Strix | RAM: G.SKILL Ripjaws 4 16GB DDR4 3000MHz
Storage: OCZ Vertex 3 120GB boot drive / (2) WD 1TB RAID 0 | Cooling: Corsair H100i GTX | OS: Windows 10 | Monitor: ASUS PG278Q Swift
Keyboard: Razer BlackWidow 2017 | PSU: Corsair HX1000i | Case: Phanteks Enthoo Luxe | Mouse: Kone Pure 2017
Audio: Sound Blaster X7 | Philips Fidelio X2
post #2336 of 8206
Quote:
Originally Posted by geggeg View Post

Hey Kenji! Good to see another POTNer here.

I'm surprised I'm still remembered. Nice to see you too.

Quote:
Originally Posted by Amperial View Post

Never implied it, I know.
I need to type a bit more clearly. I just think most people fear going below 60 FPS. Anyone with a decent card wants to push games at least over 60, and G-Sync is just right for that range.
Of course going higher than 60 is nice indeed, though many people think 60 is some barrier to break through without knowing it's more or less related to the monitor's refresh rate.

Of course we all want to push over 60; it's just really hard. There's a kind of wall in this stuff where you start paying a lot more for smaller and smaller increments of performance. E.g., you could quad-SLI Titan Zs for $6k if you wanted to; that might just let you run Crysis 3 at 1440p 120Hz.

Quote:
Originally Posted by Mand12 View Post

Because it's the only 1440p G-Sync monitor available, mostly. At 40 FPS, this will still be a way, way better experience than just about any other monitor out there. The only other one even close will be the other ASUS that was able to take the G-Sync mod kit.

Now, could you take your current rig running at 40 FPS and get better performance by dumping that 800 bucks into a new GPU or two? Sure. But it's at least reasonable to take advantage of the rather significant upgrade this monitor provides even to weaker systems, knowing that you will have a LOT of headroom to grow up into as you improve your GPU over time.

This is a fair point, and if I think about my largest annoyance when I moved to my 2410, it's definitely SCREEN TEARING. It was something I NEVER encountered until I got the 2410. No clue why, in all honesty, as I had been gaming on LCDs for years up to that point.

As for upgrades, I'm seeing my system as "mid-life" at the moment: 3 years down, and I'm hoping for another 2 out of the motherboard/CPU. Seems my 2600k has held up very well.
Quote:
Originally Posted by Amperial View Post

Honestly after checking recent stuff.. who knows whats future proof.
Just looking at nVidia or Smartphones (for example Sony plans to release a new flagship phone every 6 months, lol).

Very true. I'd say 1440p is going to get cheaper fast because its lifespan is short at this point. Samsung already has a 4K 60Hz panel at the $800 price point, and 4K for PC gaming is certainly the future. 4K also has the advantage that a 4K monitor panel can be made using the same techniques as 4K TV panels; 1440p panels don't share that luxury.

Now that's not to say you shouldn't buy 1440p. It still hits a good sweet spot in terms of hardware requirements, resolution, and size at the moment. But I would expect that in the next 2 years 1440p will get very rare as 4K gets cheaper and GPUs get to the point of pushing 4K, and when they're pushing 4K, 1440p 120Hz will be very easy to hit.
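That last claim is mostly pixel math. A quick sketch (the fps projection is a rough fill-rate approximation I'm adding for illustration, not a benchmark):

```python
# Relative GPU load implied by pixel count alone. This ignores shader
# cost, memory bandwidth, etc., so treat it as a rough approximation.
def pixels(width: int, height: int) -> int:
    return width * height

uhd = pixels(3840, 2160)   # 4K UHD
qhd = pixels(2560, 1440)   # 1440p

ratio = uhd / qhd
print(f"4K is {ratio:.2f}x the pixels of 1440p")  # 2.25x

# A card sized for 4K at 60fps has, by pixel count alone, roughly the
# throughput for 1440p at 60 * 2.25 = 135 fps -- past the 120Hz target.
print(f"Implied 1440p headroom: ~{60 * ratio:.0f} fps")
```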

Quote:
Originally Posted by littledonny View Post

It's a symptom of increasing resolution in games, which increases immersion and fidelity. 1080p is a cakewalk for even mid-range GPUs. 10 years ago, 1600x1200 was high resolution gaming, with a lot of people still on resolutions like 1280x1024. 1440p is about 78% more pixels than 1080p. If you were to step back down to a 17" 1280x1024 monitor, it would be obvious why current gaming setups require more expensive hardware to run at the same frame rates.

Keep in mind the Xbox 360 and the PS3 rarely, if ever, even hit 720p. This is the thing I always have to point out to people when they ask why gaming PCs can't "keep up" with a 7-year-old console. Lots of games were rendered at 540p and upscaled, if I remember right.
Kagutsuchi Mk 2 (20 items)
CPU: Intel Core i7-6700k @ 4.635GHz | Motherboard: Asus Maximus VIII Formula | Graphics: Asus Strix 1080 Ti 11GB, 2GHz core / 11GHz memory | RAM: G.Skill Trident Z 32GB DDR4-3200
Storage: Samsung 850 EVO 500GB | Samsung 840 EVO 500GB | 600GB Western Digital VelociRaptor | 5TB Western Digital Black | 4TB Western Digital Red | 6TB Western Digital USB 3.0 external
Cooling: Corsair H110i | OS: Windows 10 Pro | Monitor: Asus RoG PG279Q | Keyboard: Logitech G910 Orion Spark
PSU: Corsair HX1000i | Case: NZXT H440 | Mouse: Logitech G900 Chaos Spectrum
Audio: Sennheiser Game One | Phillips Fidelio X2 | Klipsch RB20 Soundbar
Laptop: Core i7-3740QM | GTX 680M 2GB | 16GB DDR3-1600 | 2x 256GB SSD | Win 7 Professional
post #2337 of 8206
Quote:
Originally Posted by Mand12 View Post

No, buy this now, enjoy the benefits of G-Sync even though you can't push 120 FPS, and then sit back knowing that your monitor won't be the bottleneck for many years.
Have to agree with this to some extent: a single card should be enough for 1440p, even if only at 60fps. The question is how long two 770s or 780s in SLI would be enough for, say, a 100fps average frame rate in games. If it means having to upgrade both cards every 1-2 years to maintain that frame rate in newer games, then it can become a bit excessive. I guess it really just depends on how good future video cards are going to be; the GTX 880 series and R9 390 are going to have to be a bigger leap forward in performance than usual.

I will see how things go. For now I will stick with my 1080p 120Hz screen and see how performance is with the next lot of video cards that come out. This is the reason I have so little interest in 4K: it just runs so badly at the moment, even with two video cards. I would find things too slow to play properly, and it is just way too expensive to get playable (imo) fps.

1440p 120Hz has potential, though, simply because even if a player can't hit 120fps, there is no penalty. Whereas if you had a higher-res screen and wanted to drop it below native res for higher fps, it would look like ass. That's always been my main complaint with LCDs; since everything moved away from CRT, the choices have become a lot harder.
post #2338 of 8206
Still no release date yet? :/
post #2339 of 8206
Quote:
Originally Posted by Perfect_Chaos View Post

Have to agree with this to some extent: a single card should be enough for 1440p, even if only at 60fps. The question is how long two 770s or 780s in SLI would be enough for, say, a 100fps average frame rate in games. If it means having to upgrade both cards every 1-2 years to maintain that frame rate in newer games, then it can become a bit excessive. I guess it really just depends on how good future video cards are going to be; the GTX 880 series and R9 390 are going to have to be a bigger leap forward in performance than usual.

But even when this happens, because it's a 120Hz G-Sync monitor, as you drop in framerate from 120 to 110 to 100 to 90 as the system ages, you're still having an amazing, smooth, crisp experience, all the way down to the 40s.

Your tolerance for framerate will determine how often you upgrade, but the point is that this isn't only a high-end monitor. It is a high-end monitor, but it also scales down to lower FPS better than anything else in the world.
post #2340 of 8206
Quote:
Originally Posted by Perfect_Chaos View Post

I guess really it just depends on how good future video cards are going to become, the 880 GTX series and R9 390 are going to have to be a bigger leap forward than usual in performance.

My feeling is that after this year, the industry is going to push towards 4K gaming. Video cards will rise to support that technology, because otherwise the monitors will not sell. A 1440p monitor would then get caught and "passed by" in the effort to deliver single graphics cards that can do 4K @ 60Hz.
This thread is locked  