
[Legit] NVIDIA Shows-Off G-SYNC @ BlizzCon 2013

19K views, 246 replies, 80 participants; last post by JackNaylorPE
#1 ·
The new thing that comes out of the video is that DisplayPort won't be the only interface used in the future. That makes sense, since it isn't the only packet/data-based one; HDMI probably has the low latency and granularity to support the micromanagement the technology needs.

http://www.legitreviews.com/nvidia-shows-g-sync-blizzcon-2013_128230

(PS: Hardware-unrelated, but Blizzard also announced other things at the con, mainly mediocre, washed-out games that milk the old player base for cash.)
 
#3 ·
Quote:
Originally Posted by fateswarm View Post

(PS: Hardware-unrelated, but Blizzard also announced other things at the con, mainly mediocre, washed-out games that milk the old player base for cash.)
Coming up next on "comments that are completely unnecessary"

Of course they showed off their games at BlizzCon. It's their convention, and they're a gaming company.

Do most people see screen tearing as an issue severe enough to warrant a solution like this, though?
From my perspective, it seems like nVidia is trying to pull a Listerine maneuver by selling a solution to a problem which is petty at best.

But then again, I might just be accustomed to my outdated display.
 
#4 ·
Quote:
Originally Posted by Escatore View Post

Do most people see screen tearing as an issue severe enough to warrant a solution like this, though?
Yes, it's very annoying.

After low FPS, it's the worst thing; it's worse than graphical flaws like aliasing.

When I first played games I didn't notice it, but once I noticed it, it bothered me; once seen, it cannot be unseen. And I don't think it's psychological: it's a genuine irritation even if you don't consciously register it.

edit: It's not just tearing; it also fixes the pausing and input latency you get with vsync.
 
#5 ·
G-Sync is actually a worthwhile feature if it works as advertised. Is it a big deal? Not particularly, but it does add artificial value to Nvidia's cards.

Off-Topic:
Blizzard is like an old farmer relying on old, outdated tools of the trade. They simply can't compete with new, modern start-ups. I'm probably not the first one to say this... but LoL redefined the MOBA scene, and DotA 2 is another serious contender in the MOBA space.

What does Blizzard have? SC? A dying game in a dying genre?

Blizzard has gone wrong with StarCraft in many ways: not listening to the community, not taking risks, not consistently updating the game to get the formula right. Furthermore, Blizzard has refused to allow LAN play, a slap in the face to the competitive scene and a HUGE slap in the face of the community that supported the series for over a decade! It's ridiculous.

They need to realize that simply bringing a polished game to an established genre is not enough anymore. This isn't 1999-2005, when many genres had very few polished games. StarCraft was a breath of fresh air for the RTS genre. It hit the trifecta: an amazing story, a robust multiplayer experience, and never-before-seen features (an unparalleled campaign editor, to mention just one).

Diablo was also an extremely good game in a relatively unexplored genre. Honestly, I only played Diablo II; I played the first Diablo after the second because it was before my time.

Blizzard has to realize that it cannot rely on the strength of its franchises (which it completely ruined within the last two years, by the way) and that it needs something new, something modern, and something excellent to survive in this new world of triple-A publishers and outstanding indie developers.
 
#6 ·
Quote:
Originally Posted by HanSomPa View Post

G-Sync is actually a worthwhile feature if it works as advertised. Is it a big deal? Not particularly, but it does add artificial value to Nvidia's cards.
Artificial? Avoiding tearing and the pauses of vsync (especially when you can't hold a high FPS) is a tremendous feature, and on top of that it removes input latency.

Though I guess you just misused the word 'artificial', since you say it's worthwhile.
 
#8 ·
Two things:

1) The difference between the two screens is minimal at best. At best. The right one might be slightly smoother. Slightly. As in the slightest ever possible.

2) Who notices screen tearing? I've been playing with VSYNC off in every single game for the longest time. Can my eyes see screen tearing happen? Yes, but do I notice it anymore (as in, does it take me away from the gaming experience)? No. It's an adaptation to playing PC games and wanting higher framerates. You turn off VSYNC and as a result you suffer from screen tearing, but the upside is that in some cases you get twice the framerate you would otherwise (if not more).

Does the technology sound attractive in what it's advertised to do? Of course; on paper it seems flawless. Except, if they're using that video as an example to promote their product, they're going to have a hard time selling it. The two sides look nearly identical.

Just looks like nVidia trying to further abuse the "Sacred G."



Next is going to be GXAA (some magical newly improved anti-aliasing feature), then G-Clock or G-Boost (an auto-overclocking feature that changes clocks to keep framerates from dipping in graphically demanding areas), then G-Slap (where nVidia jumps through your monitor and slaps you in the face with a solid gold bullion G).

 
#12 ·
Quote:
Originally Posted by BiG StroOnZ View Post

Two things:

1) The difference between the two screens is minimal at best. At best. The right one might be slightly smoother. Slightly. As in the slightest ever possible.

2) Who notices screen tearing? I've been playing with VSYNC off in every single game for the longest time. Can my eyes see screen tearing happen? Yes, but do I notice it anymore (as in, does it take me away from the gaming experience)? No. It's an adaptation to playing PC games and wanting higher framerates. You turn off VSYNC and as a result you suffer from screen tearing, but the upside is that in some cases you get twice the framerate you would otherwise (if not more).

Does the technology sound attractive in what it's advertised to do? Of course; on paper it seems flawless. Except, if they're using that video as an example to promote their product, they're going to have a hard time selling it. The two sides look nearly identical.
You do realize YouTube plays back at 30fps, right? Even if YouTube could deliver a variable framerate, the display you're watching on can't show it. Meaning you can only see G-Sync in person, not on a non-G-Sync display.

The closest you can get to the G-Sync look right now is a game at a locked 60fps with V-Sync enabled. But imagine that kind of smoothness with the responsiveness of V-Sync OFF.
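Just to put a rough number on "the responsiveness of V-Sync OFF", here's a little back-of-the-envelope Python sketch (my own toy numbers, nothing official from NVIDIA): with V-Sync a finished frame has to sit and wait for the next refresh boundary, while G-Sync refreshes the panel the moment the frame is done.

Code:
# Toy sketch: extra wait a finished frame picks up under 60Hz V-Sync.
REFRESH = 1 / 60  # ~16.7ms per refresh on a 60Hz panel

def vsync_wait(render_done):
    """Seconds a finished frame sits in the queue before the next refresh."""
    next_refresh = (render_done // REFRESH + 1) * REFRESH
    return next_refresh - render_done

# Frames finishing at different points inside a refresh interval
for done in (0.001, 0.005, 0.010, 0.015):
    print(f"done {done * 1000:4.1f}ms into the refresh -> "
          f"V-Sync adds {vsync_wait(done) * 1000:4.1f}ms, G-Sync adds ~0ms")

Averaged out that's roughly half a refresh (about 8ms at 60Hz) of extra delay, on top of whatever the render queue already adds, which is the latency people complain about.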

Quote:
Originally Posted by shadman View Post

As a side note, I'm at PDXLan right now and they are showing it off. It's beautiful, even on a 60Hz monitor. Can't wait till they use it on a 120Hz.
What up Shad! You're a lucky man to be there.
 
#13 ·
Quote:
Originally Posted by Swolern View Post

You do realize YouTube plays back at 30fps, right? Even if YouTube could deliver a variable framerate, the display you're watching on can't show it. Meaning you can only see G-Sync in person, not on a non-G-Sync display.

The closest you can get to the G-Sync look right now is a game at a locked 60fps with V-Sync enabled. But imagine that kind of smoothness with the responsiveness of V-Sync OFF.
What does YouTube being locked at 30fps have to do with picking up evidence of screen tearing? Your examples sound pretty hard-pressed to convince someone to purchase a G-Sync monitor. How does one go about seeing this supposed advantage if you can't see the advantage at all unless you see it in person? You see what I'm saying? How do you get buyers to buy something they cannot see working as advertised unless they see it in person? Take the company's word for it...


I understand what it does; I made that quite clear with the "on paper" remark. What you're missing is how you convince people that G-Sync and G-Sync-enabled monitors are worth it, better yet worth the extra $100 over a standard monitor, and worth more than just using V-Sync. Also, there are no real specifics on whether you take an FPS hit from using it: can you get just as many frames as someone with both V-Sync and G-Sync disabled, or is there some FPS loss? All it does is synchronize the monitor's refresh to the GPU's render rate, so aren't you still limited by your monitor's refresh rate? Meaning, unless you get a G-Sync-enabled monitor that runs above 60Hz; that's the only way G-Sync will be better than traditional V-Sync.
 
#14 ·
Quote:
Originally Posted by Swolern View Post

Quote:
Originally Posted by BiG StroOnZ View Post

Two things:

1) The difference between the two screens is minimal at best. At best. The right one might be slightly smoother. Slightly. As in the slightest ever possible.

2) Who notices screen tearing? I've been playing with VSYNC off in every single game for the longest time. Can my eyes see screen tearing happen? Yes, but do I notice it anymore (as in, does it take me away from the gaming experience)? No. It's an adaptation to playing PC games and wanting higher framerates. You turn off VSYNC and as a result you suffer from screen tearing, but the upside is that in some cases you get twice the framerate you would otherwise (if not more).

Does the technology sound attractive in what it's advertised to do? Of course; on paper it seems flawless. Except, if they're using that video as an example to promote their product, they're going to have a hard time selling it. The two sides look nearly identical.
You do realize YouTube plays back at 30fps, right? Even if YouTube could deliver a variable framerate, the display you're watching on can't show it. Meaning you can only see G-Sync in person, not on a non-G-Sync display.

The closest you can get to the G-Sync look right now is a game at a locked 60fps with V-Sync enabled. But imagine that kind of smoothness with the responsiveness of V-Sync OFF.
@BiG StroOnZ, it is fixing the problem without using the brute force of more power. A GTX 760 with G-Sync looked as smooth at ~40fps as my GTX 680 does at ~70fps at the same high settings. But it isn't apples to apples: tearing is gone, but the refreshing is just a little slower, so that part is noticeable, though not very. That's why I said I'm excited for higher fps, because then that problem is solved too. EDIT: I was mistaken earlier. The demo monitors ARE 120Hz capable; the game was just running low. A demo of a game pushing higher fps was phenomenal. Also, it's great that you're used to your stutter, but this is something else. Check it out before you knock it. I was a skeptic at first too, but I changed my mind.

Also, yes, YouTube videos do not do it justice. It's like trying to show how great the colors are on an IPS panel in a YouTube video you watch on your TN monitor. You have to see it in person.
Quote:
Originally Posted by Swolern View Post

Quote:
Originally Posted by shadman View Post

As a side note, I'm at PDXLan right now and they are showing it off. It's beautiful, even on a 60Hz monitor. Can't wait till they use it on a 120Hz.
What up Shad! You're a lucky man to be there.
Hey, what's up! The GTX 680 is still working well, my friend! It's great being here, and as a side note, this LAN is a charity event and raised over 36,000 lbs of food from the ~500 attendees. Great stuff!
 
#15 ·
Quote:
Originally Posted by shadman View Post

@BiG StroOnZ, it is fixing the problem without using the brute force of more power. A GTX 760 with G-Sync looked as smooth at ~40fps as my GTX 680 does at ~70fps at the same high settings. But it isn't apples to apples: tearing is gone, but the refreshing is just a little slower, so that part is noticeable, though not very. That's why I said I'm excited for higher fps, because then that problem is solved too. EDIT: I was mistaken earlier. The demo monitors ARE 120Hz capable; the game was just running low. A demo of a game pushing higher fps was phenomenal. Also, it's great that you're used to your stutter, but this is something else. Check it out before you knock it. I was a skeptic at first too, but I changed my mind.
My 670 at 40-50 fps has looked as smooth as 60-70 fps in many games for the past two years with V-Sync disabled. Point being, this is something I've claimed about nVidia cards since the day I picked up my 670, regardless of what game I play, even right now with BF4: I'm playing very smoothly. When I break out FRAPS, in most situations I'm nowhere near 60-70 fps, but the point is it still feels like it.

That's without this G-Sync "feature", if you want to call it that.

I'm knocking it not because I don't think it works (I'm sure it does). I'm saying that from that video you can't tell a thing. How do you sell something like that without actually being able to prove it works, unless you demonstrate it in person? Also, the actual implementation of G-Sync is only useful on monitors with a refresh rate over 60Hz, and on top of that you need the GPU horsepower to push framerates over 60fps. That = more $$$, if you catch my drift.
 
#16 ·
Meh, tearing just isn't bad enough to warrant the cost of this, plus 2D LightBoost won't work with G-Sync. Keep it, nV; I'm not interested.
 
#17 ·
Quote:
Originally Posted by BiG StroOnZ View Post

My 670 at 40-50 fps has looked as smooth as 60-70 fps in many games for the past two years with V-Sync disabled. Point being, this is something I've claimed about nVidia cards since the day I picked up my 670, regardless of what game I play, even right now with BF4: I'm playing very smoothly. When I break out FRAPS, in most situations I'm nowhere near 60-70 fps, but the point is it still feels like it.

That's without this G-Sync "feature", if you want to call it that.

I'm knocking it not because I don't think it works (I'm sure it does). I'm saying that from that video you can't tell a thing. How do you sell something like that without actually being able to prove it works, unless you demonstrate it in person? Also, the actual implementation of G-Sync is only useful on monitors with a refresh rate over 60Hz, and on top of that you need the GPU horsepower to push framerates over 60fps. That = more $$$, if you catch my drift.
G-Sync works on monitors with any refresh rate; on a high-refresh-rate monitor it also gives you an actual reason to push past 60fps.
 
#18 ·
Quote:
Originally Posted by BiG StroOnZ View Post

My 670 at 40-50 fps has looked as smooth as 60-70 fps in many games for the past two years with V-Sync disabled. Point being, this is something I've claimed about nVidia cards since the day I picked up my 670, regardless of what game I play, even right now with BF4: I'm playing very smoothly. When I break out FRAPS, in most situations I'm nowhere near 60-70 fps, but the point is it still feels like it.

That's without this G-Sync "feature", if you want to call it that.

I'm knocking it not because I don't think it works (I'm sure it does). I'm saying that from that video you can't tell a thing. How do you sell something like that without actually being able to prove it works, unless you demonstrate it in person? Also, the actual implementation of G-Sync is only useful on monitors with a refresh rate over 60Hz, and on top of that you need the GPU horsepower to push framerates over 60fps. That = more $$$, if you catch my drift.
Well, in order to sell it, like everything else, it's best to let people review it beforehand so that customers will know that what they're buying isn't a load of phooey. Also, when it comes to GPU power... this is OCN. OVERCLOCK THAT MUTHA SUCKA. Problem solved.


Seriously though, what's the point of overclocking your GPU if your monitor is stuck at 60Hz and you can already hit 60+ fps anyway? This is for those who want more fps for smoother gameplay without being confined to vsync.
 
#19 ·
G-Sync isn't just about eliminating tearing and input latency, it's also about reducing perceived stutter due to variable framerate. Synchronizing refresh rate with framerate solves all of these issues even down to 30 fps.

Educate yourself! http://www.blurbusters.com/gsync/how-does-gsync-fix-stutters/
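To put some toy numbers on it (my own illustration, not taken from that article): on a fixed 60Hz panel a steady ~45fps render rate gets quantized to a mix of ~16.7ms and ~33.3ms on-screen frame durations, which is exactly what reads as stutter, while a variable-refresh panel just shows every frame for ~22ms.

Code:
# Toy illustration (assumed numbers): frame display times on a fixed 60Hz
# panel with V-Sync, fed by a GPU rendering a perfectly steady 45fps.
REFRESH = 1 / 60        # fixed 60Hz refresh interval
FRAME_TIME = 1 / 45     # steady 45fps render rate (~22.2ms per frame)
OFFSET = 0.004          # small offset so frames don't finish exactly on a boundary

def displayed_at(i):
    """Refresh boundary at which rendered frame i actually appears on screen."""
    done = OFFSET + i * FRAME_TIME
    return (done // REFRESH + 1) * REFRESH

shown = [displayed_at(i) for i in range(10)]
intervals = [round((b - a) * 1000, 1) for a, b in zip(shown, shown[1:])]
print(intervals)
# -> a mix of ~16.7 and ~33.3 ms even though the GPU delivers a rock-steady
#    22.2 ms per frame; with G-Sync every on-screen interval would be ~22.2 ms.

This ignores back-pressure from the swap chain, but it's the basic quantization effect the article is talking about.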

As for selling displays on Youtube, it's the same for just about any aspect of displays. You aren't going to sell an IPS or AMOLED panel on Youtube, you aren't going to sell a 120-144Hz monitor on Youtube, nor a 240-480Hz interpolated or strobing TV, you definitely aren't going to sell a 4k display on Youtube. This is because you just can't see the difference on your 60Hz TN from 24/30fps highly compressed videos streamed and played through Flash.

Edit: Lightboost DOES work with G-Sync!!! http://www.blurbusters.com/confirmed-nvidia-g-sync-includes-a-strobe-backlight-upgrade/
Edit 2: It doesn't work simultaneously yet, but there are methods by which it can and no doubt will in the future.
 
#20 ·
So here is what I found on Nvidia's website, as a source for answers to some questions you guys haven't asked.

Eventually coming to "select" monitors at various resolutions.
Quote:
Q: What are the resolutions of G-SYNC monitors?
A: NVIDIA G-SYNC enabled monitors will be available in a variety of resolutions from 1920x1080, to 2560x1440 to 4Kx2K. The ASUS VG248QE NVIDIA G-SYNC enabled monitor has a max resolution of 1920x1080.
Quote:
Q: What display companies are planning on introducing G-SYNC monitors?
A: Many of the industry's leading monitor manufacturers have already included G-SYNC in their product roadmaps for 2014. Among the first planning to roll out the technology are ASUS, BenQ, Phillips and ViewSonic.
It's safe to say you can tack $100 - $175 onto current monitor prices when G-Sync-enabled versions arrive. Example: the ASUS PB278Q costs about $600 on sale right now; when they come out with a G-Sync version, expect to pay $700 - $775 USD. The kit only works on the ASUS VG248QE, not your existing monitor.
Quote:
Q: How much more does G-SYNC add to the cost of a monitor?
A: The NVIDIA G-SYNC Do-it-yourself kit will cost approximately $175.
The G-Sync kit is only for one specific monitor, the ASUS VG248QE, and not other existing ones, so otherwise you have to buy a new G-Sync monitor.
Quote:
Q: When will I be able to purchase this?
A: The NVIDIA G-SYNC Do-it-yourself kits for the ASUS VG248QE monitor will be available for purchase later this year. We will have more information to come on how and when to get G-SYNC enabled monitors in the future.
Doesn't behave well with all games.
Quote:
Q: Does NVIDIA G-SYNC work for all games?
A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel's ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver.
The limitations: Not for the Korean monitors.

You cannot overclock a 60Hz monitor that is multi-input with a scaler, which is every monitor except some of the Korean monitors like the QNIX at 2560x1440. Any overclock on the 60Hz 2560x1440 monitors from the manufacturers on board WILL frame skip; there's no overclocking capability.
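If you want to check a panel for frame skipping yourself, the usual tests work roughly like this rough Python/pygame sketch (my own toy version, assumes pygame is installed): light one box per refresh, photograph the screen with an exposure of a few refreshes, and look for gaps in the run of lit boxes.

Code:
# Rough frame-skipping check: one box is lit per rendered frame.
# With vsync forced on in the driver, each flip should land on one refresh;
# a long-exposure photo showing gaps in the lit boxes means refreshes are
# being silently dropped by the overclocked monitor.
import pygame

N_BOXES = 12
pygame.init()
screen = pygame.display.set_mode((N_BOXES * 50 + 20, 80))
clock = pygame.time.Clock()

frame = 0
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    screen.fill((0, 0, 0))
    for i in range(N_BOXES):
        # Highlight the box whose index matches the current frame number
        color = (255, 255, 255) if i == frame % N_BOXES else (40, 40, 40)
        pygame.draw.rect(screen, color, (10 + i * 50, 20, 40, 40))
    pygame.display.flip()
    clock.tick(96)  # cap near the overclocked refresh rate you're testing
    frame += 1

pygame.quit()

A contiguous run of lit boxes in the photo means every frame was actually displayed; gaps mean the overclock is skipping.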

So you'll have to choose between 1080p at 120-144Hz or, whenever they finally come out, a G-Sync-enabled 1440p monitor at 60Hz, with no in-between alternatives.

I wouldn't mind buying a new 2560x1440 monitor, I guess, since I already play on one at 60Hz, but how many people will be inclined to buy a new monitor just for G-Sync?

How long will we have to wait for 2560x1440 G-Sync monitors to hit the stores?

Personally, I'm not going back to a 1080p monitor, and if you aren't either, when can we expect 1440p to be supported? What's the time frame before these become available?

Here's the SOURCE of the FAQs - http://www.geforce.com/hardware/technology/g-sync/faq
 
#21 ·
This only has a point if

  A) The price of the G-sync chip isn't prohibitive
  B) There isn't just an exclusive contract with ASUS
  C) Developers support it in upcoming releases
  D) Utilization of the chip is widespread over many makes and supported by multiple game studios
(I think that's all of them)

Historically, a proprietary extra part that adds features to a product has had a pretty high failure rate in terms of market success. Commercial history is littered with the burnt remnants of add-ins and special items that tried and failed in their niches.

Good luck though!
 
#23 ·
Quote:
Originally Posted by hatlesschimp View Post

I don't know if $175 a board for the VG248QE has been confirmed, but if it has, that's too expensive.
Yep, I've seen refurbished VG248QEs go for that much.
 
#24 ·
Quote:
Originally Posted by Arizonian View Post

So here is what I found on Nvidia's website, as a source for answers to some questions you guys haven't asked.

Eventually coming to "select" monitors at various resolutions.

It's safe to say you can tack $100 - $175 onto current monitor prices when G-Sync-enabled versions arrive. Example: the ASUS PB278Q costs about $600 on sale right now; when they come out with a G-Sync version, expect to pay $700 - $775 USD. The kit only works on the ASUS VG248QE, not your existing monitor.
The G-Sync kit is only for one specific monitor, the ASUS VG248QE, and not other existing ones, so otherwise you have to buy a new G-Sync monitor.
Doesn't behave well with all games.
The limitations: Not for the Korean monitors.

You cannot overclock a 60Hz monitor that is multi-input with a scaler, which is every monitor except some of the Korean monitors like the QNIX at 2560x1440. Any overclock on the 60Hz 2560x1440 monitors from the manufacturers on board WILL frame skip; there's no overclocking capability.

So you'll have to choose between 1080p at 120-144Hz or, whenever they finally come out, a G-Sync-enabled 1440p monitor at 60Hz, with no in-between alternatives.

I wouldn't mind buying a new 2560x1440 monitor, I guess, since I already play on one at 60Hz, but how many people will be inclined to buy a new monitor just for G-Sync?

How long will we have to wait for 2560x1440 G-Sync monitors to hit the stores?

Personally, I'm not going back to a 1080p monitor, and if you aren't either, when can we expect 1440p to be supported? What's the time frame before these become available?

Here's the SOURCE of the FAQs - http://www.geforce.com/hardware/technology/g-sync/faq
Great post.

Clears up a lot of misconceptions.
 
#25 ·
Quote:
Originally Posted by HanSomPa View Post

G-Sync is actually a worthwhile feature if it works as advertised. Is it a big deal? Not particularly, but it does add artificial value to Nvidia's cards.

Off-Topic:
Blizzard is like an old farmer relying on old, outdated tools of the trade. They simply can't compete with new, modern start-ups. I'm probably not the first one to say this... but LoL redefined the MOBA scene, and DotA 2 is another serious contender in the MOBA space.

What does Blizzard have? SC? A dying game in a dying genre?

Blizzard has gone wrong with StarCraft in many ways: not listening to the community, not taking risks, not consistently updating the game to get the formula right. Furthermore, Blizzard has refused to allow LAN play, a slap in the face to the competitive scene and a HUGE slap in the face of the community that supported the series for over a decade! It's ridiculous.

They need to realize that simply bringing a polished game to an established genre is not enough anymore. This isn't 1999-2005, when many genres had very few polished games. StarCraft was a breath of fresh air for the RTS genre. It hit the trifecta: an amazing story, a robust multiplayer experience, and never-before-seen features (an unparalleled campaign editor, to mention just one).

Diablo was also an extremely good game in a relatively unexplored genre. Honestly, I only played Diablo II; I played the first Diablo after the second because it was before my time.

Blizzard has to realize that it cannot rely on the strength of its franchises (which it completely ruined within the last two years, by the way) and that it needs something new, something modern, and something excellent to survive in this new world of triple-A publishers and outstanding indie developers.
I stopped reading at sc2 dying. Good one.
 
#26 ·
This is going to be just as much of an epic success as Ageia's implementation of PhysX.