
Nvidia G-Sync FREE on Mobile eDP Monitors with 980M

#1 ·
Mobile G-Sync confirmed.

Original Post
http://gamenab.net/2015/01/26/truth-about-the-g-sync-marketing-module-nvidia-using-vesa-adaptive-sync-technology-freesync/

Confirmed working with:
GTX 980M
eDP Monitor, LP173WF4-SPD1 (LGD046C)

Known working laptops:
Asus G751 series

Download link:
http://1pcent.com/?p=564

Download and install the original 346.87 driver, or copy nvlddmkm.sys from 346.87 into the extracted 347.25 \Display.Driver\ folder and then install that driver (a sketch of the file swap is below).
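
For anyone scripting the file swap rather than doing it in Explorer, here is a minimal Python sketch. The extraction paths are assumptions; adjust them to wherever you unpacked each driver package, and keep a backup of the stock file.

Code:
import shutil
from pathlib import Path

# Assumed extraction paths -- yours will differ.
src = Path(r"C:\NVIDIA\346.87\Display.Driver\nvlddmkm.sys")  # file from 346.87
dst_dir = Path(r"C:\NVIDIA\347.25\Display.Driver")           # extracted 347.25 package

target = dst_dir / "nvlddmkm.sys"
backup = dst_dir / "nvlddmkm.sys.bak"

if target.exists() and not backup.exists():
    shutil.copy2(target, backup)  # preserve the stock 347.25 file first

shutil.copy2(src, target)
print(f"Replaced {target} with {src}")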

Blog post:
http://1pcent.com/?p=564

PCPER
http://www.pcper.com/reviews/Graphics-Cards/Mobile-G-Sync-Confirmed-and-Tested-Leaked-Alpha-Driver

____________________________________
Original post:

I found an article on Gamenab.net about G-Sync: a modified Nvidia 347.25 driver that supposedly allows G-Sync on monitors with DisplayPort 1.2. I still don't quite believe it, so I wanted your opinion. I will try to find out for myself what the author has modified.

http://gamenab.net/2015/01/24/nvidia-g-sync-hack-working-on-every-monitor/

http://gamenab.net/2015/01/26/truth-about-the-g-sync-marketing-module-nvidia-using-vesa-adaptive-sync-technology-freesync/
 
#6 ·
G-Sync requires a scaler in the monitor with Nvidia's module. Without it, G-Sync won't be possible. It is likely, however, that the article's author found a way to replicate what AMD has done with FreeSync: a more or less software-based version of G-Sync.
 
#7 ·
Quote:
Originally Posted by DADDYDC650 View Post

I have a GTX 980 and my Dell U2713HM seems to have DP 1.2... I'll give the driver a whirl.
Just wanted to warn you that I have no idea how legit this driver is, whether it's fake or not. I would be careful in your shoes, but you probably know that anyway.

I have a GTX 980 and a Dell U2412M but no DP cable on hand; I will buy one tomorrow. I think the modded driver may add the option to enable G-Sync, but the question is whether it will really be active... I can't wait; I'm buying a DP cable just for this test, though maybe we will know the answer before I make the purchase.
 
#14 ·
We started looking into this over at PCPer. There are some glaring issues with this guy's thinking / results:
  • He claims the Altera FPGA was chosen for its 'security features', but those amount to nothing more than a SHA-1 engine that can be enabled (an FPGA is a programmable device with all sorts of uses, and some of those uses might call for hardware SHA-1 processing). G-SYNC doesn't run the bitstream through SHA-1, and these G-SYNC FPGAs very likely don't touch that optional portion of the chip.
  • His first 'proof' video was shot with a game running in a window. G-SYNC / adaptive sync doesn't work that way (full screen only).
  • Aside from the fact that it is very hard to show differences between VSYNC-on and G-SYNC by pointing a camera at a PC display, close examination of his second video shows the same VSYNC judder whether he selects VSYNC or G-SYNC.
  • Comments are disabled on both of his videos.
It looks like this guy just tweaked the drivers to let him enable G-SYNC in software, regardless of the connected display, but it doesn't seem to actually be doing anything different in the end.
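
On the SHA-1 point, it's worth being precise: SHA-1 is a one-way hash, not a cipher, so a hardware SHA-1 engine is for verifying or authenticating data, not for hiding a video stream. A quick Python illustration (the bitstream bytes here are made up):

Code:
import hashlib

# Placeholder bytes standing in for an FPGA configuration bitstream.
bitstream = b"\x00\x01\x02\x03" * 1024

# A SHA-1 engine reduces any input to a fixed 20-byte digest. That lets you
# check data against a known-good hash, but the input cannot be recovered
# from the digest -- hashing is one-way, which is why SHA-1 isn't encryption.
digest = hashlib.sha1(bitstream).hexdigest()
print(digest)  # always 40 hex characters, regardless of input size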
 
#15 ·
Quote:
Originally Posted by malventano View Post

We started looking into this over at PCPer. There are some glaring issues with this guy's thinking / results:
  • He claims the Altera FPGA was chosen for its 'security features', but those amount to nothing more than a SHA-1 engine that can be enabled (an FPGA is a programmable device with all sorts of uses, and some of those uses might call for hardware SHA-1 processing). G-SYNC doesn't run the bitstream through SHA-1, and these G-SYNC FPGAs very likely don't touch that optional portion of the chip.
  • His first 'proof' video was shot with a game running in a window. G-SYNC / adaptive sync doesn't work that way (full screen only).
  • Aside from the fact that it is very hard to show differences between VSYNC-on and G-SYNC by pointing a camera at a PC display, close examination of his second video shows the same VSYNC judder whether he selects VSYNC or G-SYNC.
  • Comments are disabled on both of his videos.
It looks like this guy just tweaked the drivers to let him enable G-SYNC in software, regardless of the connected display, but it doesn't seem to actually be doing anything different in the end.
Hey Allyn. Nice investigative work as usual.
 
#16 ·
I wonder if this will work on a true DisplayPort bypass board, such as the ones sold on eBay for my LM270WQ1-based display (from an HP Z1 Workstation). I'm not using my original 2013 TCON board, but it is DP 1.2, or maybe even 1.2a; I'm not sure (it's a 2013 board, so it probably has 1.2a, but it's a 60 Hz monitor). I installed an overclocking kit in it last year (it now does up to 110 Hz), so I had to remove the original TCON to fit the kit. One question, though: wouldn't this modification be limited to 60 Hz, because that is all the original TCON does? What good is G-Sync or Adaptive-Sync if your DisplayPort can only do 60 Hz? Wouldn't you also need a DisplayPort TCON board that is itself overclockable?

Someone said this works on the Yamakasis or the Crossovers, and that is essentially what I built myself, except I used the overclocking DVI board from emaxeon.

EDIT: One should read EVERY post before replying to a thread, unlike what I just did... lol. So, TGTBT (too good to be true).
 
#17 ·
Quote:
Originally Posted by Cyclops View Post

Hey Allyn
smile.gif
. Nice investigative work as usual.
The guy posted a 240 FPS video, and analyzing it does definitely show a true 50 FPS frame rate when the G-SYNC option is selected in the pendulum demo, but there are still a few questions:

- We don't know which actual panel is in use in that video. It could have been a real G-SYNC panel with the driver modified to list it under a different name.
- On the flip side, if this is truly working as the video / post suggests, it could only partially work and would start doing weird things once FPS dropped below 30 (part of the G-SYNC module's responsibilities is to handle the forced refreshes that help avoid visible flicker at very low frame rates; see the sketch below). The point being that it would not be the same experience you get with a true G-SYNC panel.
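
A rough Python sketch of the frame-repeating idea behind that low-FPS handling. The 30 Hz panel minimum and the multiplier logic are illustrative assumptions, not Nvidia's actual algorithm:

Code:
# Illustrative low-framerate handling: below the panel's minimum refresh
# rate, re-scan each frame enough times to keep the panel in its range.
PANEL_MIN_HZ = 30.0   # assumed panel minimum -- varies by panel
PANEL_MAX_HZ = 144.0  # assumed panel maximum

def effective_refresh(game_fps: float) -> float:
    """Rate at which the panel is actually refreshed for a given game FPS."""
    if game_fps >= PANEL_MIN_HZ:
        return min(game_fps, PANEL_MAX_HZ)  # panel simply tracks the game
    n = 2  # repeat each frame n times so n * fps lands inside the range
    while game_fps * n < PANEL_MIN_HZ:
        n += 1
    return game_fps * n

for fps in (50, 30, 24, 10):
    print(f"{fps} FPS game -> panel refreshed at {effective_refresh(fps):.0f} Hz")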

I'm curious whether anyone here has the hardware listed in that post and is ambitious enough to test it and report back. We tried it on one of the systems at the office that seemed to fit the bill, but it did not work (no G-SYNC option appeared with the modified driver).
 
#18 ·
Quote:
Originally Posted by malventano View Post

- We don't know which actual panel is in use in that video. It could have been a real G-SYNC panel with the driver modified to list it under a different name.
He claimed it was recorded on a laptop with a GTX 980M.
If that is to be believed, then isn't it possible that by modding the driver he could achieve something to the effect of FreeSync?

I mean, the DP 1.2a Adaptive-Sync standard was built upon existing variable-refresh technology that has been part of the Embedded DisplayPort (eDP) standard since 2009.

I could be entirely mistaken, but laptops with eDP have supported dynamic refresh rates for a fairly long time, though more as a power-saving feature; there's simply been no interest in using it for gaming, so software to control or make use of it has been virtually non-existent.
With the appearance of dynamic-refresh-rate software from both Nvidia and AMD, laptops that support the eDP standard could use a modded driver to enable something to the effect of FreeSync or G-Sync (the sketch below shows the basic timing idea).
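
A toy Python comparison of the timing idea common to all of these variable-refresh schemes: a fixed-refresh display makes a late frame wait for the next scanout tick, while an adaptive-sync display stretches the blanking interval until the frame is ready. The render-completion times are invented:

Code:
# Toy comparison: when does each frame appear on screen with a fixed
# 60 Hz refresh (vsync on) vs. an adaptive-sync display?
FIXED_PERIOD_MS = 1000.0 / 60.0  # 16.67 ms between fixed scanout ticks

render_done_ms = [5.0, 30.0, 52.0, 95.0]  # invented frame completion times

for done in render_done_ms:
    # Fixed refresh: the frame waits for the next 16.67 ms boundary.
    next_tick = (int(done // FIXED_PERIOD_MS) + 1) * FIXED_PERIOD_MS
    # Adaptive sync: vblank is held until the frame is ready, so scanout
    # can begin (almost) as soon as rendering completes.
    print(f"frame ready at {done:5.1f} ms | fixed 60 Hz: {next_tick:6.1f} ms"
          f" | adaptive sync: {done:5.1f} ms")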

IIRC AMD originally demo'd FreeSync on laptops.

I don't buy the conspiracy-theory explanation given by GameNab.net about the G-Sync module, or that all DP 1.2 displays magically support Adaptive-Sync, but he could be right about eDP laptops supporting it.
 
#23 ·
Quote:
Originally Posted by malventano View Post

We started looking into this over at PCPer. There are some glaring issues with this guy's thinking / results:
  • He claims the Altera FPGA was chosen for its 'security features', but those amount to nothing more than a SHA-1 engine that can be enabled (an FPGA is a programmable device with all sorts of uses, and some of those uses might call for hardware SHA-1 processing). G-SYNC doesn't run the bitstream through SHA-1, and these G-SYNC FPGAs very likely don't touch that optional portion of the chip.
  • His first 'proof' video was shot with a game running in a window. G-SYNC / adaptive sync doesn't work that way (full screen only).
  • Aside from the fact that it is very hard to show differences between VSYNC-on and G-SYNC by pointing a camera at a PC display, close examination of his second video shows the same VSYNC judder whether he selects VSYNC or G-SYNC.
  • Comments are disabled on both of his videos.
It looks like this guy just tweaked the drivers to let him enable G-SYNC in software, regardless of the connected display, but it doesn't seem to actually be doing anything different in the end.
Is it not more plausible that it is in fact used for encryption? The FPGA obviously has logic programmed by Nvidia, and it seems reasonable to protect it from people intercepting what the scaler is doing. When enabled, it could create an identity key to let the driver know the G-Sync module is present.

Certainly a lot more likely than anything the blogger is suggesting; the same basic functionality could have been achieved on a much cheaper ASIC.
 
#24 ·
Quote:
Originally Posted by Silent Scone View Post

Is it not more plausible that it is in fact used for encryption? The FPGA obviously has logic programmed by Nvidia, and it seems reasonable to protect it from people intercepting what the scaler is doing. When enabled, it could create an identity key to let the driver know the G-Sync module is present.

Certainly a lot more likely than anything the blogger is suggesting; the same basic functionality could have been achieved on a much cheaper ASIC.
Well, as far as FPGAs go, that hardware crypto engine would have to be applied to the incoming and/or outgoing data streams. If it were the incoming stream, there would need to be an equivalent hardware crypto engine in the GPU pipeline (going back to the earliest GPU model that supports G-SYNC). If it were on the outgoing LVDS stream, everything on the panel would be garbage (panels have no decryption logic).

It's just not something that would be practical for a data stream with such high throughput. In fact, that SHA-1 unit might not even be able to handle that high a data rate in the first place (see the rough numbers below).

Besides, anyone with the equipment to capture DP streams at that rate is probably just as capable of dumping the FPGA's ROM if they really wanted to, something that has nothing to do with the SHA-1 engine. The point being that if someone really wanted to reverse-engineer that tech, they could.
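
Rough numbers behind the throughput point, in Python; the resolution, refresh rate, and bit depth are just representative figures for a G-SYNC-class display:

Code:
# Back-of-the-envelope pixel data rate for a representative G-SYNC display.
width, height = 2560, 1440   # representative resolution
refresh_hz = 144             # representative refresh rate
bits_per_pixel = 24          # 8 bits per RGB channel, ignoring blanking

bits_per_second = width * height * refresh_hz * bits_per_pixel
print(f"~{bits_per_second / 1e9:.1f} Gbit/s of raw pixel data")
# ~12.7 Gbit/s before protocol overhead -- any per-pixel crypto engine
# in the path would have to sustain at least this rate continuously.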
 
#25 ·
As a Dell U2713HM and 900-series owner, I am definitely intrigued to see where this goes, but I don't think I'm willing to jump in this early.
 