Download the original 346.87 driver and install it, or copy nvlddmkm.sys from the 346.87 package into the extracted 347.25 \Display.Driver\ folder, replacing the existing file, and then install the driver.
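A rough sketch of that file swap, with illustrative paths (the real extracted packages live wherever your NVIDIA installer unpacked them, e.g. under C:\NVIDIA\, and the copy would normally be done in Explorer or cmd):

```shell
# Illustrative layout: two extracted driver packages side by side.
mkdir -p 346.87/Display.Driver 347.25/Display.Driver
: > 346.87/Display.Driver/nvlddmkm.sys   # stand-in for the 346.87 kernel driver
# Overwrite the 347.25 kernel module with the 346.87 one before installing.
cp 346.87/Display.Driver/nvlddmkm.sys 347.25/Display.Driver/nvlddmkm.sys
ls 347.25/Display.Driver
```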
____________________________________
Original post:
I found an article on Gamenab.net about G-Sync: a modified Nvidia 347.25 driver that supposedly allows G-Sync on monitors with DisplayPort 1.2. I still don't believe it, so I wanted your opinion. I'll also try to find out for myself what the author has modified.
G-Sync requires a scaler in the monitor with Nvidia's module; without it, G-Sync isn't possible. It is likely, however, that what the article describes is a way to replicate what AMD has done with FreeSync: a more or less software-based version of G-Sync.
Just wanted to warn you that I have no idea how legitimate this driver is, whether it's fake or not. I would be careful in your shoes; you probably understand this anyway.
I have a GTX 980 and a Dell U2412M but no DP cable on hand; I'll buy one tomorrow. I think the modded driver may add the option to enable G-Sync, but the question is whether it will really be activated. I can't wait for tomorrow, when I'll buy a DP cable just for this test. Maybe we'll know the answer before I make the purchase.
Downloaded the drivers. Virus scans show no threats. All of my monitors are connected with DP, so I'll give this driver a go. It would be pretty epic if it worked, but the old saying goes - if it's too good to be true, it probably is.
We started looking into this over at PCPer. There are some glaring issues with this guy's thinking and results:
He claims the Altera FPGA was chosen for its 'security features', but that is really nothing more than an optional SHA-1 engine (an FPGA is meant to be a programmable device with all sorts of uses, and some of those uses might include wanting hardware SHA-1 processing). G-SYNC doesn't SHA-1-encrypt the bitstream, and these G-SYNC FPGAs are very likely not touching that optional portion of the chip.
His first 'proof' video was shot with a game running in a window. G-SYNC / adaptive sync doesn't work that way (full screen only).
Aside from the fact that it is very hard to actually show differences between VSYNC-on and G-SYNC when pointing a camera at a PC display, closely examining his second video shows the same type of VSYNC judder whether he selects VSYNC or GSYNC.
Comments are disabled for both of his videos.
It looks like this guy just tweaked the drivers to let him enable GSYNC in software, regardless of the connected display, but it doesn't seem to actually be doing anything different in the end.
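The judder point above can be put into numbers with a quick back-of-envelope simulation (the figures are illustrative): a game rendering a steady 50 fps on a 60 Hz VSYNC'd display has to wait for whole refresh intervals, so on-screen frame intervals come out uneven, whereas a real variable-refresh display would show a constant 20 ms.

```python
import math

# 50 fps game on a 60 Hz display with VSYNC on: each frame is shown at the
# next refresh boundary after it finishes rendering, so on-screen intervals
# are quantized to multiples of ~16.7 ms instead of a steady 20 ms.
refresh_ms, frame_ms = 1000 / 60, 20.0
present = [math.ceil(i * frame_ms / refresh_ms) * refresh_ms for i in range(1, 8)]
intervals = [round(b - a, 1) for a, b in zip(present, present[1:])]
print(intervals)  # a mix of 16.7 ms and 33.3 ms gaps: visible judder
```

With true G-SYNC every interval would simply equal the render time, which is why the presence of that alternating pattern in both of his videos is suspicious.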
I wonder if this will work on a true DisplayPort bypass board, like the ones sold on eBay, for my LM270WQ1-based display (from an HP Z1 Workstation)? I'm not using my original 2013 TCON board, but it is DP 1.2, or maybe even 1.2a, not sure (it's a 2013 board, so probably 1.2a, but a 60 Hz monitor). I installed an overclocking kit in it last year (it now does up to 110 Hz), so I had to remove the original TCON to fit the kit. But one question: wouldn't using this modification be limited to 60 Hz, since that is what the original TCON supports? What good is G-Sync or Adaptive-Sync if your DisplayPort can only do 60 Hz? Wouldn't you also need a DisplayPort TCON board that is overclockable?
Someone said this works on the Yamakasi or Crossover monitors, and that is essentially what I built myself, except I used the overclocking DVI board from emaxeon.
EDIT: One should read EVERY post before replying to a thread, unlike what I just did... lol, so TGTBT (too good to be true).
The guy posted a 240 FPS video, and analyzing it on a high-FPS display does definitely show a true 50 FPS frame rate when the GSYNC option is selected in the pendulum demo, but there are still a few questions:
- We don't know which actual panel is in use in that video. It could have been a real GSYNC panel with the driver modified to list it as a different name.
- On the flip side, if this is truly working as the video/post suggests, it could be only partially working and would start doing weird things once FPS dropped below 30 (part of the GSYNC module's responsibilities is to handle the forced refreshes that avoid visible flicker at very low frame rates). The point being that it would not be the same experience you get with a true GSYNC panel.
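The below-30-FPS behavior mentioned above can be sketched roughly like this; the 33.3 ms maximum refresh interval and the frame-repeating logic are assumptions for illustration, not NVIDIA's actual algorithm:

```python
# Hedged sketch of low-framerate compensation: when the rendered frame
# interval exceeds the panel's maximum refresh interval, repeat the previous
# frame so the panel never sits unrefreshed long enough to flicker.
def refreshes_for_frame(frame_interval_ms, max_refresh_interval_ms=33.3):
    """Return how many panel refreshes cover one rendered frame."""
    refreshes = 1
    while frame_interval_ms / refreshes > max_refresh_interval_ms:
        refreshes += 1
    return refreshes

print(refreshes_for_frame(25.0))   # 40 fps: one refresh per frame
print(refreshes_for_frame(50.0))   # 20 fps: each frame shown twice
```

A driver-only hack with no module doing this bookkeeping would have nothing to fall back on below the panel's minimum refresh rate.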
I'm curious if anyone here has the hardware listed in that post and is ambitious enough to test it and report back. We tried it on one of the systems at the office that seemed to fit the bill, but it did not work (no GSYNC option appeared with the modified driver).
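For anyone repeating the 240 FPS analysis, the counting logic is simple; this sketch just simulates a high-speed camera sampling a 50 fps display and counts distinct frames (all parameters are illustrative):

```python
# Simulate a 240 fps camera pointed at a display that presents a new frame
# every 20 ms (50 fps): each captured sample sees whichever display frame
# is current, so counting distinct frames recovers the true frame rate.
capture_fps, display_fps, seconds = 240, 50, 1
samples = [int(i / capture_fps * display_fps) for i in range(capture_fps * seconds)]
distinct = len(set(samples))
print(distinct)  # 50 distinct display frames in 1 s of capture
```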
Regarding the point that we don't know which panel is in use in that video: he claimed it was recorded on a laptop with a GTX 980M.
If that is to be believed, then isn't it possible that by modding the driver he could achieve something to the effect of FreeSync?
I mean, the Adaptive-Sync feature added in DP 1.2a was built on variable-refresh support that has been part of the Embedded DisplayPort (eDP) standard since 2009.
I could be entirely mistaken, but laptops with eDP have supported dynamic refresh rates for a fairly long time, mostly as a power-saving feature; there's simply been no interest in using it for gaming, so software to control it or make use of it was virtually non-existent.
With the appearance of dynamic-refresh-rate software from both Nvidia and AMD, laptops that support the eDP standard could use a modded driver to enable something to the effect of FreeSync or G-Sync.
IIRC AMD originally demo'd FreeSync on laptops.
I don't buy the conspiracy-theory explanation given by GameNab.net about the G-Sync module, or that all DP 1.2 displays magically support Adaptive-Sync, but he could be right about eDP laptops supporting it.
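For what it's worth, a driver can check whether a sink actually advertises the capability Adaptive-Sync builds on. In DisplayPort's DPCD, register 0x0007 bit 6 (MSA_TIMING_PAR_IGNORED, as named in the Linux DRM headers) says the sink can ignore the main-stream-attribute timing. The read function below is a hypothetical stand-in for an AUX-channel read:

```python
# DPCD constants as defined in Linux's drm_dp_helper.h.
DP_DOWN_STREAM_PORT_COUNT = 0x0007
DP_MSA_TIMING_PAR_IGNORED = 1 << 6

def sink_supports_adaptive_sync(read_dpcd):
    """read_dpcd(addr) -> int is a hypothetical AUX-channel read helper."""
    return bool(read_dpcd(DP_DOWN_STREAM_PORT_COUNT) & DP_MSA_TIMING_PAR_IGNORED)

# Simulated sinks: an eDP panel advertising the bit vs. a plain monitor.
print(sink_supports_adaptive_sync(lambda addr: 0x41))  # True
print(sink_supports_adaptive_sync(lambda addr: 0x01))  # False
```

If the modded driver skipped a check like this, it could show a G-Sync toggle on displays that can't actually vary their refresh, which matches what people are reporting.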
The monitor he tests on, the S22D390H, is an HDMI/VGA (D-Sub) monitor. I have something pretty close to that, an S22A350H. I can test this if there's interest, but I have a hard time believing it will actually work.
The second revision of the modded drivers will be out soon. I believe this guy is onto something; we need more tests, I guess.
Gonna test today with a GTX 980, a U2412M, and a DP cable.
I've properly tried this on my Asus ROG laptop (GTX 860M), which supposedly supports eDP/VESA Adaptive-Sync, and it doesn't work. No G-Sync option, and the G-Sync pendulum demo doesn't even let me select G-Sync.
Is it not more plausible that it is in fact used for encryption? The ASIC obviously has logic programmed by NVIDIA; it seems reasonable to protect it from people intercepting what the scaler is doing. When enabled, it creates an identity key to let the driver know the G-Sync module is present.
Certainly a lot more likely than anything the blogger is suggesting; otherwise the same thing could have been achieved on a much cheaper ASIC.
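Purely as an illustration of the kind of identity handshake being speculated about here (nothing in this sketch reflects NVIDIA's actual protocol, and the secret is made up): the driver could send a nonce, the module's SHA-1 engine could answer with a keyed digest, and the driver could verify it before exposing the G-Sync option.

```python
import hashlib
import hmac
import os

SHARED_SECRET = b"illustrative-secret"   # assumption, not a real key

def module_respond(nonce: bytes) -> bytes:
    # What an on-module SHA-1 engine could compute over the driver's nonce.
    return hmac.new(SHARED_SECRET, nonce, hashlib.sha1).digest()

def driver_verify(nonce: bytes, response: bytes) -> bool:
    # Driver-side check that the response came from something holding the key.
    expected = hmac.new(SHARED_SECRET, nonce, hashlib.sha1).digest()
    return hmac.compare_digest(expected, response)

nonce = os.urandom(16)
print(driver_verify(nonce, module_respond(nonce)))  # True
```

A presence check like this would use the SHA-1 block once per handshake, which is a far cry from encrypting the whole video stream.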
Well, as far as FPGAs go, that hardware crypto engine would have to be applied to the incoming and/or outgoing data streams. If it were the incoming stream, there would need to be an equivalent hardware crypto engine in the GPU pipeline (going back to the earliest GPU model that supports GSYNC). If it were the outgoing LVDS stream, everything on the panel would be garbage (panels have no decryption logic).
It's just not something that would be practical for a data stream with such high throughput. In fact, that SHA-1 unit might not even be able to handle that high a data rate in the first place.
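To put rough numbers on that (all figures here are illustrative, not specs pulled from the GSYNC module): the raw pixel stream of a typical GSYNC panel dwarfs the throughput commonly quoted for embedded hash engines.

```python
# Back-of-envelope: raw pixel data rate of a 2560x1440 @ 144 Hz stream.
width, height, refresh_hz, bits_per_pixel = 2560, 1440, 144, 24
stream_gbps = width * height * refresh_hz * bits_per_pixel / 1e9

# Embedded SHA-1 blocks are commonly quoted in the low single-digit
# Gbit/s range (assumed figure for illustration only).
sha1_engine_gbps = 2.0

print(f"stream: {stream_gbps:.1f} Gbit/s, SHA-1 engine: ~{sha1_engine_gbps} Gbit/s")
```

Even before protocol overhead, the stream is several times beyond what such an engine could keep up with, which supports the point that it isn't hashing the video data.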
Besides, if someone had the necessary equipment to capture DP streams at that rate, they are probably just as capable of dumping the FPGA ROM if they really wanted to, something which has nothing to do with the SHA-1 engine. The point being that if someone really wanted to reverse-engineer this tech, they could.
Right, so at best it's a handshake, or it's just not active at all in this instance.