
[Various] GeForce 337.50 "wonder driver" Beta release - Page 79

post #781 of 1057
Quote:
Originally Posted by SlackerITGuy View Post

It already has.

Delusion much?

The changes come from the work NVIDIA/Intel/Microsoft have been putting into DirectX 12; key areas in the driver were identified for optimisation that would have a down-level effect, and those have been implemented, with more to come.

MANTLE HAD NO BEARING ON THIS.

Changes such as these are not implemented in a three- or even six-month time frame; Mantle wasn't even a fly on the wall when NVIDIA would have put people to work on this.
Quote:
Originally Posted by error-id10t View Post

Anyone else confused about the shader cache option? Why would this increase performance when it saves the data to disk, which is slower than RAM? I can imagine that might be the case if you lack RAM or are running a slow CPU, but take an 8GB RAM setup with an Ivy Bridge or Haswell; you're not going to be in that situation, IMO.

The cache(s) are loaded into RAM when the driver loads the game they correspond to.

The data cached is likely the data that is sent to the GPU for rendering after batch/draw processing has been performed by the CPU.
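
To put the idea concretely, here's a rough sketch of how I'd picture that working. This is purely illustrative on my part, not NVIDIA's actual driver code, and the class/function names are made up; the point is that the disk copy only exists so compiled results survive between sessions, and it's read into RAM once when the game loads, so per-frame lookups never touch the disk.
Code:
// Illustrative sketch only -- a guess at the general idea, not NVIDIA's
// actual shader cache. Compiled shader binaries persist on disk between
// sessions and are loaded into RAM once when the driver attaches to the
// game, so lookups during gameplay never hit the disk.
#include <cstdint>
#include <fstream>
#include <iterator>
#include <string>
#include <unordered_map>
#include <vector>

class ShaderCache {
public:
    explicit ShaderCache(std::string dir) : dir_(std::move(dir)) {}

    // Done once at game load: copy every cached binary from disk into RAM.
    void LoadIntoRam(const std::vector<std::uint64_t>& hashes) {
        for (std::uint64_t h : hashes) {
            std::ifstream f(PathFor(h), std::ios::binary);
            if (!f) continue;
            ram_[h].assign(std::istreambuf_iterator<char>(f),
                           std::istreambuf_iterator<char>());
        }
    }

    // At shader/pipeline creation time: reuse the RAM copy if present,
    // otherwise pay the CPU compile cost once and persist it for next run.
    const std::vector<char>& GetOrCompile(std::uint64_t hash,
                                          const std::string& source) {
        auto it = ram_.find(hash);
        if (it != ram_.end()) return it->second;   // fast path: RAM hit
        std::vector<char> bin = Compile(source);   // slow path: compile
        std::ofstream out(PathFor(hash), std::ios::binary);
        out.write(bin.data(), static_cast<std::streamsize>(bin.size()));
        return ram_[hash] = std::move(bin);
    }

private:
    std::string PathFor(std::uint64_t h) const {
        return dir_ + "/" + std::to_string(h) + ".bin";
    }
    static std::vector<char> Compile(const std::string& src) {
        return {src.begin(), src.end()};  // stand-in for a real shader compiler
    }

    std::string dir_;
    std::unordered_map<std::uint64_t, std::vector<char>> ram_;
};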
Edited by diceman2037 - 4/8/14 at 5:03pm
post #782 of 1057
Quote:
Originally Posted by skupples View Post

We appreciate the work, Dubbed!

People are going to complain no matter what the issue is, especially when select "hardware media" sites start piling on. I will make a few things known that may not have been known before. Total War is seeing gains on SLI systems because NVIDIA finally added an SLI profile.


Anyways... It doesn't matter which team you cheer for (you should only cheer for your own performance, not for either company). Nvidia & AMD have always been @ war, but the last few years were rather stagnant; Mantle has done a good job @ shaking up the industry, which means all of us benefit.

Yup - I remember saying when Mantle came out:
What will Nvidia bring?

At the time, ShadowPlay > Mantle (due to the FPS loss of Fraps, etc.).
Then ShadowPlay's workings were utilised by other software, like OBS, meaning AMD users could get the same low-hit FPS on recordings.
Then Nvidia came out with this driver.
In less than three months, I can bet my rig that AMD will utilise these optimisations for their cards in DX games, as not EVERYTHING uses Mantle.

Thus AMD has two jobs:
-Utilise Nvidia's optimisations on their own cards.
-Increase Mantle performance and the game experience (i.e. a dedicated dev team for BF4, etc.)

In the meantime, Nvidia will continue optimising their drivers and then release a stable version, which will bring along other fixes.
All in all, I love it. I don't care if you've got AMD or Nvidia; we're all winners.
post #783 of 1057
Quote:
Originally Posted by diceman2037 View Post

Delusion much?

The changes come from the work NVIDIA/Intel/Microsoft have been putting into DirectX 12; key areas in the driver were identified for optimisation that would have a down-level effect, and those have been implemented, with more to come.

MANTLE HAD NO BEARING ON THIS.

Changes such as these are not implemented in a three- or even six-month time frame; Mantle wasn't even a fly on the wall when NVIDIA would have put people to work on this.

Mantle has been in development for almost 4 years now.

The Star Swarm benchmark is ~ 4 months old.

My guess is that Nvidia has been working on this for maybe three months.

So you're trying to tell us Nvidia has been working on this small update for over three years?
post #784 of 1057
Quote:
Originally Posted by Totally Dubbed View Post

Then ShadowPlay's workings were utilised by other software, like OBS, meaning AMD users could get the same low-hit FPS on recordings.

What's the story behind this? I know Nvidia gave OBS a license to the NVENC libraries, but I haven't read about AMD GPUs.
post #785 of 1057
Quote:
Originally Posted by Exilon View Post

What's the story behind this? I know Nvidia gave OBS a license to the NVENC libraries, but I haven't read about AMD GPUs.

Well, I would have presumed OBS could be used on AMD GPUs? Am I mistaken?
I don't use OBS; I only use ShadowPlay.
post #786 of 1057

Can anyone with 780 Ti SLI confirm this?
 

From HardOCP:

 

With the GeForce GTX 780 Ti we found the peak consistent clock speed on both GPUs went up to 1019MHz while gaming. This is higher than the boost clock on a GTX 780 Ti which is 928MHz. As we posted on the previous page, this seems slightly higher than we've tested in the past. Normally we've seen the GPU hit 1006MHz while gaming, but now it is at 1019MHz with this newest driver. We also noticed the temperature of the GPU was higher, at 87c, versus 84c on previous drivers. This higher temperature threshold has allowed the frequency to go higher, hence the 1019MHz. In any case, this means the GTX 780 Ti SLI configuration was providing us higher performance for this round of testing, so it definitely got to give us its best shot at stock performance without overclocking.

post #787 of 1057
Quote:
Originally Posted by Xuper View Post

With the GeForce GTX 780 Ti we found the peak consistent clock speed on both GPUs went up to 1019MHz while gaming. This is higher than the boost clock on a GTX 780 Ti which is 928MHz. As we posted on the previous page, this seems slightly higher than we've tested in the past. Normally we've seen the GPU hit 1006MHz while gaming, but now it is at 1019MHz with this newest driver. We also noticed the temperature of the GPU was higher, at 87c, versus 84c on previous drivers. This higher temperature threshold has allowed the frequency to go higher, hence the 1019MHz. In any case, this means the GTX 780 Ti SLI configuration was providing us higher performance for this round of testing, so it definitely got to give us its best shot at stock performance without overclocking.

Hmmm.

HardOCP is certainly AMD-biased though, so I'd be very interested to see whether the 82-84°C temperature limit and the stock Kepler max boost have indeed been loosened...
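
For reference, my mental model of the temperature-target side of GPU Boost is roughly the loop below. It's a deliberately simplified sketch, not NVIDIA's actual algorithm, and the 13 MHz bin size and clock limits are assumptions for illustration: the clock keeps stepping up one bin at a time until the temperature target is reached, so raising the target from ~84°C to ~87°C leaves room for an extra bin or so (hence 1006 MHz vs 1019 MHz).
Code:
// Deliberately simplified temperature-targeted boost loop -- NOT NVIDIA's
// actual GPU Boost algorithm; the 13 MHz bin size and the clock limits
// below are assumptions used purely for illustration.
#include <cstdio>
#include <initializer_list>

int NextClockMHz(int clockMHz, double tempC, double tempTargetC,
                 int baseMHz, int maxBoostMHz) {
    const int binMHz = 13;  // assumed boost step ("bin") granularity
    if (tempC < tempTargetC && clockMHz + binMHz <= maxBoostMHz)
        return clockMHz + binMHz;   // below target: step up one bin
    if (tempC >= tempTargetC && clockMHz - binMHz >= baseMHz)
        return clockMHz - binMHz;   // at/over target: back off one bin
    return clockMHz;                // otherwise hold the current clock
}

int main() {
    // With a lower temperature target the loop tops out sooner; with a
    // higher target there is headroom for another bin before it backs off.
    for (double target : {84.0, 87.0}) {
        int clock = 928;  // GTX 780 Ti rated boost clock
        for (double temp : {70.0, 76.0, 80.0, 83.0, 84.5, 86.0, 86.9, 88.0})
            clock = NextClockMHz(clock, temp, target, 876, 1019);
        std::printf("target %.0fC -> settles around %d MHz\n", target, clock);
    }
    return 0;
}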
post #788 of 1057
Quote:
Originally Posted by skupples View Post

Mantle has been in development for almost 4 years now.

The Star Swarm benchmark is ~ 4 months old.

My guess is that Nvidia has been working on this for maybe three months.

So you're trying to tell us Nvidia has been working on this small update for over three years?

1. False: Mantle has been in development since mid-late 2012.

2. Star Swarm was not initially even going to use Mantle.

3/4. Nvidia has been working on these optimisations for well over 12 months. They would have profiled the drivers numerous times, then allocated a small team to begin preliminary work on identifying the areas and ideas that could be used, before presenting findings and case research to management so that work proper could begin on introducing these into the codebase.
post #789 of 1057
Quote:
Originally Posted by Alatar View Post

Maarten, really now. Every user and every site has reported huge gains in Star Swarm. People have had runs with more units, more batches, more everything, yet still a 50% improvement over the old results.
So if the CPU overhead decreased significantly under this driver, can you explain to me why it did nothing for me in CPU-bottlenecked DX11 games? This driver is a flop except for high-end SLI, it seems. Seriously, even in RTS games there were no gains; the games still bogged down (RA3 and Civ5).
post #790 of 1057
Quote:
Originally Posted by Xuper View Post

Can anyone with 780 Ti SLI confirm this?

 
From HardOCP:

With the GeForce GTX 780 Ti we found the peak consistent clock speed on both GPUs went up to 1019MHz while gaming. This is higher than the boost clock on a GTX 780 Ti which is 928MHz. As we posted on the previous page, this seems slightly higher than we've tested in the past. Normally we've seen the GPU hit 1006MHz while gaming, but now it is at 1019MHz with this newest driver. We also noticed the temperature of the GPU was higher, at 87c, versus 84c on previous drivers. This higher temperature threshold has allowed the frequency to go higher, hence the 1019MHz. In any case, this means the GTX 780 Ti SLI configuration was providing us higher performance for this round of testing, so it definitely got to give us its best shot at stock performance without overclocking.

I haven't got a 780 or 780 Ti, but I have SLI 680s (which, if you take VRAM out of the equation, beat a 780 hands down and compete with a 780 Ti's performance). If the boost was linked to raising the cards' clock speeds, then surely everyone would see it, right?
That's my presumption, and if that's the case, I can safely say it's utter crap:
http://www.overclock.net/t/1480050/new-nvidia-337-50-drivers-battlefield-4-benchmarks-ht-on-vs-ht-off

Look at my screenshots at the end of my OP.
You'll see full GPU-Z readings; they're all the same.