
[TT] NVIDIA should launch its next-gen Pascal GPUs with HBM2 in 2H 2016 - Page 25

post #241 of 724
Quote:
Originally Posted by looniam

you either still haven't read the article or are just plain misreading it.
I responded to what you wrote and what you quoted. I read the conclusion that you quoted in your post, which implores us gamers to "embrace the open tool" that is GameWorks, and immediately decided to skip the rest. I would be eternally grateful if you would quote the bit that specifies that only changes to CUDA code need to be approved by NVIDIA; after that we will just have to agree to disagree.
post #242 of 724
sorry, but i'll pass on the promise of your eternal gratitude. plus, i only linked it after seeing a side discussion on GameWorks, and i don't care to derail the thread further.

read the article if you care to, but i'm sure that won't stop you from calling it a fanboy blog w/o knowing the facts.

good luck with that.
loon 3.2 (18 items)
CPU: i7-3770K | Motherboard: Asus P8Z77-V Pro | Graphics: EVGA 980 Ti SC+ | RAM: 16 GB PNY DDR3 1866
Hard Drives: PNY 1311 240 GB, 1 TB Seagate, 3 TB WD Blue | Optical Drive: DVDRW+/-
Cooling: EKWB P280 kit, EK-VGA Supremacy | OS: Windows 10 | Monitor: LG 24MC57HQ-P
Keyboard: Ducky Zero [blues] | Power: EVGA SuperNova 750 G2 | Case: Stryker M [hammered and drilled] | Mouse: Corsair M65
Audio: SB Recon3D, Klipsch ProMedia 2.1
post #243 of 724
Quote:
Originally Posted by looniam

sorry, but i'll pass on the promise of your eternal gratitude. plus, i only linked it after seeing a side discussion on GameWorks, and i don't care to derail the thread further.

read the article if you care to, but i'm sure that won't stop you from calling it a fanboy blog w/o knowing the facts.

good luck with that.
I asked you to quote it because it isn't there; that was entirely your own input, wasn't it?
You're right about the thread derailment; let's leave it there.
post #244 of 724
It has been hinted to me that the GeForce name will stay but the current naming convention will not, so a GeForce 1080 will hopefully not become a reality.
post #245 of 724
Quote:
Originally Posted by PlugSeven

I asked you to quote it because it isn't there; that was entirely your own input, wasn't it?
You're right about the thread derailment; let's leave it there.
it's already quoted, but since you didn't see it the first time . . . yeah, just keep making unfounded accusations.
post #246 of 724
Quote:
Originally Posted by looniam

it's already quoted, but since you didn't see it the first time . . . yeah, just keep making unfounded accusations.
Would that be the first paragraph of what you quoted, where the conditions under which the developer is allowed to modify the code are conveniently missing?
Quote:
The rest of it is available for use independent of graphics hardware, but it’s up to the developer to decide how that’s best implemented and work through their QA/debugging to tune those issues before it gets to the end user.
The developer does not have the kind of freedom the author is trying to suggest here when it comes to using this "open tool."
post #247 of 724
it's totally funny that you quoted one sentence and completely ignored the one before it - the one about how GameWorks is CUDA based, which i supposedly just "made up".

let me make this clear:

WE are done here.
post #248 of 724
Quote:
Originally Posted by looniam

you either still haven't read the article or are just plain misreading it.

yes, changes need NV's approval because it's CUDA! and who owns that?

considering every post you've made when GameWorks is mentioned, i doubt you have a trust issue; it's just plain hate. i am not saying people have to like it, but if you would read about the process w/o a predisposition toward dislike, you would see that a lot of this "gimpworks" conspiracy is . . . just FUD.

i.e.:
"gameworks is just nvidia's way of screwing over kepler and AMD owners."

false: because GameWorks uses CUDA compute (which, again, NV owns), AMD cards wouldn't be able to run it, and since Maxwell's CUDA compute is ~40% better per core than Kepler's, yeah, it will not run as well.

AMD has async compute. i see nothing wrong with them working with game devs to use it often in DX12 titles. if enough titles that i want to play use it, then i will keep that in mind for my next purchase.

E:
typos

EII:

i missed this:
ah . . he doesn't own both brands . . read the article.

Of course everyone knew that Maxwell is more efficient than Kepler, but the progressive degradation of the 780 Ti and the GK110 Titan cannot be explained away by "Maxwell is just better at compute processing than Kepler", based on the actual architectural differences, mathematically speaking. In any event, I have moved on from Nvidia and will not buy another product that has Nvidia's fingerprints all over it, even if that means moving to consoles permanently. (I have been playing some games on the Xbox One for an hour or two on weekends over the last few months, and quite frankly the entertainment for me is not all that different from playing on the PC, if not better, with fewer hardware headaches. Lol, and I will be trying Rise of the Tomb Raider as soon as I have some spare time.) But I think this guy captured Nvidia customers' sentiment quite well. Although it is a bit dated as it relates to The Witcher 3 (and Nvidia threw those users a bone, from what I understand; I haven't touched W3 or any other new GameWorks games since last year's W3 issues, but will try a couple with the Fury Nitro, time permitting), the issues with lack of driver optimizations still remain. (I didn't write this myself, as I neither have the time nor care enough to write a lengthy complaint... I just move on to the next vendor.)

https://forums.geforce.com/default/topic/834132/maxwell-v-kepler-it-will-affect-nvidia-900-series-card-users-sooner-rather-than-later/

“Please allow me to copy and paste the text from one of the more popular threads, as it clearly outlines the various reasons Kepler users believe NVIDIA has essentially stopped supporting Kepler:

NVIDIA, this is very concerning.

I have spent my hard earned savings on your cards since I was 10 years old - RIVA TNT, RIVA TNT 2, GeForce 2 GTS, GeForce 7800 GT, GeForce 9800 GTX, GeForce 9800 GX2, GeForce GTX 480 (SLI), GeForce GTX 680, and most recently the GeForce GTX 780 ti.

In November of 2013, you introduced a new premium card for gaming - the NVIDIA GeForce GTX 780 Ti. This card lacked the floating-point precision power of the NVIDIA GeForce GTX TITAN released in February of 2013, but it was the most powerful single-card solution with respect to video processing available at that time. As a loyal PC gaming enthusiast, I was intrigued by the new class of card. The $750.00 price tag was large, but not entirely unreasonable.

What is entirely unreasonable is the oversight, for lack of a better word, that has occurred with the release of the NVIDIA WHQL "Game Ready" Driver 352.86 and its support for the Kepler architecture buried at the foundation of the NVIDIA GeForce GTX 780 ti (and other 700 series cards). I will stop short of the accusations contained within this thread, which state that NVIDIA has crippled the performance of Kepler-based cards with the past few driver releases.

Optimization of a PC game is a responsibility shared by both the game developer and the video card manufacturer. However, there is little-to-no excuse for the performance found when one with a Kepler-based NVIDIA card starts up the newly released The Witcher 3.

My PC specs: Intel i5-3570k @ 4.4GHz, NVIDIA GeForce GTX 780 ti, Samsung 840 Pro 256GB SSD, G.Skill Ripjaw 1600MHz 8GB memory, and ACER XB270HU 2560 x 1440 G-Sync IPS monitor.

At 2560 x 1440 resolution with settings at LOW, NVIDIA HairWorks OFF, and minimal post-processing effects; REGARDLESS of the display driver (352.86, 350.12, 347.88, or 347.52), The Witcher 3 fails to achieve 50 frames per second. Yes, LOW.

Meanwhile, the NVIDIA GeForce GTX 970, a $350.00 card (a card that literally costs $400.00 less - you could buy two of them for the price of the GTX 780 Ti, adjusting for inflation over the course of one short year), can run at upwards of 45 frames per second (albeit at 1920 x 1080 resolution) on ULTRA settings? I understand the impact on performance that higher resolution has, but the GTX 780 Ti (and TITAN, for that matter) were your enthusiast-premium cards. This performance is unacceptable, even if it is merely the result of an oversight in "Game Ready" driver support.

I will not petulantly presume NVIDIA has conspired against Kepler to make Maxwell more appealing and indignantly declare my future video card purchases promised to AMD; however, this situation has caused me to rethink my future enthusiast-performance purchases. Will I continue to buy the enthusiast-premium cards if NVIDIA fails to competently support them at the launch of blockbuster, highly anticipated triple-A games barely over a year into the card's life span while, in the meantime, NVIDIA's latest generation performs capably?

Why purchase an NVIDIA GTX TITAN X when competent drivers cannot be provided to the loyal, enthusiast NVIDIA consumer? The market for your ultra-enthusiast and enthusiast-performance cards will thin out quickly if this issue is not resolved AND explained.
Maxwell Card Owners, please join with your fellow NVIDIA Enthusiasts and take this threat to gaming seriously, even if it does not currently affect your games. It inevitably will.”
Simplicity (11 items) / Apotheosis (10 items)
CPU: 4770K | Motherboard: Asus Z87 Pro | Graphics: TBD | RAM: Corsair Vengeance (2x8GB) DDR3 1600
OS: Windows 7 Pro | Monitor: Dell U2713HM | Keyboard: Alienware TactX gaming | Power: Seasonic 850W Gold
Case: Cooler Master HAF XB | Mouse: Alienware TactX premium mouse
post #249 of 724
Quote:
Originally Posted by looniam

it's totally funny that you quoted one sentence and completely ignored the one before it - the one about how GameWorks is CUDA based, which i supposedly just "made up".

let me make this clear:

WE are done here.
The sentence above only tells us that some code is restricted to NVIDIA because it's CUDA. It does not tell us that what you quoted here:
Quote:
For teams that have gone the source route and have made modifications to the GameWorks source for their own purposes, NVIDIA reviews the code changes prior to license approval – but this is primarily limited to assuring that no IP is infringed upon or brought in without NVIDIA’s approval. Code changes can be requested and are generally still done by NVIDIA developers; as a result pull requests issued through Github are ignored. Mr. Skolones did mention that if a developer did want to suggest a change to the code that developers have plenty of ways to do so, though they are not near a position that would allow them to review such code changes through an option like Google’s Gerrit Code Review or Apache’s Subversion.
applies only to CUDA. The article is lacking key info and is a touch presumptuous. Are we to infer, from the sentence that I left out, that this approval is limited only to tweaks of the CUDA code in GameWorks? If so, then yeah, I guess we're done.
post #250 of 724
I don't trust anything in Nvidia's black box of code inside GameWorks. I'm not saying they are doing it, but it is in Nvidia's best interest for GameWorks titles to run worse on AMD cards and on last-generation Nvidia cards; that is how you push sales forward. Forgive me if I don't take their word for anything. The same applies to AMD, for that matter.
Super P's rig (20 items)
CPU: 5960X | Motherboard: ASUS X99-A II | Graphics: Asus GTX 1080 Ti | RAM: Corsair Vengeance DDR4 3000
Hard Drives: MyDigitalSSD BPX NVMe, Samsung 850 EVO, Seagate Momentus XT 500 GB | Optical Drive: external DVDRW
Cooling: EK-XLC Predator 240, Swiftech 240 mm radiator | OS: Windows 10 | Monitor: Samsung 40" 4K UN40KU6290
Keyboard: G710+ | Power: EVGA SuperNOVA 850 G2 | Case: Fractal Design Define S | Mouse: G700s
Mouse Pad: Vipamz Extended XXXL | Audio: Asus U7, M-Audio AV40, Sennheiser HD 439