
[BSN] Nvidia Doesn't Fully Support DirectX 11.1 with Kepler GPUs, But… - Page 7

post #61 of 64
Quote:
Originally Posted by MoGTy View Post

Quote:
Originally Posted by Dromihetes View Post

Nvidia one DX generation behind, like always.
In 2003, only DX 8.1 through the 4xxx series while the Radeon 9xxx was on the market; then the huge leap ahead with DX 11, when the ATI 5xxx series came out they were still in the G92 era.
Practically all Nvidia users paid for outdated products this year.
Nvidia = lazy.

You're comparing different generation cards. Although they were being sold at the same time,
the Fermi 4xx was competing with AMD's 5xxx, half a year later than AMD's DX11 cards; Nvidia has always been later with a competing release these past few years.

Nvidia's G92 was competing with AMD's 4xxx, both DX10.

IIRC.

He was referring to nVidia being late to move up major DX versions, which is true... but it doesn't really matter that much; nVidia has always waited before taking massive jumps ever since GeForce FX and then Fermi, because both times it really bit them in the rear.
Quote:
Originally Posted by ZealotKi11er View Post

Like DX10.1 made any difference. I want DX12.

DX10.1 did make a difference. Not many games used it, but when they did, it made a significant difference in performance.
Quote:
Originally Posted by mushroomboy View Post

Nvidia in the end had the better DX9 performance

Actually, out of all of the last-generation DX9 cards, ATI's X1950 XTX was the fastest, and it scaled better in CFX too (although it had to use those horrid dongles). ATI changed the core config to better suit the workload and really took nVidia by surprise. In the previous generation, ATI's cards were IIRC a little faster overall, but the lack of DX9.0c support is what killed them.

Source.
Quote:
Originally Posted by mushroomboy View Post

I doubt it's Nvidia bossing around MS; that's laughable. MS probably didn't want the PC market to look too pretty, making their console look even more outdated. Now it only looks semi-outdated, as developers are only now hitting DX11 hard. They could have delayed it more if they wanted to; who would have known? People STILL buy 360s on the grounds that games now look pretty sweet on the 360. I find it odd: we finally have the tech to really utilize DX11, and new consoles come out... Why is that?

Actually, nVidia did it with DX10; IIRC they got MS to cut a load of features so G80 could come out way earlier, and DX10.1 was the reinstatement of those cut features. Obviously it's just a rumour, but a rumour that was going around the entire industry at the time.
Quote:
Originally Posted by Master__Shake View Post

DirectX 10 was just a ploy by Microsoft to get you to stop using XP and move on to Vista... DirectX 11 and Windows 7 is the same thing, unless Vista supports it... now with Windows 8, well, DirectX 12 is just an update away... advancements in tech, or marketing? You decide...

Opinion, not fact.

Not really; DX10 was more efficient than DX9. The idea was that while you didn't get many new features to make your game look better, you got a bit more performance, so you could fit larger textures or more polygons into the same performance bracket, IIRC. It was only incompatible with XP because, when it comes down to it, XP and Vista/7/8 are vastly different. (Vista is NT kernel 6.0, 7 is 6.1, 8 is 6.2, XP was 5.1/5.2.)
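
Those kernel numbers are easy to verify yourself. Here's a minimal sketch (my own illustration, not from the original post) using the Win32 GetVersionExW call, which was the standard way to read the OS version in this era:

Code:
// Print the NT kernel version the post refers to:
// XP reports 5.1/5.2, Vista 6.0, Windows 7 6.1, Windows 8 6.2.
#include <windows.h>
#include <cstdio>

int main() {
    OSVERSIONINFOW info = { sizeof(info) };  // the size field must be set first
    // GetVersionExW was deprecated after Windows 8.1, but at the time of
    // this thread (2012) it was the usual way to query the version.
    if (GetVersionExW(&info)) {
        printf("NT kernel %lu.%lu (build %lu)\n",
               info.dwMajorVersion, info.dwMinorVersion, info.dwBuildNumber);
    }
    return 0;
}

On Windows 7, for example, this prints "NT kernel 6.1", which is part of why a DX10 driver model written against the NT 6 kernel couldn't simply be backported to XP's 5.1.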
    
post #62 of 64
Quote:
Originally Posted by Brutuz View Post

He was referring to nVidia being late to move up major DX versions, which is true... but it doesn't really matter that much; nVidia has always waited before taking massive jumps ever since GeForce FX and then Fermi, because both times it really bit them in the rear.
DX10.1 did make a difference. Not many games used it, but when they did, it made a significant difference in performance.
Actually, out of all of the last-generation DX9 cards, ATI's X1950 XTX was the fastest, and it scaled better in CFX too (although it had to use those horrid dongles). ATI changed the core config to better suit the workload and really took nVidia by surprise. In the previous generation, ATI's cards were IIRC a little faster overall, but the lack of DX9.0c support is what killed them.
Source.
Actually, nVidia did it with DX10; IIRC they got MS to cut a load of features so G80 could come out way earlier, and DX10.1 was the reinstatement of those cut features. Obviously it's just a rumour, but a rumour that was going around the entire industry at the time.
Not really; DX10 was more efficient than DX9. The idea was that while you didn't get many new features to make your game look better, you got a bit more performance, so you could fit larger textures or more polygons into the same performance bracket, IIRC. It was only incompatible with XP because, when it comes down to it, XP and Vista/7/8 are vastly different. (Vista is NT kernel 6.0, 7 is 6.1, 8 is 6.2, XP was 5.1/5.2.)

The only thing I have to say (not being asinine, just bare fact on this) is that the ATi stuff came out very late in the game. Their original DX9 support wasn't the greatest, and shortly after, Nvidia released theirs with an improved shader model. Not saying it was much, but the FX series was improved slightly. However, I will say both companies did better later on; Nvidia was much better with the NV40 series, as Radeon was with the R420 series. Generally speaking, Nvidia has been much better in the past. This is probably the first gen where we've seen things so neck and neck; AMD might be able to surpass Nvidia now. Granted, it's not ATi anymore but a much larger company, which is probably why they can do so much.

On the bum side, however, I still hear people say the Linux drivers still sway towards Nvidia, despite their FOSS drivers not being up to par with AMD's. Either way, we are at a very good time for graphics. Back then I went with Nvidia because, at the time (and until only recently), Nvidia kicked the crap out of anything in the Linux market. Hands down, there was no competition back then.

[edit] And I don't comment on rumors; I treat them just like conspiracy theories. Unless you have solid evidence, keep it to yourself, as it's just hearsay and babble to me. Not being mean about it; I treat everyone the same on that front. Rumors are just that: rumors.
Edited by mushroomboy - 11/27/12 at 1:36pm
post #63 of 64
Quote:
Originally Posted by mushroomboy View Post

Quote:
Originally Posted by Brutuz View Post

He was referring to nVidia being late to move up major DX versions, which is true... but it doesn't really matter that much; nVidia has always waited before taking massive jumps ever since GeForce FX and then Fermi, because both times it really bit them in the rear.
DX10.1 did make a difference. Not many games used it, but when they did, it made a significant difference in performance.
Actually, out of all of the last-generation DX9 cards, ATI's X1950 XTX was the fastest, and it scaled better in CFX too (although it had to use those horrid dongles). ATI changed the core config to better suit the workload and really took nVidia by surprise. In the previous generation, ATI's cards were IIRC a little faster overall, but the lack of DX9.0c support is what killed them.
Source.
Actually, nVidia did it with DX10; IIRC they got MS to cut a load of features so G80 could come out way earlier, and DX10.1 was the reinstatement of those cut features. Obviously it's just a rumour, but a rumour that was going around the entire industry at the time.
Not really; DX10 was more efficient than DX9. The idea was that while you didn't get many new features to make your game look better, you got a bit more performance, so you could fit larger textures or more polygons into the same performance bracket, IIRC. It was only incompatible with XP because, when it comes down to it, XP and Vista/7/8 are vastly different. (Vista is NT kernel 6.0, 7 is 6.1, 8 is 6.2, XP was 5.1/5.2.)

The only thing I have to say (not being asinine, just bare fact on this) is that the ATi stuff came out very late in the game. Their original DX9 support wasn't the greatest, and shortly after, Nvidia released theirs with an improved shader model. Not saying it was much, but the FX series was improved slightly. However, I will say both companies did better later on; Nvidia was much better with the NV40 series, as Radeon was with the R420 series. Generally speaking, Nvidia has been much better in the past. This is probably the first gen where we've seen things so neck and neck; AMD might be able to surpass Nvidia now. Granted, it's not ATi anymore but a much larger company, which is probably why they can do so much.

On the bum side, however, I still hear people say the Linux drivers still sway towards Nvidia, despite their FOSS drivers not being up to par with AMD's. Either way, we are at a very good time for graphics. Back then I went with Nvidia because, at the time (and until only recently), Nvidia kicked the crap out of anything in the Linux market. Hands down, there was no competition back then.

[edit] And I don't comment on rumors; I treat them just like conspiracy theories. Unless you have solid evidence, keep it to yourself, as it's just hearsay and babble to me. Not being mean about it; I treat everyone the same on that front. Rumors are just that: rumors.


ATi had the original DX9 first but was late to the table with DX9.0c. IIRC a lot of the R4x0 series cards were faster than nVidia's 6 series, but the lack of DX9.0c support (and a bunch of games that used it coming out during that generation) is what hit them hard. nVidia has generally been better, for sure; ATI was faster at the end of the DX9 generation, but the 8800 GTX wasn't far off by that point, and it utterly destroyed everything before it.

I still hear about how bad AMD's Windows drivers are; I really don't know how much of it is fact and how much is just repeated hearsay from forums, though. I don't seem to get those bugs and glitches, at least, but I will admit I'm extremely annoyed at being stuck on an old version of Xorg thanks to my laptop's GPU being too old.

It was one of those rumours, like the Fermi delay rumours: no actual proof, but no real evidence either way. Take it with a grain of salt, but I believe it, as nVidia has done similar stuff in the past. (Not saying AMD/ATI are better companies, though.)
    
post #64 of 64
Just to throw in a useless comment of sorts: the Radeon 4xxx series had DX10.1, which was actually a really nice version, including better parallax and ambient occlusion if I recall. It wasn't night and day, but definitely a nice touch. I guess Far Cry 2 is a good example, along with BFBC2.
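
For what it's worth, the feature level a card exposes is something you can probe directly, and it's the exact distinction behind this thread's headline: a GPU can run on the DirectX 11.1 runtime while only reporting feature level 11_0. A minimal sketch (my own illustration, not from the post) using the public D3D11CreateDevice API:

Code:
// Query the highest Direct3D feature level the default adapter supports.
#include <windows.h>
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_1,  // what Kepler doesn't fully expose
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,  // what the Radeon 4xxx had and G92 didn't
        D3D_FEATURE_LEVEL_10_0,
    };
    D3D_FEATURE_LEVEL highest = D3D_FEATURE_LEVEL_10_0;
    // Passing null for the device pointers just queries the feature level.
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   levels, ARRAYSIZE(levels), D3D11_SDK_VERSION,
                                   nullptr, &highest, nullptr);
    if (hr == E_INVALIDARG) {
        // Runtimes older than 11.1 reject the 11_1 enum outright; the
        // documented pattern is to retry without it.
        hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                               levels + 1, ARRAYSIZE(levels) - 1, D3D11_SDK_VERSION,
                               nullptr, &highest, nullptr);
    }
    if (SUCCEEDED(hr))
        printf("Highest supported feature level: 0x%04X\n", highest);
    return 0;
}

On a Kepler card under the Windows 8 runtime this reports 0xB000 (11_0) rather than 0xB100 (11_1), even though the 11.1 API itself runs fine on top of it.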

And as for the guy saying that Nvidia has been better... what? You don't remember the 9800 Ultra by ATi, do you? It smoked the X700 card, which was next gen, and was cheaper. Or how about the 9550? 256MB of DDR2 for $200 (less than half the price of Nvidia's similar counterpart). How about the X1900? Or what about the 7000 64MB, which was revolutionary at the time for DX 8.1? Or how about the 5870, which was within 5% of the GTX 480's performance while being cheaper, quieter, and more efficient, and came out about six months earlier? Or the 6770, which smoked Nvidia's similar offering because the GTX 550 Ti sucked? Do some research before you speak.
Edited by 8800GT - 11/27/12 at 3:14pm