
[nordicHW] AMD hitting NVIDIA where it hurts, the games - Page 9

post #81 of 90
Why the hostility? Here is a thread that has a bunch of 4870 benchmark links in it if you really still want to see them: http://www.overclock.net/rants-raves...0-threads.html
post #82 of 90
Quote:
Originally Posted by Croft View Post
Nvidia doesn't want to develop a whole new series of cards just for 10.1 when the average consumer won't even know the difference.
Nvidia doesn't want to implement DX10.1 because the spec makes it mandatory for hardware to let developers write custom AA filters that execute on the shader hardware. G80's architecture simply isn't set up to run AA code efficiently on the shaders, and since Nvidia has already invested a ton of money in developing the architecture, they (understandably) don't want to change it. Shader-based AA was originally intended to be part of the DX10 specification itself, but by the time the spec was being finalized, Nvidia's development of the G80 was too far along to comply fully. Microsoft was pressured into dropping it, along with a few other features, so that DX10 could meet the Vista deadline with compatible hardware on the market.

I think there's a lot of uncertainty about what exactly DX10 and 10.1 are supposed to do. DX10 was never meant to be a big graphical update over DX9. The idea behind it was to allow a slightly higher level of effects while requiring fewer function calls to achieve them, which would let games run more efficiently. DX10.1 is just a superset that adds more efficiency rather than new graphical effects (there are a few), and Nvidia hardware does actually support some of the DX10.1 spec; custom filter AA is one of the bigger things it doesn't. The reason DX10 typically sees a reduction in performance, despite its purpose being the opposite, is that the games using it so far have only added extra effects on top of their DX9 paths rather than optimizing them. Until a game is designed exclusively for DX10, the benefits likely won't be seen.
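
To make the "custom AA filter" point concrete, here's a rough CPU-side sketch in plain C++ of the difference between the fixed box resolve the ROPs perform and a developer-defined resolve. The 4x sample layout and the weights are made up for illustration; on DX10.1-class hardware the custom loop would run in a pixel shader reading the multisample surface per sample.

Code:
// Sketch only: a "resolve" collapses a pixel's MSAA samples to one color.
// The hardware ROP path is a hardwired box average; DX10.1 requires that
// hardware also let a shader do this per-sample work itself, with any
// filter the developer wants. Sample count and weights here are invented.
#include <array>
#include <cstdio>

struct Color { float r, g, b; };

// Fixed-function style resolve: plain average of the 4 samples.
Color box_resolve(const std::array<Color, 4>& s) {
    Color out{0.0f, 0.0f, 0.0f};
    for (const Color& c : s) { out.r += c.r; out.g += c.g; out.b += c.b; }
    out.r /= 4.0f; out.g /= 4.0f; out.b /= 4.0f;
    return out;
}

// "Custom filter" resolve: the developer picks the weights (or any other
// per-sample logic) instead of the hardwired average.
Color custom_resolve(const std::array<Color, 4>& s) {
    const float w[4] = {0.15f, 0.35f, 0.35f, 0.15f};  // hypothetical weights
    Color out{0.0f, 0.0f, 0.0f};
    for (int i = 0; i < 4; ++i) {
        out.r += s[i].r * w[i];
        out.g += s[i].g * w[i];
        out.b += s[i].b * w[i];
    }
    return out;
}

int main() {
    const std::array<Color, 4> px = {{{1,0,0}, {0,1,0}, {0,1,0}, {0,0,1}}};
    const Color a = box_resolve(px);
    const Color b = custom_resolve(px);
    std::printf("box: %.2f %.2f %.2f   custom: %.2f %.2f %.2f\n",
                a.r, a.g, a.b, b.r, b.g, b.b);
    return 0;
}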
post #83 of 90
hmm
    
post #84 of 90
Quote:
Originally Posted by darkcloud89 View Post
Nvidia doesn't want to implement DX10.1 because the spec makes it mandatory for hardware to let developers write custom AA filters that execute on the shader hardware. G80's architecture simply isn't set up to run AA code efficiently on the shaders, and since Nvidia has already invested a ton of money in developing the architecture, they (understandably) don't want to change it. Shader-based AA was originally intended to be part of the DX10 specification itself, but by the time the spec was being finalized, Nvidia's development of the G80 was too far along to comply fully. Microsoft was pressured into dropping it, along with a few other features, so that DX10 could meet the Vista deadline with compatible hardware on the market.

I think there's a lot of uncertainty about what exactly DX10 and 10.1 are supposed to do. DX10 was never meant to be a big graphical update over DX9. The idea behind it was to allow a slightly higher level of effects while requiring fewer function calls to achieve them, which would let games run more efficiently. DX10.1 is just a superset that adds more efficiency rather than new graphical effects (there are a few), and Nvidia hardware does actually support some of the DX10.1 spec; custom filter AA is one of the bigger things it doesn't. The reason DX10 typically sees a reduction in performance, despite its purpose being the opposite, is that the games using it so far have only added extra effects on top of their DX9 paths rather than optimizing them. Until a game is designed exclusively for DX10, the benefits likely won't be seen.
Thank you for that well thought out and intelligent response. That actually makes sense.
post #85 of 90
Quote:
Originally Posted by darkcloud89 View Post
Nvidia doesn't want to implement DX10.1 because the spec makes it mandatory for hardware to let developers write custom AA filters that execute on the shader hardware. G80's architecture simply isn't set up to run AA code efficiently on the shaders
The claim that the G80's architecture isn't set up to allow the execution of AA code is ridiculous. In all the tests with filters and such turned on, they beat the G92s. That is a false claim.
post #86 of 90
Quote:
Originally Posted by Mobsta21 View Post
The claim that the G80's architecture isn't set up to allow the execution of AA code is ridiculous. In all the tests with filters and such turned on, they beat the G92s. That is a false claim.
Read it again. The key word is "custom": the claim is about custom AA filters running on the shaders, not AA in general.
    
post #87 of 90
Quote:
Originally Posted by Mobsta21 View Post
The claim that the G80's architecture isn't set up to allow the execution of AA code is ridiculous. In all the tests with filters and such turned on, they beat the G92s. That is a false claim.
They aren't set up for AA to run on the shaders. The AA is performed by the ROPs; if it were run on the shaders, the performance would be horrible on a G80 or derived architecture, hence the reason they don't want to do it.
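
For what it's worth, the ROP path being described is the one-call fixed-function resolve an application gets through the Direct3D 10 API. A minimal sketch follows; device and texture creation are omitted, so the variable names are just placeholders:

Code:
// Fixed-function MSAA resolve in Direct3D 10: the hardware averages the
// samples with its built-in box filter; no shader is involved. This is
// the step a DX10.1 custom filter replaces with shader code.
// Sketch only: device, msaaTarget and singleSampleTex are assumed to
// have been created elsewhere.
#include <d3d10.h>

void resolve(ID3D10Device* device,
             ID3D10Texture2D* msaaTarget,       // multisampled render target
             ID3D10Texture2D* singleSampleTex)  // non-MSAA destination
{
    device->ResolveSubresource(singleSampleTex, 0, msaaTarget, 0,
                               DXGI_FORMAT_R8G8B8A8_UNORM);
}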
post #88 of 90
Quote:
That's not all though. We recently learned that AMD is looking to team up with more developers, and if there's one engine that has loved ATI hardware for a long time, it's the Valve Source engine. ATI is apparently hoping to ensnare Valve as well, and have DX10.1 support with future games like Half-Life 2: Episode 3, Left 4 Dead, Portal 2 and the rest. Knowing how flexible the Source engine is, we wouldn't be surprised if current games would be patched to further boost performance and additional effects through DX10.1.
Does this guy know that none of Valve's games even support DX10 yet...?
post #89 of 90
Quote:
Originally Posted by Xombie View Post
Does this guy know that none of Valve's games even support DX10 yet...?
It really shouldn't be that hard to implement DX10 and DX10.1 at the same time; there isn't a lot of actual difference between them. DX10.1 just makes more things mandatory that were only "supported" in DX10.
Edited by darkcloud89 - 7/1/08 at 7:05pm
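
As a rough sketch of how "at the same time" works in practice: the 10.1 runtime lets one code path request feature level 10.1 and fall back to 10.0 on hardware that can't do it, then branch on whichever level it got. Error handling is trimmed, and the comments about which chips land where reflect the discussion above, not the SDK:

Code:
// Sketch: create a Direct3D 10.1 device, preferring feature level 10.1
// and falling back to 10.0. One binary can then enable the DX10.1-only
// bits (e.g. custom AA resolves) only when the higher level is present.
#include <d3d10_1.h>

ID3D10Device1* create_device() {
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,  // e.g. Radeon HD 3000/4000 series
        D3D10_FEATURE_LEVEL_10_0,  // e.g. G80/G92-class GeForce parts
    };
    ID3D10Device1* device = nullptr;
    for (D3D10_FEATURE_LEVEL1 level : levels) {
        if (SUCCEEDED(D3D10CreateDevice1(nullptr, D3D10_DRIVER_TYPE_HARDWARE,
                                         nullptr, 0, level,
                                         D3D10_1_SDK_VERSION, &device))) {
            return device;  // caller checks device->GetFeatureLevel()
        }
    }
    return nullptr;  // no DX10-class hardware available
}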
post #90 of 90
Quote:
Originally Posted by darkcloud89 View Post
It really shouldn't be that hard to implement DX10 and DX10.1 at the same time; there isn't a lot of actual difference between them. DX10.1 just makes more things mandatory that were only "supported" in DX10.
But is it worth it?

And according to what Valve has been saying since the release of Vista, no.