
[Fudz] ATI to Challenge TWIMTBP - Page 3

post #21 of 27
Quote:
Originally Posted by Nowyn View Post
And who said that developers are so eager to use DX10.1 in games? I mean DX10.1, as has been confirmed over and over again, brings only a couple of new features that developers consider not so important, and furthermore, all of these features can be run on DX10 hardware with a bit more complex coding. And nVidia supports some features of 10.1 on their hardware (check out some reviews, it's all in there), the ones that are most important in their estimation (we don't have a complete list of these features because nVidia doesn't want to give ATI that kind of intel, which they could use against nVidia). Now for the Vantage part, what does nVidia have to do with the PhysX part here? nVidia cards were always beaten by their ATI counterparts in 3DMark benches, although real game performance was the other way around. So how come they didn't buy off the 3DMark authors to tune it before? So I really fail to see what nVidia has to do with it except for using what is there to their advantage. You should blame Futuremark and Futuremark only.

However, with all this nVidia hate going around for no good reason, it's hard to expect anything other than that ATI marketing crap from a guy owning an ATI card. I personally think that ATI did really well with their HD 4 series, but it doesn't give them the right to start bashing a slightly different approach to competition.
Assassin's Creed showed pretty substantial benefits on ATI hardware, rendering higher quality AA with a lower performance hit. It's not so much about new graphical features as about higher efficiency as AA/AF and lighting effects are increased.
Futuremark left out DX10.1 support to keep it fair between ATI and NVidia, yet GPU-accelerated PhysX is enabled. If that doesn't indicate influence on NVidia's behalf, then I don't know what does...
There is a reason for "all the NVidia hate" lately: they've been acting like total pricks.
post #22 of 27
Thread Starter 
Quote:
Originally Posted by Nowyn View Post
If it turns out to be true, I'd like to hear ATI fanboys' opinion on that, since they have always blamed nVidia for playing "dirty" when cooperating with game developers. Now that ATI has realized they need to do the same, does that mean ATI will become a "dirty" player?
I wouldn't necessarily consider myself a full-blown ATI fanboy, but I'm not exactly the biggest Nvidia fan either, so I'll take a shot at it. I don't think the TWIMTBP program is really a "dirty" idea. Yes, it does optimize certain games for Nvidia cards, but that's a completely different issue than sabotaging your competitor's performance. As far as an optimization-type partnership goes, I see nothing wrong with it. Game development costs are huge, and if Nvidia is offering a program to help developers cope with those costs and optimize their code, that can be a beneficial thing for the industry. In most cases, the optimization isn't that extreme anyway. For example, Crysis is often cited as a game with a big Nvidia bias, but I think that's mostly exaggerated. Even without optimizations, some games are going to run better on one architecture than the other; Crysis is just a combination of that and TWIMTBP, in my opinion.

However, when it comes to sabotage, it's a completely different story. I do believe that Nvidia played a large role in the removal of DX10.1 from Assassin's Creed. Ubisoft reps claimed that the implementation was broken because DX10.1 lost a render pass, but this does not account for the fact that there was a slight gain in image quality from enabling the DX10.1 features. Additionally, the correct way of implementing DX10.1 is to remove that render pass and achieve the same effects a different way. Ubisoft's claim that it was broken is complete BS. This is really the only situation where I think Nvidia is guilty of playing dirty.

Assassin's Creed is an extreme example at the moment, but my fear is that this is where the politicization of games could be headed, where the features a game supports are dictated solely by the corporation that sponsors it. If Nvidia and ATI don't resort to the same kind of tactic we saw with AC, then I don't think such programs from either camp are a bad thing. That said, I'm going to remain skeptical, because who knows what either company could resort to once they have competing ISV programs.

From a business perspective, this was an absolutely necessary move. I'd like to think that the people at ATI don't really want to do this, but it really is something they have to do. Now that they have hardware that is very competitive with Nvidia's and a very good chance of taking market share back, they cannot afford the possibility of Nvidia going to extreme measures to get games optimized for its platform. I'm not saying that Nvidia will do this, but AMD/ATI seems to be employing a much more aggressive strategy and they can't take any chances.

Quote:
Originally Posted by Nowyn View Post
And who said that developers are so eager to use DX10.1 in games? I mean DX10.1, as has been confirmed over and over again, brings only a couple of new features that developers consider not so important, and furthermore, all of these features can be run on DX10 hardware with a bit more complex coding. And nVidia supports some features of 10.1 on their hardware (check out some reviews, it's all in there), the ones that are most important in their estimation (we don't have a complete list of these features because nVidia doesn't want to give ATI that kind of intel, which they could use against nVidia).
I do want to say one more thing about DX10. DirectX 10 was not meant to be a huge graphical leap over DX9. This is why people complain that DX10 doesn't look any better than DX9, and for the most part that's true. What DX10 was intended to do was allow DX9-level graphics with far fewer API function calls, and the effect would be much better performance without sacrificing quality. The reason you see games perform worse in DX10 is that, in the majority of cases where DX10 is implemented, it is used to add extra effects rather than to provide more efficient versions of the same effects. To get the full benefit of the improved API, a game would have to be written natively in DX10.
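
To give a rough idea of what "fewer API calls" means in practice, here's a toy C++ sketch. The names (StateObject, drawLegacy, drawWithStateObject) are made up for illustration and are not the actual Direct3D API: the point is just that a DX9-style renderer sets and re-validates several individual states around every draw, while a DX10-style renderer bakes them into an immutable state object once and only binds it per draw.

Code:
// Toy illustration of per-draw state setting vs. prebaked state objects.
// Hypothetical names only; not real Direct3D calls.
#include <cstdio>
#include <string>

struct StateObject {            // validated once, when it is created
    std::string blend, depth, raster;
};

// DX9-style: every draw pays for several separate "set state" calls,
// each of which the runtime has to validate on the spot.
void drawLegacy(const std::string& blend, const std::string& depth,
                const std::string& raster) {
    std::printf("validate+set blend=%s\n", blend.c_str());
    std::printf("validate+set depth=%s\n", depth.c_str());
    std::printf("validate+set raster=%s\n", raster.c_str());
    std::printf("draw\n");
}

// DX10-style: the expensive validation happened when the object was built;
// per draw there is a single cheap bind.
void drawWithStateObject(const StateObject& s) {
    std::printf("bind prebuilt state {%s,%s,%s}\n",
                s.blend.c_str(), s.depth.c_str(), s.raster.c_str());
    std::printf("draw\n");
}

int main() {
    StateObject opaque{"off", "less-equal", "cull-back"};   // built once
    for (int i = 0; i < 3; ++i) {
        drawLegacy("off", "less-equal", "cull-back");        // 4 calls per draw
        drawWithStateObject(opaque);                         // 2 calls per draw
    }
}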

DX10 was originally intended to have all the features that are present in DX10.1. However, Nvidia was working on G80 at the time, and one of the things DX10 would have called for is (mandatory) shader-resolved antialiasing, which Nvidia did not want to implement (and still doesn't). Microsoft ended up compromising because they needed to launch DX10 with Vista. This is the big thing in DX10.1 that stops Nvidia from implementing it, and despite their marketing it's far from useless (see the DX10.1 Assassin's Creed performance). It is true that Nvidia hardware does support some other DX10.1 features; it's just that shader-resolved AA is the main thing stopping them from achieving full compliance.
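
For anyone wondering what "shader-resolved AA" actually buys you: the individual MSAA samples stay readable, so the engine can combine them itself instead of relying on a fixed-function resolve (or an extra pass to recover data that resolve throws away). Here's a rough CPU-side C++ sketch of the idea, not real D3D10.1 shader code, with a made-up customResolve function and an assumed buffer layout, purely for illustration:

Code:
// Toy "custom MSAA resolve": read every sample of every pixel and combine
// them however the engine wants. Layout assumption: msaaBuf stores all of
// pixel 0's samples, then pixel 1's, and so on.
#include <cstdio>
#include <vector>

struct Color { float r, g, b; };

std::vector<Color> customResolve(const std::vector<Color>& msaaBuf,
                                 int width, int height, int samples) {
    std::vector<Color> resolved(width * height);
    for (int p = 0; p < width * height; ++p) {
        Color sum{0.f, 0.f, 0.f};
        for (int s = 0; s < samples; ++s) {          // read each sample directly
            const Color& c = msaaBuf[p * samples + s];
            sum.r += c.r; sum.g += c.g; sum.b += c.b;
        }
        // Plain box filter here; a real engine could weight samples, tone-map
        // before averaging, or use per-sample depth instead of color.
        resolved[p] = {sum.r / samples, sum.g / samples, sum.b / samples};
    }
    return resolved;
}

int main() {
    // One pixel with 4 samples, just to show the call.
    std::vector<Color> msaa{{1.f,0.f,0.f},{0.f,1.f,0.f},{0.f,0.f,1.f},{1.f,1.f,1.f}};
    auto out = customResolve(msaa, 1, 1, 4);
    std::printf("resolved pixel = (%.2f, %.2f, %.2f)\n", out[0].r, out[0].g, out[0].b);
}

That's the efficiency the earlier point about Assassin's Creed is getting at: the same AA result folded into work the renderer is already doing, without a separate recovery pass.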

Quote:
Now for the Vantage part, what does nVidia have to do with the PhysX part here? nVidia cards were always beaten by their ATI counterparts in 3DMark benches, although real game performance was the other way around. So how come they didn't buy off the 3DMark authors to tune it before? So I really fail to see what nVidia has to do with it except for using what is there to their advantage. You should blame Futuremark and Futuremark only.
I realize you weren't addressing me, but if you want to hear a balanced perspective on the issue, you can read my blog post.
(Short summary: The issue is essentially a non-issue)
Edited by darkcloud89 - 6/28/08 at 8:07am
post #23 of 27
When it comes to DX10.1, all I hear is Assassin's Creed. There are too many points of view on this matter and I don't see any proof of any of them being more valid than the others, so I don't take assumptions as facts. Give me at least one more example of DX10.1 being superior to DX10 and I'll change my mind. But there's no other game, and such a thing can't be proven by one game alone.
post #24 of 27
ATI should have a slogan: "The way it's played!"
post #25 of 27
Quote:
Originally Posted by Nowyn View Post
When it comes to DX10.1, all I hear is Assassin's Creed. There are too many points of view on this matter and I don't see any proof of any of them being more valid than the others, so I don't take assumptions as facts. Give me at least one more example of DX10.1 being superior to DX10 and I'll change my mind. But there's no other game, and such a thing can't be proven by one game alone.
You're missing the point: the reason DX10.1 isn't implemented in games is that so many titles are TWIMTBP... the reason "all you hear is Assassin's Creed" is that it's the only title that managed to get DX10.1 support. The results were very impressive and gave DX10.1 cards quite an edge in both IQ and performance. With the next patch, DX10.1 was gone...
Edited by binormalkilla - 6/28/08 at 9:29am
post #26 of 27
Quote:
Originally Posted by IcedEarth View Post
Wow, he almost makes it sound like a good thing to bribe a dev to optimize for your cards.

Funny that people say the consumer wins. I fail to see how... if I have an Nvidia card then it will run like crap on ATI-optimized games, and the same with ATI cards on Nvidia games. It basically splits games up into two markets, which means if you want to play everything at full whack you will need both ATI and Nvidia top-range GPUs. Now I am not saying ATI-optimized games will cripple Nvidia cards, but do you really want a bad gaming experience because the other company paid more!?

The consumer does not win from graphics companies having to bribe devs. Shame it has come to this.
I agree with that in principle, but bribing game devs is the only way ATI can make their cards competitive against Nvidia cards. See how crappy Crysis runs on ATI cards (and, on the other hand, how crappy Call of Juarez runs on Nvidia cards).
post #27 of 27
Quote:
Originally Posted by Nowyn View Post
When it comes to DX10.1, all I hear is Assassin's Creed. There are too many points of view on this matter and I don't see any proof of any of them being more valid than the others, so I don't take assumptions as facts. Give me at least one more example of DX10.1 being superior to DX10 and I'll change my mind. But there's no other game, and such a thing can't be proven by one game alone.
I highly doubt you'd change your mind. By default DX10.1 is more efficient because it eliminates one render pass. So any time it is utilized properly it will be better than DX10. That's just the way it is.