
[vr-zone] did NVIDIA "break" DX10.1 in AC?! - Page 6

post #51 of 68
Quote:
Originally Posted by Desidero View Post
It's really depressing thinking about the fact that I'm going to have to choose between AMD/ATI and nvidia based not on performance but on company ethics. If this is true, along with all the other crap that nvidia has been saying and doing lately, I'll vote AMD with my money. I've never actually had an AMD CPU or an ATI GPU, but there's always time for a first. I'm planning on finally replacing my old computer (not the sig PC; that's not for me personally) this winter, and I feel like I should go with AMD over nvidia and Intel just because those two companies are playing dirty. Monopolies suck.
Unlike Intel, nVidia is far from a monopoly; AMD/ATI holds roughly 40% of the market. That doesn't make nVidia's tactics any less childish, though.
post #52 of 68
Quote:
Originally Posted by VulcanDragon View Post
What I don't understand is: When DirectX was first invented, one of the core tenets of the library was that it would query the hardware to see what features your machine could support, then use those features. Has that changed? If not, then DirectX 10.0 should be fully enabled and supported in any game coded for 10.1. If you're using a 10.1 enabled card from ATI, you get the 10.1 features; if you're using a 10.0 card from nVidia, the game would know that and only execute 10.0 calls. Unless this basic principle of DirectX has changed, then there is no excuse for ever removing support for a higher level of DirectX.
Exactly. Removing it just doesn't make any sense unless Nvidia is behind the move.
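That query-then-adapt behaviour is still how the 10.1 runtime is supposed to work. Here's a minimal sketch (my own illustration, not anything from Ubisoft's code) of how a renderer can ask for 10.1 and quietly fall back to 10.0 on hardware that lacks it, which is why pulling the 10.1 path shouldn't have been necessary for compatibility:

```cpp
// Minimal sketch: create a D3D10.1 device at the highest feature level the
// installed GPU supports, then branch render paths on what we actually got.
// Error handling trimmed for brevity.
#include <d3d10_1.h>

ID3D10Device1* CreateBestDevice(D3D10_FEATURE_LEVEL1* levelOut)
{
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,   // HD3000/HD4000-class parts
        D3D10_FEATURE_LEVEL_10_0,   // G80/G92-class parts
    };

    for (D3D10_FEATURE_LEVEL1 level : levels)
    {
        ID3D10Device1* device = nullptr;
        HRESULT hr = D3D10CreateDevice1(
            nullptr,                      // default adapter
            D3D10_DRIVER_TYPE_HARDWARE,
            nullptr, 0,
            level,                        // ask for this feature level...
            D3D10_1_SDK_VERSION,
            &device);
        if (SUCCEEDED(hr))                // ...and fall back if it's refused
        {
            *levelOut = level;
            return device;
        }
    }
    return nullptr;                       // no DX10-class hardware at all
}

// Later, in the renderer:
//   if (level == D3D10_FEATURE_LEVEL_10_1) { /* 10.1 path */ }
//   else                                   { /* plain 10.0 path */ }
```

So a 10.0 card simply never sees the 10.1 code path; nothing about shipping it should hurt Nvidia owners.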
post #53 of 68
Quote:
Originally Posted by VulcanDragon View Post
I think it's sad that the developers are entering into these hardware marketing arrangements in the first place. I understand why they do it...games are expensive to create, and the extra cash from nVidia helps get the game made. But can anyone really claim to be surprised when the developer succumbs to pressure to ensure that the game looks better, or at least no worse, than on the competing card? This is the scummy side of capitalism, I'm afraid.
Precisely why I don't see this as that big of a deal. I won't try to defend nVidia's hush money, or benchmarks propped up by their pay-offs, but this case is a little different.

Allow me to shift the focus of this thread.

*Invoke M$ rant!*

It's really Microsoft's mess that causes these problems; I don't really believe either company can be blamed in matters like this. nVidia rejected DX10.1 when Microsoft brought it to the table because of silicon costs, yes. And it is true that ATI pulled it off, obviously. Does that make nVidia some big money-grubbing corporation? No, because implementing a new DirectX revision would have meant going back to the drawing board with the G80s and G92s. Where would that put nVidia? I think we can all venture an educated guess...

Although, I have to admit, I'm surprised at the rumors that the GT200 will still be just DX10. I thought for sure that by the time the new architectures started to hit (and DX10.1 actually got some use) nVidia would have leveled their own playing field.

But at the same time, nVidia put their foot down to protect the fruits of billions of dollars and over a year of production time. A wise move, by any man's call. Let's say, then, that AMD had hopped on board as well and rejected DX10.1. What would happen? I think anyone with a slight knack for clairvoyance and a little knowledge of the market would say that a single market standard for future generations of games, even an inferior one, would mean much lower costs for game developers and consumers, and ultimately faster progress for the technology in general.

Again, I blame this on Microsoft's inability to get the job done with DX10 alone, not nVidia's stubborn refusal to completely wipe away the GeForce 8 series.
post #54 of 68
Quote:
Originally Posted by liermam View Post
It's really Microsoft's mess that causes these problems... nVidia rejected DX10.1 when Microsoft brought it to the table because of silicon costs, yes. And it is true that ATI pulled it off, obviously... Again, I blame this on Microsoft's inability to get the job done with DX10 alone, not nVidia's stubborn refusal to completely wipe away the GeForce 8 series.
Do you have any articles to support your claims about why Nvidia didn't use DX10.1? I'm not sure I follow, or agree that this is Microsoft's fault for that matter. These specifications have been out for quite some time, and if a company with AMD/ATI's current financial issues can make it work, then Nvidia has no excuse.
post #55 of 68
Quote:
Originally Posted by liermam View Post
It's really Microsoft's mess that causes these problems... nVidia rejected DX10.1 when Microsoft brought it to the table because of silicon costs, yes. And it is true that ATI pulled it off, obviously... Again, I blame this on Microsoft's inability to get the job done with DX10 alone, not nVidia's stubborn refusal to completely wipe away the GeForce 8 series.
I agree to a point, but if it cost Nvidia so much to implement DX10.1 in their silicon, why was AMD able to do it so seamlessly? That is what I ask.

Or does it come back to Nvidia not implementing all the features that were supposed to be in DX10 (which MS set) while AMD did? You could say Nvidia is partly to blame, since even before the G80 was in full production they knew what it was supposed to be able to do, but couldn't deliver it.

Ever wonder why AMD/ATi implemented DX10.1 so quickly?

IMO, what I see is Nvidia wanting it to run their way on their hardware; they don't care how it runs on anybody else's. MS is supposed to set the standard for DX10, but when a gfx chip maker that holds 60% of the market says, "well, our hardware can't handle that," what is MS to do, kill their whole OS because of one company? Of course not, so they change minor things to bypass what can't be run.

What I wonder is this: why is Nvidia's architecture still not up to snuff and able to run full DX10 as it was meant to? I think DX10.1 was really just implementing, in the small details, features that were supposed to be there from the beginning. Of course it would cost Nvidia a lot of money; their hardware was a failure from the beginning. And why is it rumored that, yet again, Nvidia won't have hardware capable of running DX10.1 on the new GT200 cores?

It's anybody's guess, really; it just doesn't make sense why they want to hold back a progressing API. DX10 needs to mature just as much as DX9 did, and how many times did that change?
post #56 of 68
Quote:
Originally Posted by rx7racer View Post
I agree to a point, but if it cost Nvidia so much to implement DX10.1 in their silicon, why was AMD able to do it so seamlessly? That is what I ask.

Or does it come back to Nvidia not implementing all the features that were supposed to be in DX10 (which MS set) while AMD did? You could say Nvidia is partly to blame, since even before the G80 was in full production they knew what it was supposed to be able to do, but couldn't deliver it.

Ever wonder why AMD/ATi implemented DX10.1 so quickly?
Nvidia doesn't want to implement DX10.1 because it requires that antialiasing be done on the shaders; ATI was able to implement it so quickly because their hardware already did AA that way. Given Nvidia's architecture, they would take a much greater performance hit from enabling AA than they do now, similar to the way the HD3800 series does. They would probably have to make some pretty significant changes to their architecture to implement it properly, but given their constant rebranding of the G80, it's clearly something they don't plan on doing.

Microsoft had originally intended for everything in DX10.1 to be part of the original DX10, but I think they ended up compromising due to pressure from Nvidia. ATI just went with the original specifications and so they didn't really have to do anything major besides slap a DX10.1 sticker on when Microsoft finally included it in SP1.
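To put that in concrete terms: as I understand it, the headline 10.1 capability in play here is being able to read a multisampled depth buffer straight from a shader. Here's a rough sketch of the resource setup (my own illustration, nothing from the game's code, and the exact failure point on 10.0 hardware is an assumption on my part):

```cpp
// Rough sketch, assuming an already-created ID3D10Device* and a 4x-MSAA
// swap chain. The point: on 10.1-class hardware a multisampled depth buffer
// can be bound both as a depth-stencil target and as a shader resource, so a
// later pass can read depth per-sample without first copying it out in an
// extra pass. On strict 10.0 hardware, one of the calls below is expected to
// be refused, which is what forces the additional resolve work.
#include <d3d10_1.h>

bool CreateReadableMsaaDepth(ID3D10Device* device, UINT width, UINT height,
                             ID3D10Texture2D** tex,
                             ID3D10DepthStencilView** dsv,
                             ID3D10ShaderResourceView** srv)
{
    D3D10_TEXTURE2D_DESC td = {};
    td.Width = width;
    td.Height = height;
    td.MipLevels = 1;
    td.ArraySize = 1;
    td.Format = DXGI_FORMAT_R24G8_TYPELESS;   // typeless so two views can alias it
    td.SampleDesc.Count = 4;                  // 4x MSAA
    td.Usage = D3D10_USAGE_DEFAULT;
    td.BindFlags = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
    if (FAILED(device->CreateTexture2D(&td, nullptr, tex)))
        return false;

    D3D10_DEPTH_STENCIL_VIEW_DESC dsvd = {};
    dsvd.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dsvd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    if (FAILED(device->CreateDepthStencilView(*tex, &dsvd, dsv)))
        return false;

    D3D10_SHADER_RESOURCE_VIEW_DESC srvd = {};
    srvd.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    srvd.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    return SUCCEEDED(device->CreateShaderResourceView(*tex, &srvd, srv));
}
```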

Quote:
IMO, what I see is Nvidia wanting it to run their way on their hardware; they don't care how it runs on anybody else's. MS is supposed to set the standard for DX10, but when a gfx chip maker that holds 60% of the market says, "well, our hardware can't handle that," what is MS to do, kill their whole OS because of one company? Of course not, so they change minor things to bypass what can't be run.

What I wonder is this: why is Nvidia's architecture still not up to snuff and able to run full DX10 as it was meant to? I think DX10.1 was really just implementing, in the small details, features that were supposed to be there from the beginning. Of course it would cost Nvidia a lot of money; their hardware was a failure from the beginning. And why is it rumored that, yet again, Nvidia won't have hardware capable of running DX10.1 on the new GT200 cores?
I agree with you. I think Nvidia just looked at the situation and didn't want to spend money on a new design, so they continue to use the G80 as a base for everything. I suspect that even the GT200 is a direct derivative of the G80. I'm not saying that's a bad thing in itself, since ATi has been doing the exact same thing since the HD2000 series, but Nvidia is starting from a design that never correctly implemented DX10 in the first place.

And you're right, GT200 still does not support DX10.1 and Nvidia hasn't announced any plans of supporting it in the future.
Edited by darkcloud89 - 6/3/08 at 12:36pm
post #57 of 68
Quote:
Originally Posted by liermam View Post
It's really Microsoft's mess that causes these problems... nVidia rejected DX10.1 when Microsoft brought it to the table because of silicon costs, yes. And it is true that ATI pulled it off, obviously... Again, I blame this on Microsoft's inability to get the job done with DX10 alone, not nVidia's stubborn refusal to completely wipe away the GeForce 8 series.

Regardless of why nvidia didn't implement DX10.1, there is NO EXCUSE for using their influence over a developer to have them remove a feature that improves performance for DX10.1 cards, no excuse at all!
post #58 of 68
Quote:
Originally Posted by Urufu_Shinjiro View Post
Regardless of why nvidia didn't implement DX10.1, there is NO EXCUSE for using their influence over a developer to have them remove a feature that improves performance for DX10.1 cards, no excuse at all!
I agree, that's just total crap. I mean, there is no proof that DX10.1 support would have hurt Nvidia cards' performance in the game, so why prevent the developer from implementing the latest technology, which gives ATI users better image quality at a lower performance hit?
They were unhappy with results showing ATI on top with cards much cheaper than theirs, so they had to bring in the great equalizer... The Way It's Meant To Be Played.
post #59 of 68
That's the annoying thing about it. AMD/ATI is suffering because Nvidia has a bigger hold on the market; Nvidia is winning with inferior technology and claiming it's better.

Personally, I don't care whether Nvidia is on top or not; I go with performance. But I certainly won't go with a company that doesn't have the consumer in mind.
post #60 of 68
So Nvidia is doing the same thing Intel is doing to AMD in the CPU world? Can AMD ever get a break? Perhaps AMD should take Nvidia to court over this!
    