Registered · 1,170 Posts · Discussion Starter #1
I'm not sure whether I should get an 850W PSU and two 8800 GTXs, or a 700W PSU with a single 8800 GTX that I'd upgrade to a better card down the road.

I hear that SLI doesn't help much unless the resolution is very high. Does anyone know if there's a big performance difference between two 8800 GTXs and one at 1280x1024? If the cost isn't worth the gain at that resolution, I'll just get a single 8800 GTX and upgrade to a better card in two or more years.

I just want to be able to run Crysis and Alan Wake at maxed-out settings when they're released, and I'm not sure a single 8800 GTX can hold 60 fps with all the eye candy turned up.

All helpful replies will be awarded with +reps!
 

Premium Member · 2,767 Posts
SLI is definitely NOT worth it for 1280x1024. You have to get up to 1600x1200 at minimum to even start seeing a difference.

Edit: You should also note that your CPU would be a serious bottleneck for a pair of GTXs. Even a highly overclocked QX6700 was shown to be the bottleneck in GTX SLI benchmark testing.
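Here's a quick toy model of why resolution matters (a C++ sketch; every millisecond number is made up purely for illustration): the frame rate is set by whichever of the CPU or GPU takes longer per frame, so a second GPU only helps once the GPU is the slower side.

Code:
#include <algorithm>
#include <cstdio>

// Toy frame-time model: the slower of CPU and GPU sets the frame rate.
// All numbers are invented for illustration, not benchmarks.
int main() {
    const double cpu_ms = 12.0;  // hypothetical CPU time per frame
    const double gpu_lo = 9.0;   // hypothetical single-GTX time at 1280x1024
    const double gpu_hi = 28.0;  // hypothetical single-GTX time at 2560x1600

    auto fps = [](double cpu, double gpu) { return 1000.0 / std::max(cpu, gpu); };

    // At 1280x1024 the CPU is already the slow side, so halving the GPU
    // time (idealized SLI scaling) changes nothing:
    std::printf("1280x1024 single: %.0f fps, SLI: %.0f fps\n",
                fps(cpu_ms, gpu_lo), fps(cpu_ms, gpu_lo / 2));

    // At 2560x1600 the GPU dominates, so SLI nearly doubles the frame rate:
    std::printf("2560x1600 single: %.0f fps, SLI: %.0f fps\n",
                fps(cpu_ms, gpu_hi), fps(cpu_ms, gpu_hi / 2));
    return 0;
}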
 

Banned · 16,364 Posts
Do you really need to spend that much money on a second GPU that your CPU (or anyone else's, for that matter) can't even fully use?

Read the posts in the News section: QX6700s are bottlenecking with even one 8800. Something just tells me your CPU isn't up to the task either.
 

Registered · 1,170 Posts · Discussion Starter #5
Hmm, then what about a 20-inch monitor at 1680x1050? Oh, and I'm going to build a brand-new rig: Core 2 Duo E6600, Abit AW9D-MAX (i975X), and a lot of other stuff.
 

Premium Member · 2,767 Posts
Quote:

Originally Posted by Akatsuki No Tobi
Hmm, then what about a 20-inch monitor at 1680x1050?
While the 20" would definitely be nice, it still won't warrant the need for such graphics overkill. Heck, my mid-low system runs all my games at 1680x1050 with med-high settings. If you want to start noticing the power advantage of the GTX go for 30" 2560x1600.
 

Premium Member · 11,043 Posts
IMO, I would only recommend the 8800 GTX, or SLI'd 8800 GTXs, if you had at minimum a Core 2 Duo E6600 and a high-spec 24" LCD. With your CPU you're really not going to get the most out of one 8800 GTX, let alone SLI, as it will be a terrible bottleneck.
 
  • Rep+
Reactions: Akatsuki No Tobi

Banned · 16,364 Posts
If you're thinking of 8800s someday, then go bigger. Nvidia recommends a 25" display for the best resolution on a 7900 GT, so... you can figure out the rest. I couldn't afford one that big, so I got a high-quality 21": an HP f2105 widescreen. Its top resolution is 1680x1050, and it has a 1000:1 contrast ratio.

*Oblivion with HDR at the highest resolution is pleasing to the eye, to say the least.
 

Registered · 1,170 Posts · Discussion Starter #10
I hear, though, that DX10 is going to shift most of the gaming workload onto the GPU and take a load off the processor.
 

Banned · 404 Posts
Quote:

Originally Posted by Akatsuki No Tobi
I hear, though, that DX10 is going to shift most of the gaming workload onto the GPU and take a load off the processor.
DX10: The Future of PC Gaming

http://www.bit-tech.net/hardware/200..._gaming/1.html

Quote:
Lower CPU Overheads:
Firstly, the architects have worked to alleviate overhead issues by redesigning the performance-critical portions of the API. As a result, CPU overhead should no longer be a problem with DirectX 10 class hardware when it is running on a DirectX 10 code path. In part this is down to the new driver model (WDDM), which keeps as much of the driver in user mode as possible, but it's not the only thing that Microsoft has changed.

Because of the API redesign, the cost of draw calls and state changes has been massively reduced - the architects have achieved this by implementing some new features that result in less CPU intervention. These features include texture arrays, predicated draw and stream out - we will come to these shortly.

The final reduction in CPU overhead was achieved during the resource validation stage. In DirectX 9, resources have to be validated every time an object is used in a frame (i.e. millions of times), while resource validation is completed when the objects are created in DirectX 10. Objects are only created once, meaning that resource validation only needs to occur once too, resulting in a huge overhead reduction.
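To make that resource-validation point concrete, here's a tiny sketch (the types and function names below are invented for illustration, NOT real Direct3D calls): in the DX9 model the validation cost is paid on every use, every frame, while in the DX10 model it's paid once when the object is created.

Code:
#include <cstdio>

// Invented pseudo-API, not real Direct3D: it just shows where the
// validation cost lands in each model.
struct Texture { bool validated = false; };

// DX9-style: the runtime re-validates a resource every time it is used,
// which across all objects can mean millions of checks per frame.
void bind_dx9_style(Texture& t) {
    t.validated = true;                // validation work on every use
    std::printf("validate + bind\n");
}

// DX10-style: validation happens once, at creation time.
Texture create_dx10_style() {
    Texture t;
    t.validated = true;                // validation cost paid up front
    return t;
}

void bind_dx10_style(const Texture&) {
    std::printf("bind only\n");        // no per-use validation cost
}

int main() {
    Texture t9;
    for (int use = 0; use < 3; ++use)
        bind_dx9_style(t9);            // cost scales with number of uses

    const Texture t10 = create_dx10_style();
    for (int use = 0; use < 3; ++use)
        bind_dx10_style(t10);          // binding stays cheap
    return 0;
}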
 

Premium Member · 11,043 Posts
The way it's looking, the only CPUs that won't bottleneck these GPUs are Intel's 45nm CPUs and/or AMD's K8L. Although by then the 8800 GTX will be old news anyhow... buying high-end hardware is always a lose-lose situation, IMO.
 
  • Rep+
Reactions: Akatsuki No Tobi

Banned · 404 Posts
Quote:

Originally Posted by Mhill2029
The way it's looking, the only CPUs that won't bottleneck these GPUs are Intel's 45nm CPUs and/or AMD's K8L. Although by then the 8800 GTX will be old news anyhow... buying high-end hardware is always a lose-lose situation, IMO.
Yes, that's the unfortunate part of buying hardware. Usually, when you buy something ultra-high-end, something better comes out a few weeks or months later...
 

Banned · 16,364 Posts
Wait until DX10 comes out before you buy an 8800. Why buy something you can't fully use? It's like having to leave half of the wrapping paper on it or something.
Besides, it'll be loads cheaper by then, and the 8900 GTXs will be out too!
 

Banned · 2,069 Posts
When DX10 comes out (Crysis is D3D10), CPU bottlenecks will be a thing of the past. I'd be willing to bet a dual-core A64 will benchmark similarly to a Core 2 in DX10 benchmarks and games. Since so much load is taken off the CPU, there will be virtually no bottleneck.
 

Premium Member · 11,043 Posts
Also, too much emphasis is being placed on DX10. Would the 8800 GTS/GTX still be in as much demand if they were DX9-only? Most likely NOT, although there's always someone who wants the BEST, so they'd probably still sell like any new hardware.

The 8800 GTS/GTX are fantastic powerhouses, but they need equally powerful CPUs to do them any justice. Also, look at how many titles are going to use DX10 off the shelf... erm... not many.

With the R600 and better CPUs coming from AMD and Intel, now is the worst time to be forking out lots of cash on the latest hardware. Personally, I'd sit it out and watch DX10 and Vista arrive before making any decisions.
 
  • Rep+
Reactions: Akatsuki No Tobi

Registered · 1,170 Posts · Discussion Starter #20
Well, as you can see from my computer specs, I've been saving up money for quite some time, about 1.5 years I think. I was so tempted to get an X1900XT when it came out, but I kept saving.

Now this card is out and I'm very tempted to get it, and based on all the reviews I've read, I think it would be a good investment. So I'll stick with a single 8800 GTX and get myself a 20-inch screen. Does anyone know if a second 8800 GTX would help at 1600x1200?
 