Overclock.net

1 - 14 of 14 Posts

Graphics Junkie · 995 Posts · Discussion Starter #1
OK, so I've read in a few places that a dual-GPU card like the GTX 295 or the ATI HD 5970 can't use all of its dedicated memory... power... and so on.

So I'm thinking of getting an ATI HD 5870 instead of the HD 5970, simply because of this dual- vs. single-GPU performance difference in 3D applications like LightWave 3D, 3ds Max, Maya, etc.

Could someone shed more light on this before I buy my next GPU?

Thanks!
 

Registered · 1,344 Posts
A Quadro is probably what you're looking for, but I'm not a huge fan of ATI so I'm a little biased; the FireGL would be the comparable ATI card. Also, I wouldn't get a gaming card for those apps.
 

Registered · 2,007 Posts
Quote: Originally Posted by AllLeafs

OK, so I've read in a few places that a dual-GPU card like the GTX 295 or the ATI HD 5970 can't use all of its dedicated memory... power... and so on.

So I'm thinking of getting an ATI HD 5870 instead of the HD 5970, simply because of this dual- vs. single-GPU performance difference in 3D applications like LightWave 3D, 3ds Max, Maya, etc.

Could someone shed more light on this before I buy my next GPU?

Thanks!

A Quadro or a FireGL is what you want if you're doing SPECIFICALLY just 3D software, but if you also want to game on it, I'd recommend the highest-end single-GPU solution you can afford that isn't a workstation card (in which case you should get the 5870).

Workstation cards are NOT meant for gaming.
 

Registered · 1,493 Posts
For some of those specific apps, it's probably a good idea to stick with NVIDIA. Mudbox has a few features that won't work on ATI drivers, and the primary render engine that ships with Max is mental ray, which is owned by NVIDIA.

Right now, dual GPUs really aren't going to do much for you. It's highly likely that some of the progressive rendering systems (V-Ray RT / iray) will support GPU-accelerated rendering in the future; when that happens, of course, you'd want to cram in all the GPU your board will fit. But that's not yet the case (probably not until next year).

So, at least for now, you should stick with the best single GPU you can get. It might also be a good idea to wait for Fermi, for CUDA support.

Anyway, all that said, it still depends on specifically what you're doing. There's a vast array of tasks that can be performed with these apps, and depending on what exactly you're focusing on, GPU performance can matter a lot or not at all.
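On the OP's memory question, here's a back-of-the-envelope sketch. The numbers are the HD 5970's advertised specs (2x 1024 MB on one board), and the pooling behaviour is the standard alternate-frame-rendering caveat: each GPU keeps its own full copy of textures and geometry, so the usable pool is one GPU's share, not the sum. This is a toy illustration, not a driver-level measurement:

```python
def effective_vram_mb(per_gpu_mb, gpu_count):
    # Marketing adds the per-GPU memory together...
    advertised = per_gpu_mb * gpu_count
    # ...but under AFR each GPU mirrors the whole scene, so the
    # largest data set you can actually hold is one GPU's worth.
    usable = per_gpu_mb
    return advertised, usable

# HD 5970: two GPUs, 1024 MB each
advertised, usable = effective_vram_mb(1024, 2)
print(advertised, usable)  # 2048 1024
```

So the "2 GB" card behaves like a 1 GB card for scene data, which is exactly why a single 5870 with the same per-GPU memory isn't giving anything up there.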
 

Registered · 1,493 Posts
Quote: Originally Posted by Core2uu

A Quadro or a FireGL

Total waste of money in the majority of circumstances. If you can't name a specific feature the card gives you that you absolutely must have for your task, then it's a waste of money; outside of a few very specific circumstances, that's generally the case.

Just one of those "if you have to ask, you don't need it" items.
 

Registered · 652 Posts
Quote: Originally Posted by supaspoon

Total waste of money in the majority of circumstances. If you can't name a specific feature the card gives you that you absolutely must have for your task, then it's a waste of money; outside of a few very specific circumstances, that's generally the case.

Just one of those "if you have to ask, you don't need it" items.

This.

Workstation cards date back to the days when 3D apps used OpenGL as their graphics API.

Most 3D apps now support DirectX as well, and workstation cards usually run no better under DirectX than gaming cards do.

Of course, all the major software/hardware vendors would have you believe otherwise; such is the profit margin on workstation cards.
 

Graphics Junkie · 995 Posts · Discussion Starter #9
I thought rendering a 3D scene was all CPU-based? But better render times would be awesome too. Basically, I'm looking for a GPU that can handle a lot of polygons on screen. I don't want to have to keep hiding and unhiding objects to lower the visible polygon count while I manipulate 3D objects, just to stop the scene becoming choppy and freezing while maneuvering in the environment. And yes, I like to game!

I was going to CrossFire two ATI 5870s, but after reading the comments about CUDA and Fermi and all that NVIDIA stuff, I'm thinking I should just wait. Will CUDA and the Fermi architecture really make a difference in quicker rendering and pushing more polygons on screen?
 

Registered · 652 Posts
Most render engines only utilise the CPU, although a few, such as V-Ray, have versions that run on CUDA; I believe an OpenCL version is in the pipeline as well.
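The reason GPU ports are feasible at all is that rendering is embarrassingly parallel: every pixel can be computed independently. Here's a toy sketch of that structure (not any real engine's API; the `shade` function is a made-up stand-in for per-pixel work), farmed out across workers with Python's standard library. Threads stand in here for the thousands of hardware threads a CUDA or OpenCL port would actually use:

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 64, 48

def shade(pixel):
    # Stand-in for the expensive per-pixel work a renderer does
    # (ray casts, shading, sampling). Each pixel is independent.
    x, y = pixel
    return (x * 31 + y * 17) % 256

def render(workers=4):
    pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    # Because no pixel depends on another, the work splits cleanly
    # across any number of workers; map() preserves pixel order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(shade, pixels))

image = render()
print(len(image))  # 3072 samples, one per pixel
```

CPU renderers scale this across cores; the CUDA/OpenCL versions map the same per-pixel loop onto the GPU, which is where the big speedups come from.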
 

Registered · 1,493 Posts
Quote: Originally Posted by moward

Most render engines only utilise the CPU, although a few, such as V-Ray, have versions that run on CUDA; I believe an OpenCL version is in the pipeline as well.

Unless it's a very recent development, V-Ray is the only "mainstream" rendering engine (there are a few lesser-known ones) that has a progressive (pseudo-realtime) rendering system. However, the current commercially available version does not yet support GPU rendering (CUDA). There are demo videos out there showing it off, but it's not yet something you can actually get. Mental ray's iray should support it as well, but it looks like it will not ship with the Max/Maya 2011 versions, so it's likely the end of the year at the earliest (when AD releases 2012, or maybe in a service pack).

If we're lucky, V-Ray will release a new version for Autodesk's 2011 apps that includes the feature. /fingers crossed

But, at least for the moment, it's not yet applicable.
 

Registered · 2,007 Posts
Quote: Originally Posted by supaspoon

Unless it's a very recent development, V-Ray is the only "mainstream" rendering engine (there are a few lesser-known ones) that has a progressive (pseudo-realtime) rendering system. However, the current commercially available version does not yet support GPU rendering (CUDA). There are demo videos out there showing it off, but it's not yet something you can actually get. Mental ray's iray should support it as well, but it looks like it will not ship with the Max/Maya 2011 versions, so it's likely the end of the year at the earliest (when AD releases 2012, or maybe in a service pack).

If we're lucky, V-Ray will release a new version for Autodesk's 2011 apps that includes the feature. /fingers crossed

But, at least for the moment, it's not yet applicable.

And here is where Fermi comes into play.

No, but seriously, I'm hoping that with its release we'll see some REAL progress in GPU rendering, not limited to one proprietary commercial app like V-Ray but with a larger scope than that. It'll still be a really long time before anything like that gets incorporated into Blender, though.
 

Graphics Junkie · 995 Posts · Discussion Starter #13
OK, I'll repost in this section after March 26th to finally decide on the ideal GPU for 3D applications. Thanks for your input, guys.
 

Registered · 1 Post
Hi there, guys. Any news? It's October now, and I was hoping for an update. I'd like a GPU meant for 3D software as well, while still being flexible enough for today's gaming. Aside from that, it would be useful to hear what you'd recommend if money weren't a problem, and the best possible option on a strained budget...

PS: Unfortunately I'm not knowledgeable in these waters, so I can only understand noob terms...
 