

Open1Your1Eyes0 · Premium Member · 5,777 Posts
Discussion Starter · #1


Quote:


We have asked around, and most Nvidia partners have told us that they will be happy once single-chip Fermi cards start selling, and all of them expect healthy demand for these cards. They are preparing to start selling the GeForce GTX 480 and GeForce GTX 470, and at this time most of them are in the dark about a possible dual-GPU Fermi-based card.

Since such a card would have a huge TDP, around 400W in the best-case scenario, there is a big chance that Nvidia will wait for some future chip, e.g. a mobile version of Fermi, to make such a card.

Our sources suggest that they are not aware of any dual Fermi slated to appear this spring. If all goes well, we might see the dual Fermi, or some derivative of it, around Computex time, in the first days of June 2010.

Until the dual Fermi comes out, the Radeon HD 5970 remains king of the hill. Who would have thought that buying this card would prove to be such a great investment?

Source
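For a sense of where that ~400W best-case figure might come from, here is a minimal back-of-envelope sketch. The 250W GTX 480 TDP is as reported at the time; the 20% derate is purely an illustrative assumption about dual-GPU binning, not a spec.

```python
# Back-of-envelope for the ~400W best-case figure quoted above. Assumes
# the GTX 480's reported 250W TDP and a ~20% per-GPU derate from binning
# and down-clocking (an illustrative assumption, not a spec).
single_gpu_tdp_w = 250   # reported GTX 480 TDP
derate = 0.80            # assumed dual-GPU binning/down-clock factor

dual_tdp_w = 2 * single_gpu_tdp_w * derate
print(f"Estimated dual-Fermi TDP: ~{dual_tdp_w:.0f}W")  # ~400W
```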
 

Brutuz · Z-80 > i9 · 17,244 Posts
I doubt they'll have one before 32nm is ready... which will be the next generation of cards anyway.
Reducing 215W to 150W isn't easy.
 

BizzareRide · Premium Member · 6,160 Posts
28nm is the next step for GPU lithography, if I'm not mistaken.

Based on what we know so far, Fermi should be able to get close to the 5970's stock performance at higher clock speeds if heat weren't such a limiting factor.

I say this because nVidia's core design is more complex than ATi's stream processor: we're talking 16-way ALUs for nVidia's CUDA cores versus 5-way ALUs for ATi's stream processors (which is why ATi has 1600 of them and nVidia has as many as 512).

Don't forget that these CUDA cores are arranged in a MIMD array rather than a SIMD array like ATi's stream processors.
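To put rough numbers on those ALU counts, here is a minimal sketch of theoretical peak throughput. The clocks are assumed reference values for the era (GTX 480 shader clock ~1401MHz, HD 5870 core ~850MHz), and 2 FLOPs/ALU/cycle assumes a multiply-add issued every cycle.

```python
# Theoretical peak single-precision throughput for the two designs above.
# Clock speeds are assumed reference values for the era; 2 FLOPs per ALU
# per cycle assumes a multiply-add issued every cycle.
def peak_gflops(alus: int, clock_mhz: float, flops_per_cycle: int = 2) -> float:
    return alus * clock_mhz * flops_per_cycle / 1000.0

# nVidia: up to 512 CUDA cores on a full Fermi die
print(peak_gflops(512, 1401))    # ~1434.6 GFLOPS
# ATi: 1600 stream processors (320 units x 5-way VLIW) on the HD 5870
print(peak_gflops(1600, 850))    # 2720.0 GFLOPS
```

On paper ATi's raw peak is far higher; real-world parity comes down to how well each architecture keeps its ALUs fed.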

If only they could get heat and power consumption under control; nV has always had problems in that area, at least since GPUs switched to the unified shader architecture with the G80 core back in '06.

Their philosophies are different as well. nV starts with a large core design and scales down to weaker cores from there; ATi starts with efficiency and then builds a monster on top of that efficiency (5970). Yes, that beast is hot and power hungry, but in return you get the most powerful card in the world.
 

Registered · 1,227 Posts
Isn't it because power consumption would be too high?
Edit: rushed to comment on the article before even reading... fail
 

Banned · 11,112 Posts
Quote:

Originally Posted by Chrono Detector

Yeah, it's quite shocking that a 480 GTX TDP is close to a 5970's, and making a dual-chip card isn't possible for NVIDIA at the moment.

GTX 480*... let's get it right this time?
 

Banned · 4,563 Posts
I think they will have a dual Fermi announced, if not in the process of being made, within the next year. Nvidia doesn't like to be in second place when it comes to GPU performance, so I can't see them putting this off for too long.
 

Premium Member · 5,295 Posts
This is one reason why the price of the 5970 will not drop when the 470/480 are released, and it won't any time soon.

Perhaps, until a dual-GPU Fermi arrives on the scene, Nvidia could close the performance gap somewhat with a refresh of the 480 that enables all 512 SPs and adds a factory overclock. ATI, in turn, might lower the price of the 5970 to improve its price/performance ratio against such a refresh.
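As a rough sketch of how much such a refresh could buy: the GTX 480 ships with 480 of 512 SPs enabled, and the clock figures below are illustrative assumptions, not announced specifications.

```python
# Rough headroom estimate for the refresh idea above: all 512 SPs enabled
# (the GTX 480 ships with 480) plus a modest factory overclock. The clock
# figures are illustrative assumptions, not announced specifications.
base_sps, full_sps = 480, 512
base_clock_mhz, oc_clock_mhz = 700, 750   # assumed core clocks

scaling = (full_sps / base_sps) * (oc_clock_mhz / base_clock_mhz)
print(f"Theoretical uplift: ~{(scaling - 1) * 100:.0f}%")   # ~14%
```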
 

vicious_fishes · Banned · 3,579 Posts
Quote:

Originally Posted by Brutuz

I doubt they'll have one before 32nm is ready... which will be the next generation of cards anyway.
Reducing 215W to 150W isn't easy.

28nm is the next gpu size
 

Brutuz · Z-80 > i9 · 17,244 Posts
Quote:

Originally Posted by BizzareRide

28nm is the next step for GPU lithography, if I'm not mistaken.

But GPUs can still use 32nm if they wanted to; personally, if I were nVidia I'd try to get to 32nm as soon as possible (at the very least, it would enable 512 SP GPUs at 750MHz).

Quote:

Originally Posted by vicious_fishes

28nm is the next gpu size


You mean it's the next half node; GPUs can use full nodes if they want, for example the 65nm GPUs (and the 180nm ones, iirc).
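For reference, a quick sketch of the full-node/half-node cadence being discussed. These are commonly cited figures; exact lists vary by foundry, so treat them as approximate.

```python
# Quick reference for the full-node / half-node cadence mentioned above
# (commonly cited figures; exact lists vary by foundry).
full_nodes_nm = [180, 130, 90, 65, 45, 32]
half_nodes_nm = [150, 110, 80, 55, 40, 28]

for full, half in zip(full_nodes_nm, half_nodes_nm):
    print(f"full node {full}nm -> half node {half}nm")
```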
 

Open1Your1Eyes0 · Premium Member · 5,777 Posts
Discussion Starter · #12
Quote:

Originally Posted by Chrono Detector

Yeah, it's quite shocking that a 480 GTX TDP is close to a 5970's, and making a dual-chip card isn't possible for NVIDIA at the moment.

It is possible if they use three PCIe power connectors, but I think the whole point is that they're trying to avoid that for now, since doing so would limit the card's potential market.
 

Tator Tot · Premium Member · 36,139 Posts
Quote:

Originally Posted by Brutuz
But GPUs can still use 32nm if they wanted to; personally, if I were nVidia I'd try to get to 32nm as soon as possible (at the very least, it would enable 512 SP GPUs at 750MHz).
TSMC skipped 32nm. Global Foundries is the only fab doing 32nm besides Intel.

But it's more cost-effective to skip 32nm overall for just about everything. The difference between 40nm & 28nm is much greater than between 40nm & 32nm.
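A minimal sketch of the first-order argument: ideal die area shrinks with the square of the feature-size ratio, and real shrinks never hit the ideal, so these are upper bounds on the benefit.

```python
# First-order area scaling: ideal die area shrinks with the square of the
# feature-size ratio. Real shrinks fall short of this, so treat these as
# upper bounds on the benefit.
for node_nm in (32, 28):
    area_ratio = (node_nm / 40) ** 2
    print(f"40nm -> {node_nm}nm: die area ~{area_ratio:.0%} of original")
# 40nm -> 32nm: ~64% of original
# 40nm -> 28nm: ~49% of original
```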

Also, GPUs & CPUs are not limited to half nodes or full nodes (Via has done a half-node CPU, for example).
It's just whichever the company chooses to use. Intel/AMD have stuck to full nodes so far; it's more cost-effective for bulk production of a line that's going to be selling for more than 6-9 months (unlike GPUs).

Quote:

Originally Posted by Open1Your1Eyes0
It is possible if they use three PCIe power connectors, but I think the whole point is that they're trying to avoid that for now, since doing so would limit the card's potential market.
They can't use three PCIe power connectors on a reference design; that's out of spec (i.e. GPUs are limited to a 300W ceiling right now by the design specs). So unless it was three 6-pins, they'd be going over spec.
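For reference, a minimal sketch of the budget math behind that ceiling, assuming the connector ratings of the day (75W from the slot, 75W per 6-pin, 150W per 8-pin, 300W total board cap):

```python
# Power-budget math behind the 300W ceiling: per the PCIe spec of the
# time, the slot supplies up to 75W, each 6-pin 75W, each 8-pin 150W,
# with 300W as the total board cap.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_budget_w(six_pins: int = 0, eight_pins: int = 0) -> int:
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_budget_w(six_pins=1, eight_pins=1))  # 300 -- at the spec ceiling
print(board_budget_w(six_pins=3))                # 300 -- the "three 6-pins" case, still in spec
print(board_budget_w(eight_pins=2))              # 375 -- over the 300W cap
```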
 

BounouGod · Registered · 590 Posts
Quote:

Originally Posted by Tator Tot

They can't use three PCIe power connectors on a reference design; that's out of spec (i.e. GPUs are limited to a 300W ceiling right now by the design specs). So unless it was three 6-pins, they'd be going over spec.
Maybe I don't understand the market for $800-1000 video cards, but is that really an issue?

I mean, do a lot of people pay $1000 for a video card but not have the computer to handle it? You'd think that if you can blow almost a grand on a video card, you can afford the PSU that goes along with it.
 

Registered · 386 Posts
Quote:

Originally Posted by BounouGod
Maybe I don't understand the market for $800-1000 video cards, but is that really an issue?

I mean, do a lot of people pay $1000 for a video card but not have the computer to handle it? You'd think that if you can blow almost a grand on a video card, you can afford the PSU that goes along with it.
I don't think the consumer has any say in how much power the card is allowed to use.
 

Tator Tot · Premium Member · 36,139 Posts
Quote:

Originally Posted by BounouGod
Maybe I don't understand the market for $800-1000 video cards, but is that really an issue?

I mean, do a lot of people pay $1000 for a video card but not have the computer to handle it? You'd think that if you can blow almost a grand on a video card, you can afford the PSU that goes along with it.
It's not about the PSU. It's about industry standards.

Industry standards state that a PCIe GPU cannot exceed 300 watts.
 

Registered · 1,534 Posts
They should just market it without a heatsink and no recommended power draw, and let the hardcore enthusiasts take care of it. I'm sure someone could easily get a full-coverage waterblock then.
 

BounouGod · Registered · 590 Posts
Quote:

Originally Posted by Tator Tot
It's not about the PSU. It's about industry standards.

Industry standards state that a PCIe GPU cannot exceed 300 watts.

Right, if you want to stick within the standards, but can't you make a GPU outside of spec and still sell it? I know it means places like Dell won't touch it, but when you're selling a $1000 GPU you have to expect your target consumer to be a bit more tech-savvy than average, no?
 

BounouGod · Registered · 590 Posts
Quote:

Originally Posted by Hey Zeus
Just buy two 480s if you want a dual setup


Or 3 and a small nuclear power plant to power them!
 