
· Premium Member
Joined
·
14,321 Posts
Discussion Starter · #1 ·
I'm opening this thread so that all members can share information and discuss the GeForce GTX200 series GPUs here. I greatly appreciate your support.

The GeForce GTX200 series specs listed here are unofficial. I will update them as soon as official data is available.

If you own a GeForce GTX200 series GPU and would like to share your information (which I encourage you to do), please post it here and I will add you to the GeForce GTX200 series owners' list.

Currently, we don't have a feature that allows multiple members to maintain a thread. That matters because many members here have exceptional knowledge and experience, and their input will definitely enrich this thread and make it more informative. So I encourage members to post information related to the GeForce GTX200 series GPUs, along with suggestions on how to maintain this thread. Post your information as a reply in this thread, and I will create a link to your post under your name. That way you keep all the credit, and viewers can jump straight to your post to read it. If you find a member's information helpful, please give them rep; it is the best way to show your appreciation for their effort.

Geforce GTX200 Series GPU Thread Signature tag

Code:

[center][url="http://www.overclock.net/nvidia/334363-geforce-gtx200-series-gpu-information-thread.html"]:bruce:[b]Geforce GTX200 Series GPU Information Thread[/b][/url][/center]

To add this to your signature, go to User CP, edit your signature, copy the code above, paste it there, and save.

Geforce GTX200 Series GPU Information Thread

Table of Contents

  • Latest GeForce GTX200 series GPU Information And Discussion
  • GeForce GTX200 series GPU Owners
  • GeForce GTX200 series GPU Reference Specs
  • Retail GeForce GTX200 series GPU Specs and Manufacturers List
  • GeForce GTX200 series GPU Pictures Gallery
  • Reference Benchmark

Latest GeForce GTX200 series GPU Information And Discussion
[EXP] More Pics of GTX295 Leaked

Details Here

Some GTX280/260 Retail Information

GTX200 Owner List

Below is a list of some of the current GTX200 owners. If you would like to share your information, please provide the following: Owner Name, Manufacturer, Model #, Memory Size, Core Clock (stock/OC), Shader Clock (stock/OC), Memory Clock (stock/OC), SLI, GPU HS (heatsink), GPU Idle Temperature, GPU Load Temperature, Processor, CPU Speed, System Memory, Motherboard, 3DMark06/Vantage score, and OS (operating system).

Please feel free to contact me to change your information.



Retail GTX260/280 Specs, Manufacturers List

Here is a list of the current GTX260/280 cards that are available as of today. This list will be updated as more information is released. The default GTX280 core clock is 602 MHz, shader clock 1296 MHz, and memory clock 1107 MHz. The default GTX260 core clock is 576 MHz, shader clock 1242 MHz, and memory clock 999 MHz. Data that is not specified by the manufacturer will be denoted with an asterisk (*).
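
For quick reference, here are the same reference clocks as a small Python snippet (just the unofficial numbers quoted above; I'll update it if the official figures differ), including how much higher the GTX280's defaults are than the GTX260's:

Code:

# Unofficial reference clocks quoted above (MHz); update when official data is available.
reference_clocks = {
    "GTX280": {"core": 602, "shader": 1296, "memory": 1107},
    "GTX260": {"core": 576, "shader": 1242, "memory": 999},
}

for domain in ("core", "shader", "memory"):
    c280 = reference_clocks["GTX280"][domain]
    c260 = reference_clocks["GTX260"][domain]
    print(f"{domain}: GTX280 {c280} MHz vs GTX260 {c260} MHz "
          f"({(c280 / c260 - 1) * 100:.1f}% higher)")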



GTX280 & GTX260 Review

NVIDIA GTX 295 Previews
 

· Registered
Joined
·
1,971 Posts
Quote:

Originally Posted by thejamesman
Is the GTX200 even...out?
No, it's not, but it's not too early to hoover up info.

http://www.dailytech.com/Nextgen+NVI...ticle11842.htm

Quote:
NVIDIA's upcoming Summer 2008 lineup gets some additional details

Later this week NVIDIA will enact an embargo on its upcoming next-generation graphics core, codenamed D10U. The launch schedule of this processor, verified by DailyTech, claims the GPU will make its debut as two separate graphics cards, currently named GeForce GTX 280 (D10U-30) and GeForce GTX 260 (D10U-20).

The GTX 280 enables all features of the D10U processor; the GTX 260 version will consist of a significantly cut-down version of the same GPU. The D10U-30 will enable all 240 unified stream processors designed into the processor. NVIDIA documentation claims these second-generation unified shaders perform 50 percent better than the shaders found on the D9 cards released earlier this year.

The main difference between the two new GeForce GTX variants revolves around the number of shaders and memory bus width. Most importantly, NVIDIA disables 48 stream processors on the GTX 260. GTX 280 ships with a 512-bit memory bus capable of supporting 1GB GDDR3 memory; the GTX 260 alternative has a 448-bit bus with support for 896MB.

GTX 280 and 260 add virtually all of the same features as GeForce 9800GTX: PCIe 2.0, OpenGL 2.1, SLI and PureVideoHD. The company also claims both cards will support two SLI-risers for 3-way SLI support.

Unlike the upcoming AMD Radeon 4000 series, currently scheduled to launch in early June, the D10U chipset does not support DirectX extensions above 10.0. Next-generation Radeon will also ship with GDDR4 while the June GeForce refresh is confined to just GDDR3.

The GTX series is NVIDIA's first attempt at incorporating the PhysX stream engine into the D10U shader engine. The press decks currently do not shed a lot of information on this support, and the company will likely not elaborate on this before the June 18 launch date.

After NVIDIA purchased PhysX developer AGEIA in February 2008, the company announced all CUDA-enabled processors would support PhysX. NVIDIA has not delivered on this promise yet, though D10U will support CUDA, and therefore PhysX, right out of the gate.

NVIDIA's documentation does not list an estimated street price for the new cards.
DailyTech has a history of ignoring embargoes and posting info before it's due, so this seems pretty solid.
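
In case anyone is wondering how a 448-bit bus ends up at the odd 896MB figure: assuming the boards use 512Mbit (64MB) GDDR3 chips, each on its own 32-bit channel (that's my assumption about the layout, not something from the article), the arithmetic works out like this:

Code:

# Assumption: 512Mbit (64MB) GDDR3 chips, one per 32-bit channel (not stated in the article).
CHIP_BITS = 32   # memory interface width per chip
CHIP_MB = 64     # capacity per chip (512Mbit)

for name, bus_width in (("GTX 280", 512), ("GTX 260", 448)):
    chips = bus_width // CHIP_BITS
    print(f"{name}: {bus_width}-bit bus -> {chips} chips -> {chips * CHIP_MB} MB")

So the 260 looks like the 280 with one 64-bit memory partition (two chips) cut, which is exactly the drop from 1GB/512-bit to 896MB/448-bit.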

Quote:
The GTX series is NVIDIA's first attempt at incorporating the PhysX stream engine into the D10U shader engine.
I thought they said PhysX would be ported to CUDA and supported by all CUDA-enabled GPUs (G80 and upwards). I guess they figured they'd better not throw their old users a bone, and would rather give them a trinket to upgrade for.
 

· Tator Tot Enthusiast
Joined
·
3,330 Posts
Sweet! Will post as much as I can about GTX200 info
 

· Registered
Joined
·
1,489 Posts
Sweet, I should be able to buy this card with my stimulus check that I haven't received yet. It will be easy to wait since I don't have the money yet and I should be receiving it sometime mid June.

edit: I plan on going Intel C2D/quad so the beauty won't suffer from a bottleneck.
 

· Initializing...
Joined
·
24,062 Posts
Good info here. At least this card will do better than the 9xxx series, I hope.
 

· Premium Member
Joined
·
14,321 Posts
Discussion Starter · #10 ·
Quote:
Translated Version




With the launch drawing near, more and more spy shots of the flagship GTX 280 are turning up. The pictures below are enlarged versions of the small images posted yesterday, and they let us make out some details of the card. Eight memory chips are laid out on the front of the board and another eight on the back, for a total of 1GB on a 512-bit bus. As reported before, the GPU appears to use G80-style packaging with a ceramic heatspreader, but the metal reinforcing ring has been omitted.

In addition, the power circuitry is not the six-phase design mentioned earlier; we estimate it may be three-phase. Clearer pictures of the power-control chips are needed to confirm the exact number of phases.


Source
 

· Iconoclast
Joined
·
34,391 Posts
Quote:


Originally Posted by Perry

Dang! Look at the IHS on the core. That thing must crank the heat if they've resorted to it.

At least...I think that's an IHS.

It is an IHS.

It's there more to protect the core and minimize the need for a perfect base on the stock heatsinks. An IHS does not help cooling.
 

· Premium Member
Joined
·
14,321 Posts
Discussion Starter · #15 ·
Quote:


In just under a month Nvidia will release the GeForce GTX 280 and 260 cards, and because of them the homes of big spenders will get much hotter. The G(T)200 chip that powers both GTX models is manufactured using 65nm technology and will be one of the hottest, if not the hottest, desktop GPUs around, so naturally it will need appropriate cooling.

Of course Nvidia will be providing a decent (but we're not sure how loud) cooling solution, but those who want their card to not suffer too much, especially during the more extreme summer days, will have to look elsewhere. Coming to the rescue, Aqua Computer has designed and unveiled a waterblock that will work with the GeForce GTX 280 (and likely the 260 too). Made out of crispy copper, the waterblock will have G1/4" connectors and channels optimized for very low flow resistance. No word on pricing yet but we can wait three weeks (BEd: We can't afford the GTX 280 anyway).





Source
 

· Premium Member
Joined
·
14,321 Posts
Discussion Starter · #16 ·
Quote:
GeForce GTX 280 is going to be a mighty beast. The GT200 core impressed us from the very beginning in matters of raw performance. Alas, there has been little information on the real-life performance of the card, other than statements that it runs Crysis fine at certain settings and that it scores 7000 points in the Vantage Extreme profile. We've been provided some more solid information based on performance in Stanford's Folding@home client. A recently presented slide says that GeForce GTX 280 will be capable of folding slightly more than 500 mol/day, which is three times more than what a Radeon HD 3870 can do, about 170 mol/day (according to the slide), or five times more than a PlayStation 3's 100 mol/day.

Original Link
Source
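
A quick sanity check on those ratios, just dividing the figures quoted from the slide:

Code:

# Folding rates quoted from the slide (mol/day).
rates = {"GeForce GTX 280": 500, "Radeon HD 3870": 170, "PlayStation 3": 100}
gtx280 = rates["GeForce GTX 280"]

for device, rate in rates.items():
    if device == "GeForce GTX 280":
        continue
    print(f"GTX 280 vs {device}: {gtx280} / {rate} = {gtx280 / rate:.1f}x")

That gives roughly 2.9x the HD 3870 and 5x the PS3, which matches the "about three times" and "five times" claims.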
 

· Premium Member
Joined
·
8,338 Posts
Quote:


Originally Posted by linskingdom

Source

Any estimated costs lins? i was in the market for a card to get back into gaming, this might be the card for me.
 

· Premium Member
Joined
·
14,321 Posts
Discussion Starter · #18 ·
Quote:


Originally Posted by phospholipid

Any estimated costs lins? i was in the market for a card to get back into gaming, this might be the card for me.

As of today, $499 for the GTX 280, but I don't think this is correct. We should know more as early as next week.
 

· Retired Benchmark Editor
Joined
·
10,215 Posts
$500-$600 sounds about right :/
 

· Premium Member
Joined
·
14,321 Posts
Discussion Starter · #20 ·
Quote:


NVIDIA STAGED ITS regular editors' day, and once again, we were not invited. Luckily, that means we can tell you about it early.

With the usual flair of spin and statistics which almost no one questions for fear of being cut off, NV talked about two new GPUs, the GTX280 and GTX260. The 280 is the big one, the 260 is the mid range, what used to be the GTS.

The 280 has 240 stream processors and runs at a clock of 602MHz, a massive miss on what the firm intended. The processor clock runs at 1296MHz and the memory is at 1107MHz. The high-end part has 1G of GDDR3 at 512b width. This means that they are pretty much stuck offering 1G cards, not a great design choice here.

The 280 has 32 ROPs and feeds them with a six- and eight-pin PCIe connector. Remember NV mocking ATI over the eight-pin when the 2900 launched, and how they said they would never use it? The phrase 'hypocritical worms' comes to mind, especially since it was on their roadmap at the time. This beast takes 236W max, so all those of you who bought mongo PSUs may have to reinvest if they ever get three or four-way SLI functional.

The cards are 10.5-inch parts, and each one will put out 933GFLOPS. Looks like they missed the magic teraflop number by a good margin. Remember we said they missed the clock frequencies by a lot? Here is where it must sting a bit more than usual, sorry NV, no cigar.

The smaller brother, aka low-yield, salvage part, the GTX260 is basically the same chips with 192 SPs and 896M GDDR3. If you are maths-impaired, let me point out that this equates to 24 ROPs.

The clocks are 576MHz GPU and 999MHz memory, with the 896MB of GDDR3 on a 448b memory interface. The power is fed by two six-pin connectors. Power consumption for this 10.5-inch board is 182W.

This may look good on paper, but the die is over 550mm², 576mm² according to Theo, on the usual TSMC 65nm process. If you recall, last quarter NV blamed its tanking margins on the G92 yields.

How do you fix a low yield problem? Well, in Nvidia-land, you simply add massive die area to a part so the yields go farther down. 576 / 325 = 1.77x. Hands up anyone who thinks this will help them meet the margin goals they promised? Remember, markets are closed Monday, so if you sleep in, no loss.

The 260 will be priced at $449 and go up against the ATI 770/4870 costing MUCH less. The 280 will be about 25 per cent faster and quite likely lose badly to the R700, very badly, but cost more, $600+.

As we said, it is going to be an interesting summer.


Source
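
For anyone wondering where the 933GFLOPS figure comes from, it lines up with the usual way NVIDIA counts shader throughput: stream processors x shader clock x 3 FLOPs per clock (dual-issue MAD + MUL). Here's a quick sketch in Python using the clocks quoted above; note the GTX 260 number is just the same formula applied to its quoted clocks, not a figure from the article, and the power line simply adds up the standard PCIe budgets (75W slot, 75W six-pin, 150W eight-pin):

Code:

# GFLOPS as NVIDIA typically counts them: shaders x shader clock x 3 FLOPs/clock (MAD + MUL).
def gflops(shaders, shader_clock_mhz):
    return shaders * shader_clock_mhz * 3 / 1000.0

print(f"GTX 280: {gflops(240, 1296):.0f} GFLOPS")  # ~933, matches the article
print(f"GTX 260: {gflops(192, 1242):.0f} GFLOPS")  # same formula on the 260's quoted clocks

# Power budget for the quoted connector layout vs the 236W max board power:
slot, six_pin, eight_pin = 75, 75, 150
print(f"GTX 280: {slot + six_pin + eight_pin} W available from slot + 6-pin + 8-pin vs 236 W max")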
 