
[TPU] MSI Announce the MSI N260GTX Lightning Series

Quote:
But wait, that's not all from MSI for today. The company also made public the release of the N260GTX Lightning video card. This NVIDIA GeForce GTX 260-based card features 1792 MB of DDR3 memory, a dual-fan design and an exclusive "AirForce Panel" with touch buttons for instant overclocking and voltage regulation.

MSI, the worldwide high-end graphics card and motherboard brand manufacturer, today proudly announced the release of its greatest custom-designed GeForce GTX 260: the MSI N260GTX Lightning. To further satisfy users' demand for high durability and product stability, the MSI N260GTX Lightning introduces a brand new standard for graphics cards called "Military Class", which redefines top quality. In addition, with 1792 MB of double-sized memory and a dual-fan design, the MSI N260GTX Lightning is the most unique GeForce GTX 260 on the market! Furthermore, the exclusive bundled "AirForce Panel" device is included. All you have to do is lightly touch the panel to enjoy higher speeds through overclocking!

Military Class: the new standard of highest quality
The Military Class standard meets international military procurement standards, which include Hi-c CAPs able to operate at 125°C and new solid capacitors with a 2.5 times longer lifespan. With such high-end components, the durability and stability of the MSI N260GTX Lightning far exceed the standard of similar products.

Overclocking by the touch panel - AirForce Panel
The AirForce Panel, an exclusive bundled device, uses a touch panel to provide complete voltage and clock settings. Simply press the panel to overclock the card! In addition, the MSI N260GTX Lightning reserves a voltage measurement pin area for overclocking enthusiasts: you can simply use a multimeter to check the core and memory voltages. Add the AirForce Panel on top of that and everyone can easily reach the professional overclocker's world!

Industry's first 10-phase power PWM
The MSI N260GTX Lightning is the world's first graphics card with a 10-phase power PWM design. Of those 10 phases, 8 are dedicated to the GPU to greatly improve total power delivery and stability. Overclockers never have to worry about a traditional hardware design that can't sustain overclocked voltage and clock settings. The key to breaking a WR (world record) is very simple: just overclock it.

A variety of features
The MSI N260GTX Lightning not only offers the high quality standard of Military Class, but also the AirForce Panel that overclockers love. In addition, the industry's first 10-phase power PWM puts it a class above similar products in this industry. Moreover, the dual PWM fan design with five heat pipes ensures the best heat dissipation and a lower operating temperature. Lastly, the doubled onboard memory makes the MSI N260GTX Lightning the most unique and powerful GeForce GTX 260 graphics card!
source
Man, MSI is on a roll for 2009: new motherboards, new laptops and some pretty nice GPUs.
Problem is, it's an MSI. If it was made by EVGA it would be a great card...
Hmm.. makes me wonder if I should sell my 260s for a 295. I just want a 295 because of my 30 inch monitor; my 260 isn't great at 2560x1600.
Maybe with all that memory, the problem would dissipate.
nice card, horribly worded article (unless it was a translation)
I'll buy it just to rid myself of the hideous reference coolers. Probably gonna be hard to hunt one of these down
Quote:


Originally Posted by Inuyasha1771

Hmm.. makes me wonder if I should sell my 260s for a 295. I just want a 295 because of my 30 inch monitor; my 260 isn't great at 2560x1600.
Maybe with all that memory, the problem would dissipate.

Memory is not shared on SLI / CrossFire setups (the 295 is just SLI on a stick). Each core has access to half the total memory, and the memory contents for both cores are the same.

That came out weird, but what I'm trying to say is that you don't gain any memory with SLI.

As far as this card goes, it's just another thing to appeal to the uninformed. It'll cost an arm and a leg, and for what? A goofy touchscreen thing that does what rivatuner already does for you, and a bunch of memory the card can't use... Awesome.
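For what it's worth, here's a minimal sketch of the memory-mirroring point above, assuming the GTX 295's reference configuration of 2 x 896 MB (the function name is purely illustrative):

Code:

# In SLI/CrossFire the framebuffer contents are mirrored on every GPU,
# so the usable pool is one card's memory, not the advertised total.
def effective_vram_mb(per_gpu_mb, num_gpus):
    advertised_total = per_gpu_mb * num_gpus  # what the box says
    usable = per_gpu_mb                       # what games actually get
    return advertised_total, usable

# GTX 295: marketed as 1792 MB, but it is really 2 x 896 MB mirrored.
print(effective_vram_mb(896, 2))  # (1792, 896)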
Would be cool to run a couple of those in SLI.
It can use that memory; it has a 448-bit memory interface. The fact that it's DDR3 is only part of it, that memory bus has a lot to do with it. If it didn't, then why increase the bus width of the card at all? The DDR3 part only has to do with memory voltage and speeds, that's it. GDDR3 on a 448-bit bus is just as good as GDDR5 or GDDR4. One thing we all forget is latency: the newer memories have much higher latency, which means first-gen parts will generally perform equal to or worse than the older stuff. Want proof? Why was DDR2 adopted so slowly everywhere? Why did Nvidia go back to DDR1 for the FX5900 when it could have used GDDR2? Why did AMD go back to GDDR3 for the 2900XT and not use the GDDR4 from the X1950XTX? Latency: the raw bandwidth might exceed the old stuff, but the higher latency kills it. No, the bus width is fine. If a 256-bit bus can access 1 GB of RAM at an 1800 MHz effective clock, why can't 448-bit access 1792 MB of RAM?
The results will speak for themselves, we just have to wait.
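To put rough numbers on the bus-width argument above, here's a quick back-of-the-envelope sketch using reference clocks; it covers theoretical peak bandwidth only and says nothing about latency or capacity (the helper name is just for illustration):

Code:

# Theoretical peak bandwidth = (bus width in bytes) x (effective transfer rate)
def peak_bandwidth_gbs(bus_width_bits, effective_mts):
    return bus_width_bits / 8 * effective_mts * 1e6 / 1e9

# Stock GTX 260: 448-bit GDDR3 at ~1998 MT/s effective
print(round(peak_bandwidth_gbs(448, 1998), 1))  # ~111.9 GB/s

# Reference HD 4870: 256-bit GDDR5 at ~3600 MT/s effective
print(round(peak_bandwidth_gbs(256, 3600), 1))  # ~115.2 GB/s

Note this calculation is capacity-independent: it only says how fast the memory can be read, not how much memory can be fitted on the card.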
the card has the bandwidth to take advantage of the doubled memory. looks good.
Quote:

Originally Posted by TheSandman
It can use that memory; it has a 448-bit memory interface. The fact that it's DDR3 is only part of it, that memory bus has a lot to do with it. If it didn't, then why increase the bus width of the card at all? The DDR3 part only has to do with memory voltage and speeds, that's it. GDDR3 on a 448-bit bus is just as good as GDDR5 or GDDR4. One thing we all forget is latency: the newer memories have much higher latency, which means first-gen parts will generally perform equal to or worse than the older stuff. Want proof? Why was DDR2 adopted so slowly everywhere? Why did Nvidia go back to DDR1 for the FX5900 when it could have used GDDR2? Why did AMD go back to GDDR3 for the 2900XT and not use the GDDR4 from the X1950XTX? Latency: the raw bandwidth might exceed the old stuff, but the higher latency kills it. No, the bus width is fine. If a 256-bit bus can access 1 GB of RAM at an 1800 MHz effective clock, why can't 448-bit access 1792 MB of RAM?
What? You don't know what the hell you're talking about. A 256-bit bus can NOT make use of 1 GB of RAM. Not if that RAM is GDDR3. If you're talking about the 4870 and its GDDR5 RAM, then you're comparing apples and oranges.

Quote:

Originally Posted by G|F.E.A.D|Killa
the card has the bandwidth to take advantage of the doubled memory. looks good.
Again, same as above. Unless this card is using GDDR5, or has an 896-bit bus, the extra memory is useless. It just appeals to the idiots who think bigger numbers = better card.
Quote:

Originally Posted by Liability
What? You don't know what the hell you're talking about. A 256-bit bus can NOT make use of 1 GB of RAM. Not if that RAM is GDDR3. If you're talking about the 4870 and its GDDR5 RAM, then you're comparing apples and oranges.

Again, same as above. Unless this card is using GDDR5, or has an 896-bit bus, the extra memory is useless. It just appeals to the idiots who think bigger numbers = better card.
And what facts can you base this on, I might ask? GDDR5 on a 256-bit bus performs no better than GDDR3 on a 512-bit bus. I'd go so far as to say that with the increased latency it will perform worse until speeds can be ramped up. Theoretical bandwidth means very little; it's the actual bandwidth that matters. Increased latency means lower actual bandwidth, and you cannot reach your peak theoretical bandwidth. In truth, if we could ramp DDR1 up to the speeds GDDR3 runs at, DDR1 would beat the crap out of GDDR3, in all honesty.
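Putting example numbers on that GDDR5-vs-GDDR3 comparison (reference HD 4870 and GTX 280 clocks, theoretical peak only, so latency and real-world efficiency are ignored):

Code:

# Theoretical peak bandwidth in GB/s for the two configurations being compared
gddr5_256bit = 256 / 8 * 3600e6 / 1e9  # HD 4870-style: 256-bit @ ~3600 MT/s -> ~115.2
gddr3_512bit = 512 / 8 * 2214e6 / 1e9  # GTX 280-style: 512-bit @ ~2214 MT/s -> ~141.7
print(round(gddr5_256bit, 1), round(gddr3_512bit, 1))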
Quote:

Originally Posted by G|F.E.A.D|Killa
the card has the bandwidth to take advantage of the doubled memory. looks good.
Bandwidth or not, there is no game out there that will use 1.8 GiB of video memory, except at the most extreme of settings. And if you do manage to run settings that take up most of that, the GPU couldn't render it at an acceptable frame rate.

Even if the card was running 4GHz GDDR5 on a 4096-bit bus, the GPU would be too slow for it to matter with 1.8GiB of frame/texture buffer in use.

Still, it's good for SLI users.
No, Crysis can use it @ 1920x1080 and higher res, actually.