
GIGABYTE GTX 570 Review/SLI/SB/OC

post #1 of 10
Thread Starter 
GIGABYTE GTX 570 Review
When NVIDIA introduced its Fermi architecture with the GF100 GPUs, the NVIDIA 400 series, we saw some really great new technologies come into play in the GPU world. Now that NVIDIA has introduced its flagship architecture there is a growing demand for a more refined version of the GF100, and GIGABYTE has provided two GTX 570s to show off the new line of Fermi GPUs based on the GF110. When NVIDIA introduced the GF110, GIGABYTE released the GIGABYTE GTX 580, which was in many ways a more refined version of the original GF100-based GTX 480/470. Today we will look at the differences between the GF100 and GF110 GPUs and how the GTX 570 was born to replace the NVIDIA GTX 480/470 series VGAs. We will also take a look at Scalable Link Interface (SLI) with two GIGABYTE GTX 570s on the brand new Sandy Bridge LGA1155 platform with the NF200 chip. I will then look into overclocking, power consumption, and some useful tricks for tuners and benchers alike, such as how to get 3DMark11 to work with these GPUs in SLI.

The GTX 570 is part of GIGABYTE's Hard Core Gamer Series; here are some features:

GIGABYTE product number: GV-N570D5-13I-B

The article will be organized into segments:
Introduction (Specifications) and Packaging
Closer look at the GPU and Motherboard Spacing
In-depth look at the GF110 and new features
The Voltage Regulator and other SMD components
Overclocking and Tricks
Benchmarks Single GPU/O.C. & SLI/SLI O.C.
Conclusion

Evolution of the GTX 570 (differences between cards and specifications)


I highlighted the major aspects and specifications of the top NVIDIA 400 and 500 series VGAs. I highlighted the GTX 570 in red, and then highlighted the identical specifications of the two 400 series cards. From these specs you can gather that the GTX 570 has the same number of processors as the GTX 480, while at the same time having the same memory hardware as the GTX 470. Although it lacks the extra memory and wider memory bus of the GTX 480, its higher clocks make up for it. NVIDIA did a great job lowering the TDP, so cooling this card isn't as tough, nor is overclocking.

Now let's start with some specifications. All Fermi cards, GF100 and GF110 (and the GF104/GTX 460 that we won't cover), have 4 Graphics Processing Clusters (GPCs). Inside each GPC are Streaming Multiprocessors (SMs) and a raster engine. The number of streaming multiprocessors is the same in the GTX 570 and GTX 480, which means everything inside those streaming multiprocessors is included as well (further discussed in the GF110 section). The GTX 580 has one more streaming multiprocessor than the GTX 570. The differences between the GTX 480 and GTX 570 are that the GTX 480 has an extra 8 ROPs and an extra 128KB of L2 cache, as well as the extra memory controller that the GTX 580 has. So while the GPCs are the same as the GTX 570's, the GTX 480's memory is laid out like the GTX 580's. Where the GTX 570 is similar to the GTX 470 is in its memory sub-system; it has the same number of ROPs and so on. The core clock of the 500 series GPUs was increased over their predecessors. I will go more in depth on the architectural improvements and the design of the GPU in the GF110 section.



Beauty and the Beast
I should mention that this GIGABYTE GTX 570 is a reference card. What makes it GIGABYTE-specific are a few intricate differences, among them the GIGABYTE BIOS, the packaging, and the accessories. GIGABYTE also has other versions of the GTX 570 which are super clocked and/or have a much beefier cooler/VRM for overclocking.
Here is the purple box; not too flashy, but it gets the point across: there is a card in here taller than your motherboard is wide that takes up two slots to give you that bang for your buck.



For SLI GTX 570s you need two:



When you open the box you realize why it's so big; inside you have a wealth of VGA accessories.



Included are: 2 x 6pin to Molex PSU power adapters, a mini HDMI to HDMI cable, a DVI to D-sub adapter, driver CD, and finally the manual.

Next we move on to the GPU itself. Outside of its anti-static bag this monster is sleek; unlike the GTX 480 there are no protruding heatpipes. The GIGABYTE GTX 570 uses a vapor chamber that we will go into later in the review.

Now this card’s outputs and connectors are protected by blue GIGABYTE labeled protectors.

This card does come at a premium, but the packaging treats it like royalty. The connector protectors can be left on or removed for 2-way or 3-way SLI.

Closer look at the GPU and Motherboard Spacing
Installation and SLI spacing:
When you install the cards you want them in the 16x PCI-E slots on the motherboard. Most current motherboards, but not all, support SLI technology. Here I have them installed on my P67A-UD7, where an NF200 PCI-E bridge enables two 16x slots for full-speed SLI.

Here we have a single-card installation; please do not forget the two 6-pin PCI-E power connectors!

Now let's get another angle in the case with the SLI bridge attached as well as two 6-pin PCI-E power cables per card, so four 6-pin PCI-E power cables in total:

Above is the correct 2-way SLI installation. Below you can see the indentation in the shroud that maintains proper airflow in SLI configurations. This new shroud design is part of the new GIGABYTE 500 series graphics cards.

A closer look at the card:
When I said sleek, I wasn't joking; take a look at this shot of the front end of the GPU:
Before we move on, here are the dual 6-pin PCI-E power plugs:


Now we disassemble:
Cover off:
As you can see, the heatsink that cools down this 219-watt behemoth is pretty conservative; in a little bit we will explain how and why.
Here is a shot of the fan, which differs from the NVIDIA GTX 400 series blower fan in that it has a plastic stabilizer on top, so there is less vibration and thus less noise and more efficiency:


Another shot of the fan:

Here you can see that the heatsink is surrounded by nice foam protection so there is no direct contact with the shroud:

Now we strip the card down to its bare essentials.

I would like to make a few comments here: the shroud is designed to maximize airflow into the fan and then through the heatsink. The heatsink has a vapor chamber that acts as a giant heat pipe. An evaporator wick vaporizes water, and the vapor carries the heat up to the condenser wick, which is attached to the top of the vapor chamber and to the bottom of the aluminum fins. The air from the fan that cools the aluminum also re-condenses the vapor, which then returns to the evaporator wick to be recycled. The chamber is fully vacuum sealed so that the water has no problem evaporating and condensing quickly.

This vapor chamber is nothing new to GPUs or coolers, but it is new for stock reference cards.
Here we have the heatsink with its vapor chamber of 100% copper construction; the fins on the heatsink are aluminum, as noted earlier.


This thermal paste was not hard, and it seems to be good quality, probably ceramic or something close to it. NVIDIA and GIGABYTE realize that these cards run hotter than your CPU, so they do their best not to cut corners on cooling.
Here you can see a slight reflection. Many will be tempted to lap (sand down to a mirror finish) the surface, but it's not needed and should be avoided, as there is most likely a very thin layer of copper between the surface and the internal wick to maximize heat exchange.


Earlier I skipped the full-body heatsink, which cools down all the hot VRM components such as the low RDS(on) MOSFETs, the drivers for the MOSFETs, and the RAM. The thermal tape/heat pads used are very common in high-end IC cooling solutions; even the smallest driver gets a piece of heat pad.
(It was extremely hard to disassemble; I had to use a #6 star bit, which is very small, and the screws were very tight. They also carry a blue marking that shows if they have been unscrewed, which will void your warranty. BUT you can replace the heatsink for the GPU without touching the full-cover body heatsink.)


There is one more thing I want to mention before we move even deeper into the card's electrical system: these cards are under tight quality control:

Welcome to the GF110 GTX 570 Edition


The greatest downfall of the GF100 series VGAs was their enormous power consumption and large TDP. The GTX 480, the GF100 flagship GPU, had a TDP of 250 watts. These new GF110s boast significantly lower TDPs; I will cover how NVIDIA did this shortly. The basic architecture of the GF100 was carried over into the GF110, with four Graphics Processing Clusters, each containing four Streaming Multiprocessors and one Raster Engine. Each Streaming Multiprocessor contains the following:

32 CUDA Cores
2 Warp Schedulers
1 PolyMorph Engine (contains the tessellator)
4 Special Function Units (transcendental math)
4 Texture units
16 load/store units
64KB L1 Cache and Registers

Outside the GPCs you have 48 ROPs (handling the transfers between the GPCs and the L2 cache) and 768KB of L2 cache, which communicates and transfers data to the six 64-bit memory controllers.


In traditional style, the GTX 570 (as opposed to the GTX 580) has a single Streaming Multiprocessor disabled (greyed out), as well as a disabled memory controller along with its cache and ROPs (greyed out). Instead of 16 SMs we have 15, 512 CUDA cores become 480, and so on. We also have 40 ROP units as opposed to 48 and 640KB of L2 cache as opposed to 768KB. While this does set the GTX 570 apart from the GTX 580, it doesn't affect the GTX 570's ability to carry out tasks, nor does it limit its features. The Fermi architecture was designed so that instructions are carried out in parallel, which is why you see so much redundancy in the core. Here the impact is very minimal; you have a 16-core processor with one core turned off. A bigger impact comes from the reduction in memory from 1.5GB to 1.25GB, accompanied by a reduction in the memory bus width from 384-bit to 320-bit.
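To make the arithmetic behind those figures explicit, here is a minimal sketch (Python, purely for illustration) that derives the GTX 570 and GTX 580 numbers from the per-unit counts discussed above; the constants come from this section, not from any NVIDIA documentation reproduced here:

```python
# Per-unit figures stated in the text: 32 CUDA cores per SM, plus 64 bits of bus,
# 128 KB of L2 and 8 ROPs per memory controller partition.
CORES_PER_SM = 32
BITS_PER_MC = 64
L2_KB_PER_MC = 128
ROPS_PER_MC = 8

def fermi_config(sms, memory_controllers):
    """Derived spec for a Fermi part with the given number of active units."""
    return {
        "cuda_cores": sms * CORES_PER_SM,
        "bus_width_bits": memory_controllers * BITS_PER_MC,
        "l2_cache_kb": memory_controllers * L2_KB_PER_MC,
        "rops": memory_controllers * ROPS_PER_MC,
    }

print("GTX 580:", fermi_config(sms=16, memory_controllers=6))  # 512 cores, 384-bit, 768KB, 48 ROPs
print("GTX 570:", fermi_config(sms=15, memory_controllers=5))  # 480 cores, 320-bit, 640KB, 40 ROPs
```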

Transistor Addition and Rearrangement
You are probably asking yourself: how can a card with the same number of cores and texture units, the same manufacturing process, and the same transistor count have a lower thermal design power than its predecessor? The answer lies within the GF110 core itself. NVIDIA has been quoted as saying:
“Lower leakage transistors on less timing sensitive processing paths and higher speed [leakage] transistors on more critical processing paths”
What does this mean? Transistors are the basic element of every microprocessor; they switch on and off and produce work and heat at the same time. Only part of the energy they consume does useful work, and the wasted portion is what we refer to as leakage. Keep in mind that for a given transistor the ratio of work to heat is fixed, so what you are really managing is the ratio of useful work to leakage. High-leakage transistors are faster and better for overclocking, and they tend to do more work, so how can NVIDIA keep the same design yet reduce the thermal package? The answer is rearrangement and introduction. NVIDIA used two types of transistors in the original GF100 Fermi, which we will call high-leakage and low-leakage. For GF110 they added a sort of middle-ground transistor, with leakage in between the two; we can call it a middle-leakage transistor to keep it simple. NVIDIA then rearranged their transistors: they moved the high-leakage transistors into areas where the most timing-critical work is done, moved the low-leakage transistors where very little work is done, and replaced some of the debatable ones with the middle-leakage type. While the exact locations are unknown, you can see that they really did a great job. They reduced the TDP by almost 15-20% while raising stock clock speeds and keeping the same architecture.
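For readers who want the leakage discussion in more concrete terms, here is a simplified, illustrative power model; the coefficients are placeholders I chose for the example, not measured GF110 values:

```python
# Simplified CMOS power model: total power splits into a switching (useful work)
# term and a leakage (wasted) term. All numbers below are illustrative only.
def chip_power(capacitance_f, voltage_v, frequency_hz, leakage_current_a, activity=0.5):
    dynamic = activity * capacitance_f * voltage_v ** 2 * frequency_hz  # switching power
    static = voltage_v * leakage_current_a                              # leakage power
    return dynamic + static

# Same capacitance, voltage and clock, but a less leaky transistor mix:
# only the static term shrinks, which is roughly the trade described above.
print(chip_power(2e-7, 1.0, 1.46e9, 60.0))  # ~146 W switching + 60 W leakage
print(chip_power(2e-7, 1.0, 1.46e9, 40.0))  # same clocks, ~20 W less leakage
```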


Z-Culling
There are some other things that were also improved in the GF110. One of those is known as Z-culling. Z-culling takes place in the Raster Engine, and NVIDIA claims they improved its efficiency over the GF100.


Now you are probably wondering what Z-culling actually does. In layman's terms, Z-culling is basically throwing out pixels that the user will never see. Think about a level in your favorite game: imagine you are coming up on a house in a clearing in the middle of a forest. While you are still in the forest, parts of the house are blocked by the trees, and there is no reason to have the VGA render the pixels for the areas of the house you can't see in that particular frame; Z-culling removes those pixels so they are not rendered and don't waste resources. The same thing happens as you approach the house: the trees behind the house have portions that should not be rendered. While this technology is nothing new, it is still very important.
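As a toy illustration of the idea (real hardware culls blocks of pixels inside the raster engine, not one pixel at a time in software), a depth-test sketch might look like this:

```python
# A fragment whose depth is farther than what is already in the depth buffer
# (the tree in front of the house) is discarded before any shading work is done.
def z_cull(fragments, depth_buffer):
    """fragments: list of (x, y, depth); depth_buffer: dict[(x, y)] -> nearest depth so far."""
    survivors = []
    for x, y, depth in fragments:
        nearest = depth_buffer.get((x, y), float("inf"))
        if depth < nearest:              # closer than anything seen so far: keep and shade
            depth_buffer[(x, y)] = depth
            survivors.append((x, y, depth))
        # else: the pixel is occluded, so it is culled and never reaches the shader
    return survivors

frags = [(0, 0, 5.0), (0, 0, 2.0), (0, 0, 9.0)]
print(z_cull(frags, {}))  # the 9.0 fragment is behind the 2.0 one and gets culled
```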

Power Regulation and Other SMD/ICs:
While there are many options for CPU VRM design, on a GPU you only have so much area to work with. On a standard ATX motherboard there is enough room to fit a 24-phase VRM, but on a GPU things are a bit different. This restriction, coupled with the fact that current-day GPUs pull a lot more current than current CPUs, means that the VRM on a GPU needs to pack a heavy punch in a small area. A digital PWM controller is a perfect match here, as it is geared toward clearing up PCB real estate while offering very precise voltage control and user-defined regulation. Coupled with high-quality low RDS(on) MOSFETs, Hi-C capacitors, low-profile inductors, and well-matched drivers, you can pack a lot of punch into a small area. This design is NVIDIA's, and companies like GIGABYTE further refine or redesign the VRM when designing Super Clocked and overclocked versions of these reference GPUs. Today we are looking at a reference design that shares its PCB with the GTX 580; this PCB has been stripped of a few things, such as the extra RAM and some extra VRM components like capacitors and low RDS(on) MOSFETs.


On the left we have a CHiL 8266 6-phase GPU digital PWM controller; on this PCB it can supply up to 6 phases, but for the GTX 570 only 4 phases are implemented. This loss of two phases might take a toll on the sub-zero OCers. The CHiL has a switching frequency range from 250 kHz to 1 MHz and can shift the load onto just a single phase when in low-power mode. On the right we have an APW7088 from Anpec Electronics, a two-phase PWM with fixed frequency and integrated MOSFET drivers. The Anpec is used to power the two phases for the RAM, and the CHiL is used for the GPU. So here we have a 4+2 phase VRM designed to generate little heat at idle and pack a nice punch at load.

In orange we have three Texas Instruments INA219 current-sensing power ICs, which sense the current pulled from the PCI-E slot and the two 6-pin PCI-E PSU connectors; these little ICs are hooked up directly to shunt resistors for power monitoring. Now there is good news and bad news. The good news is that the voltage input can vary between the PCI-E slot and the PCI-E 6-pin connectors, meaning this card can handle a dual-PSU configuration if needed. But the real reason they are here is to help regulate power and let software-driven overcurrent protection limit two programs, OCCT GPU and Furmark. Both programs are known to draw much more current than any game could ever load onto a GPU. Companies such as NVIDIA are tired of receiving burnt-out cards from people overclocking and then using Furmark and OCCT to stability test. There is good news though: this limiting is only done in software, so it should have no adverse effect on overclocking, just extra protection. In a bit I will show you how I got around it for max TDP testing.
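For the curious, this is roughly how a shunt-based sensor such as the INA219 arrives at a power number; the shunt resistance and voltages below are illustrative values I picked for the example, not measurements from this PCB:

```python
# The sensor measures the tiny voltage drop across a known shunt resistor to get
# current (I = V / R), then multiplies by the rail voltage to get power (P = V * I).
def rail_power(shunt_voltage_mv, shunt_resistance_mohm, bus_voltage_v):
    current_a = (shunt_voltage_mv / 1000.0) / (shunt_resistance_mohm / 1000.0)
    return bus_voltage_v * current_a

# Example: a 10 mV drop across a 2 milliohm shunt on a 12 V PCI-E input
# works out to 5 A, or 60 W drawn through that input.
print(rail_power(shunt_voltage_mv=10, shunt_resistance_mohm=2, bus_voltage_v=12.0))
```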

In green we have four CHiL 8510 MOSFET drivers, which control the low RDS(on) MOSFETs highlighted in pink. These drivers are the recommended part for the CHiL 8266 digital PWM, so the system works in harmony.
In red we have the inductors; on motherboards it is more common to see ferrite-core chokes, but they are not low-profile enough for this design. Inductors are basically chokes and are more commonly paired with digital VRM designs.
In yellow you see the VRM for the RAM, consisting of 2 phases.
You are probably wondering where the capacitors are; they are on the back side of the PCB. In keeping with the low-profile design they are all 100% solid, highly conductive polymerized (Hi-C) capacitors. Here you can see ten of them circled in red.


post #2 of 10
Thread Starter 
The memory:
This GTX 570 uses 10 x 128MB Samsung K4G10325FE-HC05 GDDR5 memory chips with a rated speed of 2000 MHz (4000 MHz effective DDR). At stock they are clocked to 3696 MHz effective, so you have a bit of room for overclocking. Each chip sits on a 32-bit channel, so ten of them make up the GTX 570's 320-bit memory bus.
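As a quick back-of-the-envelope check, the bus width and effective data rate translate into peak bandwidth like this (decimal GB, illustrative arithmetic only):

```python
# Ten 32-bit GDDR5 chips form the 320-bit bus; peak bandwidth is simply
# bytes per transfer times effective transfers per second.
def bandwidth_gb_s(bus_width_bits, effective_mhz):
    return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_s(320, 3696))  # ~147.8 GB/s at the stock 3696 MHz effective clock
print(bandwidth_gb_s(320, 4000))  # ~160.0 GB/s if run at the chips' 4000 MHz rating
```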


Last but not least we have the MX BIOS chip. Instructions on how to flash this BIOS are given in the manual. At stock there is a 1.1v limit on the Vcc of the GF110, but NVIDIA has said that up to 1.3v is the max for overclocking, so a modified BIOS is needed to go higher.


Overclocking and Test Setup
Overclocking was done with the stock cooler; ambient temperature was kept low at around 15-20C.
I used NVIDIA Inspector together with GPU-Z (from W1zzard at TechPowerUp) for all of my benchmarking and overclocking. GIGABYTE has GPU overclocking software called OC GURU, but it only works with SOC cards, not stock reference cards.

Please be aware that removing OCP can and will damage your card if not monitored properly; many have damaged their cards this way, so watch out. That is why I did not link to how to remove the OCP. For reviewing purposes I thought it was in order, but for normal use the software OCP won't engage with any game, so do not worry.
NVIDIA Forceware 263.09 drivers
Windows 7 SP1
Intel Core i7 2600K at 5.1 GHz for all benchmark runs
Corsair 2x2GB Dominator @ 1866MHz 9-9-9-24 2T
GIGABYTE P67A-UD7 revision 1.0
Sparkle Gold Series 1KW PSU
WD Raptor 150GB
Two GIGABYTE GTX 570s


Now, stock clocks on this card are 732 MHz core, 1900 MHz memory, and 1464 MHz shader; shader and core clocks are tied together. Vcc, the voltage for the GPU core, has a stock VID of 0.975v; many cards have lower or higher VIDs. Without a BIOS modification the maximum voltage you can push through the core is 1.1v, which is what I used. This limit is there because the stock cooler cannot handle more; NVIDIA has stated the max voltage is 1.3v. Please keep in mind that the RAM on this GPU is rated for 2000 MHz (4000 MHz effective), so you can easily push it higher, but also keep in mind that too much memory OC can make the card fail without the user being aware that the RAM is the culprit.
For Single Card: I was able to achieve 950-core/2211-memory/1899-shader. Please be aware that this card has throttling built into it, so at idle it will report low clocks.
For SLI: I was able to achieve 940-core/2100-memory/1879-shader.
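Expressed as headroom over the stock clocks listed above, those results work out as follows (simple percentage arithmetic, nothing card-specific):

```python
# Overclock headroom relative to the stock clocks quoted above.
def headroom(stock_mhz, oc_mhz):
    return (oc_mhz / stock_mhz - 1) * 100

print(f"Single card core: {headroom(732, 950):.1f}%")    # ~29.8%
print(f"Single card mem:  {headroom(1900, 2211):.1f}%")  # ~16.4%
print(f"SLI core:         {headroom(732, 940):.1f}%")    # ~28.4%
```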
For all benchmarks I ran the single card at stock, the single card OCed as listed above, SLI at stock, and SLI OCed as listed above.
Overclocks were tested by running Unigine, and Furmark was used with OCP disabled through the GPU-Z command line. Power consumption was measured at the socket with Furmark putting full load on the GPU; the processor was at about 25-50% load. More information on how to disable software OCP for Furmark and OCCT can be found here:


Here you can see the OC for a single card with NVIDIA Inspector:

Here you can see the OC for SLI with GPU-Z monitoring:



3DMark11 SLI FIX:

3DMark11 has a problem where it doesn't recognize Fermi SLI. Futuremark says the blame lies with NVIDIA and that they need to update their drivers. Either way, 3DMark11 is a very new program and the GTX 570 is a very new card, but this SLI problem seems to extend to all Fermi cards, not just the NVIDIA 500 series.
Here is the fix:
#1 Run NVIDIA Inspector as administrator by selecting "Run as administrator"
#2 Click the driver button highlighted in red
#3 Change the line highlighted in green from 0x00000000 to 0x080000F5
SLI should now work and your score in 3DMark11 should increase a LOT. I have tested this and it affects no other program but 3DMark11.

Here is the result:


Benchmarking!
As a preface to the scores I am about to show you, I want to state a few things. First, I used a single Dell 24" widescreen monitor for all testing. A resolution of 1920x1080 was the maximum I had in the house (2560x1600 testing coming soon!). Needless to say, not many people have 2560x1600 monitors, so for these initial benchmarks I used standard resolutions. Resolutions of 1680x1050 and 1280x1024 were also used because not everyone has a 24" monitor. 3DMark was run on the performance presets, GPU and CPU tests only. PhysX was left on; I understand that to submit 3DMark scores to HWBot you have to disable PhysX, but there was no difference with PhysX disabled in 3DMark11, so I just left it on for all tests.

Every test was run a minimum of 5 times. Other than the Unigine and 3DMark tests, FRAPS was run for a timed 5 minutes to record an average frames per second. Even if a game has an internal benchmark I did not use it; Mafia II has an internal benchmark in its demo, as do HAWX and Metro 2033, but I feel those benchmarks are not a correct representation of the final product or of the actual FPS. The same level was used on a game-by-game basis; I picked one with a lot of activity.
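For clarity, the averaging is nothing more exotic than this; the FPS values in the example are made up and the helper is hypothetical, not a script used for the review:

```python
# Each game result below is the mean of at least five 5-minute FRAPS runs.
def average_fps(runs):
    """runs: list of average-FPS values reported by FRAPS, one per run."""
    if len(runs) < 5:
        raise ValueError("need at least five runs per configuration")
    return sum(runs) / len(runs)

print(average_fps([88.2, 87.5, 89.0, 88.7, 88.1]))  # example numbers, not real results
```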


To start our benchmarking results off I chose 3DMark Vantage:
(Vantage was run at the performance preset with PhysX left ON. HWBot requires PhysX to be uninstalled, but I found it had more of an impact on the CPU score than the GPU score; to keep everything constant, and because most OCers and gamers will NOT disable/uninstall PhysX, I left it on for all tests.)


Next we move on to 3DMark 11. This new benchmark incorporates DirectX 11, which is one of the new advantages the GTX 570 has; many games are starting to use DX11. 3DMark11 tests were run on the performance preset.


As you can see in both Futuremark programs, SLI scaling is not 200% but closer to 180-190%, which is still very good. Overclocked, these cards increased performance to about 120-125% in both programs.
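For anyone unsure how to read those percentages, they are simply the new score expressed against the single-card stock score; the scores in this snippet are hypothetical, the real ones are in the charts:

```python
# Scaling is the SLI or overclocked score as a percentage of the single-card stock score.
def scaling(baseline_score, new_score):
    return new_score / baseline_score * 100

# Hypothetical scores, for illustration only:
print(f"SLI scaling: {scaling(6000, 11100):.0f}%")  # 185%, i.e. in the 180-190% range above
print(f"OC scaling:  {scaling(6000, 7380):.0f}%")   # 123%, i.e. in the 120-125% range above
```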


I personally feel that Unigine is an excellent benchmarking program as well as a stability program. Even if my overclock passed 3DMark, Unigine would crash if there was some problem. It is faster to use than Furmark or OCCT, plus those programs can damage your GPU, and it's actually very nice to watch. I used Unigine's internal benchmark, run with DX11, shaders high, antialiasing off, and tessellation normal.


With Unigine you see some very nice scaling, with SLI at about 180%. The overclocking improvement is about 120%.

Next we move on to real-world games. First up is Call of Duty: Modern Warfare 2, the level where you fly into the city to rescue a platoon before the nuke goes off. All settings were at their highest, antialiasing 4x, and number of corpses high. This game uses DirectX 9.


Here we see SLI improvements of 165% and overclocking improvements of 110-115%. Real-world testing does show slightly lower results for both SLI scaling and overclocking, but this is real life, not a synthetic benchmark; this is what I expect to see. These results are still very nice.


Dirt 2 is a popular game. DirectX 11 support was added later, and while the game was not built around DirectX 11, it was coded to use it, so we can treat it as a DirectX 11 game. A raid race was performed with MultiSampling at 4xMSAA and all settings at medium.


Here the GPU is taxed much more than in CODMW2, which is DirectX 9, but the GPU keeps up; the FPS are pretty incredible for a DirectX 11 game. SLI scaling is very low, around 120-130%, and I think that is because the drivers I used are very new. NVIDIA just released drivers that greatly improve SLI performance in many games. Overclocking brings performance up to about 110-115%, which seems to be the norm.

Now we move on to Metro 2033, a game that takes place in the near future when humans live underground and fight mutants. It is pretty scary, to say the least. The game has very nice graphics and has taxed GPUs so much in the past that it is known as an FPS killer. It was run at High settings, AAA, AF 4x, and DX11.


This game is a hard hitter on a single card; 64 FPS overclocked on a top-of-the-line card is very good in this game, even though in other games it would be terrible. This DirectX 11 monster really knows how to tax a GPU. What is really wonderful is the SLI scaling; this game knows how to take advantage of SLI. We see a 150% increase in SLI and a 115% increase from overclocking. While that might not sound like a lot, it's a very good margin for SLI.


Power Consumption:
Now for power consumption: we have a card with a TDP of 219 watts. For all tests the OCP was disabled through the GPU-Z command line, and Furmark was run, which also loaded the CPU. Keep in mind that the CPU was not loaded anywhere close to 100%, more like 25-50%.
The highest number seen on the power meter was recorded for both idle and load. This is power pulled from the socket, so you need to factor in 80-90% efficiency for the 80 Plus Gold certified PSU, and also account for the power drawn by the fans, the hard drive, and the CPU.
From my review of the UD7 and from some extra testing, I have found the CPU wattage under the load Furmark puts on it to be about 100 watts.
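Putting that method in one place: wall power times PSU efficiency, minus the CPU and the rest of the system, gives the GPU figures listed below. The 400 W wall reading and the 15 W allowance for fans and drives are hypothetical examples for illustration, not my actual meter readings:

```python
# Derive GPU power from a wall-socket reading, assuming the 85% PSU efficiency
# and ~100 W CPU load quoted above; the 15 W "other" figure is a guess for fans/drives.
def gpu_power(wall_watts, psu_efficiency=0.85, cpu_watts=100, other_watts=15):
    # Power actually delivered by the PSU, minus what the CPU, fans and drives take.
    return wall_watts * psu_efficiency - cpu_watts - other_watts

print(gpu_power(400))  # a hypothetical 400 W wall reading -> ~225 W attributed to the GPU
```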



Calculated Power consumption of GPU:
Single Stock: 224.8 watts (500watt PSU recommended)
Single OC: 280 watts (550 watt PSU recommended)
SLI Stock: 426.4 watts (800 watt PSU recommended)
SLI OC: 576 watts (1000watt PSU recommended)

Conclusion
I sum a product up in five categories: Performance, Functionality, Overclocking, Value, and Appeal.

I once had a professor back in college who gave us a written exam, and I aced everything on it, I mean aced it. When I got my grade back he had given me a 90%, and when I confronted him he gave me such a crazy answer that I just accepted it and took my A. He said, "Yes, you got everything right, but 100% is for God, 95% is for me, and 90% is for you." What a nut, but I am not too far off: no 10 out of 10 for any manufacturer, because in reality nothing is perfect. I will give a 9.9 though.
Scale of 1-10, I don’t give 10s.

Performance:
This GPU is very powerful; while its power consumption has been reduced, it will still pull more power than your overclocked CPU. Performance in real-world and synthetic benchmarks is excellent, to say the least. This GPU is packed with enough power to run any game on the market right now with all settings set to high while maintaining above 40-50 FPS in the harshest DX11 games, 50-150 FPS in the majority of DX11 games, and above that in DX10 and DX9 games. What does this mean? This card's performance is excellent. Toss a second one in for SLI and you will see massive FPS improvements in games and benchmarks. A single GTX 580 is better. Score: 9.8

Functionality:
Personally, I feel video cards serve one of three purposes: either you game and love great graphics and high frames per second, you're a bencher and need a card (or cards) to complement your excellent CPU, or you are a folder and need a GPU (or GPUs) to do processing. Either way, the Fermi architecture takes care of all of you. At the expense of a decent PSU, this card serves its purpose well. Score: 9.7


Overclocking:
All I have to say is wow. So far there aren't too many GTX 570s out there, but after going through forums, these GIGABYTE GTX 570s are overclocking wonderfully. Overclocking was very simple, and other than OCP limiting OCCT and Furmark (the two most popular stability tests), everything went great. You can disable the OCP, but beware that you can fry your chip, so just use Unigine for stability testing; I can attest it does a great job. Sadly GIGABYTE didn't include overclocking software or a version of OC Guru for this card, but that is fine as there are so many overclocking programs for NVIDIA cards. Super Over Clocked (SOC) cards will probably do much better, but these cards did excellently on the stock cooler. Sure, you can overclock a 480 to the same speeds as a 570, but you can most likely overclock the GTX 570 further because of its higher default clocks and lower TDP. Of course, there are also factory-overclocked versions of this card. I should also mention that both cards hit the same top speed, and in SLI both hit very close to the max single-card speed, which is excellent SLI overclocking. Score: 9.5

Value:
With a retail price of $350 you either love it or hate it. Right now a GTX 480 is priced at around $400, though of course you can always buy used for cheaper. At $350, most gamers, folders, and benchers would say this card is very well priced. Its high clocks really allow it to outshine the competition; low price, high clocks, the same features, and new tech really make this a steal. If you are a budget-wise gamer, bencher, or folder then this might not be the card for you, and you aren't alone: $350 is a lot to spend on a single part of a computer, but when it's the heart of the system it is more reasonable. This card is the replacement for the prior flagship GTX 480, so the price is going to be high. If you can afford two, it's a pretty good price. Score: 9.5

Appeal:
The extremely high overclocks of this card really make it shine in my eyes, but appeal is more about physical appearance and features. This card is sleeker than the GTX 480, without any exposed heat pipes, and NVIDIA also claims it runs cooler and quieter. The card's color and sleek look make it pretty much a standard-looking GPU. Other than outputting heat like a space heater at full load, this card runs pretty cool and consumes very little power at idle thanks to its auto down-clocking. If you need extra heat during the winter, two of these will heat up a room (I know), so maybe you can turn your heater on less (just kidding). At 100% load the card was actually not that loud and didn't bother me at all; only while I was overclocking and manually raised the fan speed did it start to become a problem. Compared to its counterpart, this GPU outputs 20% less heat, offers the same features and technology, and has higher default speeds. That is pretty attractive for a $50 cheaper price tag. The GIGABYTE accessories fit this card well; with the D-Sub adapter, mini-HDMI to HDMI cable, and PSU adapters, this card comes ready to go. Score: 9.7

Total Score: 9.64. This card is hands down a winner at a great price.

I would like to thank all those at GIGABYTE who made this review and others possible. Thank YOU!!!!


post #3 of 10
As always, a great write up; you're a detail man I've come to realize and that's what I like about your reviews/guides. +rep

I would like to say one thing about the OCP. I think you should state/warn about the potential risks that disabling the OCP can have on a card. As I'm sure you have read also, it seems that the disabling of the OCP can potentially lead to a fried 570. I just wanted to point this out for users that have not heard of the problem. Though, if they didn't look into it at all then I don't think they should be disabling any power limitations to begin with.
post #4 of 10
Thread Starter 
Wait, I didn't warn against it!??? Shoot, let me do that, thanks for reminding me man!

I appreciate your kind words. It's my first GPU review.
post #5 of 10
Great review sin, I love my 570, can't wait to get another.

+1
 
post #6 of 10
Nice review, very detailed as always. I happen to have one of these, and only when I got a 2nd 570 did I notice that the Gigabyte comes in at higher stock clocks than my other card.
post #7 of 10
Thread Starter 
Are you sure it does? This is a reference card.
post #8 of 10
Quote:
Originally Posted by Sin0822 View Post
are you sure it does? this is a reference card.
Now you got me questioning myself. I bought the vanilla reference card. The cheapest newegg had at the time. But then I flashed the original bios to allow more volts, then it turns out the bios wasn't so great and the card didn't work in my classified but worked in every other mb I own. I thought that I used the original bios to flash it back, but I might be wrong.

Looking at stock core is around 732, mine is around 797mhz. Maybe I used a different bios, but it still shows the vendor to be Nvidia, just as my original bios showed.
post #9 of 10
Quote:
Originally Posted by Mikecdm View Post
Now you got me questioning myself. I bought the vanilla reference card. The cheapest newegg had at the time. But then I flashed the original bios to allow more volts, then it turns out the bios wasn't so great and the card didn't work in my classified but worked in every other mb I own. I thought that I used the original bios to flash it back, but I might be wrong.

Looking at stock core is around 732, mine is around 797mhz. Maybe I used a different bios, but it still shows the vendor to be Nvidia, just as my original bios showed.
Yeah, I'm pretty sure you used the EVGA 570 SC BIOS; that's the only card that comes stock at 797MHz. The reference Gigabyte comes in at the standard 732.
post #10 of 10
Thread Starter 
Yeah, that is what I thought, lol. Stock cards are stock cards; this is just a reference design card with a GB BIOS.