Overclock.net - An Overclocking Community - Reply to Topic

Thread: [AnandTech] The NVIDIA GeForce GTX 1660 Ti Review
  Topic Review (Newest First)
03-17-2019 01:21 PM
AlphaC https://www.servethehome.com/evga-ge...lack-review/3/


Bunch of GPU compute and pro workloads
03-06-2019 09:23 PM
Majin SSJ Eric
Quote: Originally Posted by 1Kaz View Post
As a salty 970 owner, I just want to correct one statement here. Nvidia flat out lied about the 970. It was sold as having 2048 KB of L2 cache and 64 ROPs. It DOES affect performance, as only 3.5 GB can be accessed at once, while the other 0.5 GB is significantly slower and cannot be accessed at the same time as the 3.5 GB. People always point to the RAM and say it only had 3.5 GB. You're correct, it has 4 GB, but because Nvidia skimped on performance elsewhere, which they lied about, the card is frequently referenced as only having 3.5 GB.

https://www.pcper.com/reviews/Graphi...ations-GTX-970
It's kinda like that time when Nvidia offered the 1060 in both 3GB and 6GB variations yet quietly stuck a lesser GPU into the 3GB version while marketing it simply as a 3GB 1060. That's the sort of thing that Defoler will defend each and every time it's ever brought up (just wait and see, I bet he quotes this post)!
03-06-2019 07:31 PM
1Kaz
Quote: Originally Posted by Defoler View Post
Let us remember that the 970 did have 4GB, and it was used (by the firmware). If you are trying to turn this into an "oh, I caught you" moment, at least be accurate and less sleazy about it.
Calling it UNUSABLE in uppercase is like saying "I bought Titans, but you are stupid if you buy Titans".

There is a difference between marketing something that is false (hence a lie) and something that sheds better light on a product. AMD claimed, not mistakenly spoke, that the Vega 56 price was going to drop to $280 permanently, and only after reviews came out said it was only temporary. That is a lie, intended not to shed better light on their product but to intentionally influence review bottom lines. Even the 4GB fiasco from Nvidia with the 970 did not change the bottom line of those cards' performance. Saying it had 4GB vs 3.5GB did not magically make the card perform faster.
As a salty 970 owner, I just want to correct one statement here. Nvidia flat out lied about the 970. It was sold as having 2048 KB of L2 cache and 64 ROPs. It DOES affect performance, as only 3.5 GB can be accessed at once, while the other 0.5 GB is significantly slower and cannot be accessed at the same time as the 3.5 GB. People always point to the RAM and say it only had 3.5 GB. You're correct, it has 4 GB, but because Nvidia skimped on performance elsewhere, which they lied about, the card is frequently referenced as only having 3.5 GB.

https://www.pcper.com/reviews/Graphi...ations-GTX-970
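The spec discrepancy described above can be sketched with a few lines of Python. The numbers are drawn from the figures commonly reported for the 970 (e.g. in the linked PCPer article); the bandwidth values are approximate and included only for illustration.

```python
# Illustrative comparison of the GTX 970's advertised vs. actual specs,
# and the segmented-memory behavior described above (approximate figures).

ADVERTISED = {"rops": 64, "l2_kb": 2048}
ACTUAL     = {"rops": 56, "l2_kb": 1792}

# The 4 GB of VRAM is split into two segments that cannot be read at once:
FAST_SEGMENT_GB = 3.5    # full-speed partition
SLOW_SEGMENT_GB = 0.5    # much slower partition
FAST_BW_GBPS = 196.0     # approx. bandwidth of the 3.5 GB segment
SLOW_BW_GBPS = 28.0      # approx. bandwidth of the 0.5 GB segment

def shortfall_pct(key):
    """Percent reduction of an actual spec vs. what was advertised."""
    return round(100 * (1 - ACTUAL[key] / ADVERTISED[key]), 1)

print({k: shortfall_pct(k) for k in ADVERTISED})
# → {'rops': 12.5, 'l2_kb': 12.5}  (both ~12.5% below the box spec)

print(f"slow segment: {SLOW_BW_GBPS / FAST_BW_GBPS:.0%} of fast-segment bandwidth")
```

The point the sketch makes is the same as the post: the VRAM total was technically honest, but the ROP and L2 figures were not, and the slow 0.5 GB segment is why the card is remembered as "3.5 GB".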
03-06-2019 10:04 AM
Slaughtahouse
Quote: Originally Posted by ilmazzo View Post
Efficiency only counts for datacenters; every other category just comes down to price or preference
Any large array of GPUs benefits from efficiency; facility management will always consider operational costs. Render farms, research facilities, shady bitcoin operations... etc.

For most users with a single GPU setup, the differences are negligible. However, as previously mentioned, for users in extreme climates or where utility rates are expensive, it's still worth considering. Wattage is not only expressed as cost per kWh, but also as the thermal/cooling load applied to the built environment. That heat has to go somewhere.
03-06-2019 06:02 AM
ilmazzo Efficiency only counts for datacenters; every other category just comes down to price or preference
03-05-2019 10:29 PM
Majin SSJ Eric A cost difference of $40-60 over 3 years is negligible. Even the proposed additional cost of $131 over a year's time works out to a little over $10 per month, which is again negligible. The argument of choosing more efficient video cards to save money on your power bill has always been a specious one as even large wattage differences between cards don't really amount to much actual savings on a typical power bill, and that's even assuming a 24/7 workload (which is not nearly representative of average PC usage).
03-05-2019 08:52 PM
AlphaC
Quote: Originally Posted by maltamonk View Post
Ok, so at what price does all this matter? We could always take the price difference of some cards and apply that to cooling. Does 100W of heat = $100?
Depends on climate and price per kWh.


If you use a card 4 hours a day and it uses 100W more on average at the wall, that's 0.4 kWh/day, which adds up to around $40-60 over the warrantied lifespan of the card (3 years). In a warm climate with air conditioning it's doubled, since you need to pump that heat out; in a cool climate where you need heat it isn't as big a deal, since the card is essentially an electric heater.



If you're folding on your cards 24 hours a day there's a much bigger cost, since it's 876 kWh/year. At $0.15/kWh that's easily $131 a year, not including taxes / supply charges / etc.
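The arithmetic above can be sketched in a few lines of Python, using the same assumptions the post makes (an extra 100W at the wall, $0.15/kWh, taxes and supply charges excluded):

```python
# Back-of-the-envelope electricity cost for an extra 100 W of GPU draw,
# at the rate assumed above ($0.15/kWh; taxes and supply charges excluded).

EXTRA_WATTS = 100
RATE_PER_KWH = 0.15

def yearly_cost(hours_per_day):
    """Extra dollars per year for EXTRA_WATTS of additional draw."""
    kwh_per_year = EXTRA_WATTS / 1000 * hours_per_day * 365
    return kwh_per_year * RATE_PER_KWH

# Gaming 4 hours/day over a 3-year warranty:
print(f"${3 * yearly_cost(4):.2f} over 3 years")   # → $65.70

# Folding 24/7:
print(f"${yearly_cost(24):.2f} per year")          # → $131.40
```

The 24/7 case reproduces the 876 kWh/year and ~$131 figures; the 4 hours/day case lands in the $40-60 range once you pick a lower rate or shorter usage, which is where that spread comes from.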
03-05-2019 11:38 AM
Slaughtahouse
Quote: Originally Posted by maltamonk View Post
Ok, so at what price does all this matter? We could always take the price difference of some cards and apply that to cooling. Does 100W of heat = $100?
It's subjective. I cannot define that for anyone.

Simply put, I am trying to compact my rig. My end goal is to get this system down into a small custom case I can build to support my existing 360mm radiator. Wherever I can maximize efficiency (cut power consumption), I will.

Is it the best value? Of course not. AMD is the clear answer for best value, leading the charts in $/frame.
03-05-2019 10:15 AM
maltamonk
Quote: Originally Posted by Slaughtahouse View Post
Water blocks are hard to come by for these mid-range cards, and I am not willing to invest in a GPU with 6GB of VRAM at this time. Do most people water block mid-range cards? No, and I understand why. However, I want a silent loop, and it's just a personal preference.

The RX 580 would be the best option, given its compute performance. It's honestly a great value. However, there are few blocks for it, and availability in Canada is especially low.

https://www.dazmode.com/store/produc...s%2Bamd-cards/ One of the few local stores here.

The GTX 1070 would be the Nvidia equivalent, and that opens up the door for GTX 1070 / 1080 water blocks. As well, the 1070 is about 25% faster in most DX11 titles, so there is also that incentive. Not that I game much.

Not to mention it still uses less power than the RX 580, which will give me more loop headroom for my CPU (3930K). If I had more radiator space, I probably wouldn't care. But I am on a single 360mm/32mm radiator. My previous GPU was about 250W (GTX 780) and temps in my loop were "OK", but I didn't have headroom to OC. If I can slash 100W from my loop, I am a happy camper.
Ok, so at what price does all this matter? We could always take the price difference of some cards and apply that to cooling. Does 100W of heat = $100?
03-05-2019 06:48 AM
Slaughtahouse
Quote: Originally Posted by maltamonk View Post
So here's my question to you. At what point does price come into your equation? If a 580 is enough for your needs, then surely a 1060 is as well. Seeing that a 1060 uses around the same power (actually a bit less) as the 1660 Ti, how does the now $100 price difference apply?
Water blocks are hard to come by for these mid-range cards, and I am not willing to invest in a GPU with 6GB of VRAM at this time. Do most people water block mid-range cards? No, and I understand why. However, I want a silent loop, and it's just a personal preference.

The RX 580 would be the best option, given its compute performance. It's honestly a great value. However, there are few blocks for it, and availability in Canada is especially low.

https://www.dazmode.com/store/produc...s%2Bamd-cards/ One of the few local stores here.

The GTX 1070 would be the Nvidia equivalent, and that opens up the door for GTX 1070 / 1080 water blocks. As well, the 1070 is about 25% faster in most DX11 titles, so there is also that incentive. Not that I game much.

Not to mention it still uses less power than the RX 580, which will give me more loop headroom for my CPU (3930K). If I had more radiator space, I probably wouldn't care. But I am on a single 360mm/32mm radiator. My previous GPU was about 250W (GTX 780) and temps in my loop were "OK", but I didn't have headroom to OC. If I can slash 100W from my loop, I am a happy camper.