Originally Posted by Klocek001
lol how so? CF 480X is around $500, needs a 650W PSU, and you just saw only three games out of 13 fairly new ones get +60% scaling. That leaves 10 which don't scale to 60%, the majority of which see either negative or very miserable scaling.
I read the TPU review too: 480 CF loses to a 1070 in 9 out of 16 games, while in most of the others it has a very tiny lead, except for BO3. One game out of 16 is what would give your claims about the performance and performance/price of CF 480 any grounds.
It's 16% faster on average without the non-scaling games, and that's mostly BO3 keeping the average from looking entirely embarrassing. That comes with no overclocking room for RX 480 CF due to temperatures, unless you want to pay $550 for AIB cards which will let you OC by 3-4%.
And when did I say it's a good thing? Point me to the exact words I said. I said it's a feature that's missing on the 1060, but the RX 480 having it doesn't actually make it a big disadvantage for the 1060, since CF in new games is very bad. You've got reading comprehension issues.
I agree that one 1070 would be better than 2x 480s, just to avoid the hassle/bugs that come with CF/SLI.
You don't need a 650W PSU though, and they do overclock by more than 3%.
1266MHz is the boost clock; +10% would be 1392MHz.
Most 480s hover around that; overclocks are around 10%.
I know, a shocker... around the same as NV cards.
1266MHz is the max boost clock, and everything on top of that is the OC.
With Nvidia cards, people seem to look at the minimum boost and base their OC on that, while in fact the max boost is a lot higher.
Why the different standards?
Because I'm lazy, I'll quote an older post of mine here: "How come that every time we talk about OC on AMD cards, people look at the out-of-the-box base clock including max boost speed and how much you can OC that, instead of looking at the reference base clock without boost speed like everyone does with Nvidia?
Basically everyone says a 980 Ti can OC by 30-40%, because there they do look at the reference base clock without boost.
Yet my 980 Ti boosts to 1392MHz out of the box and can only OC to 1450MHz.
Now people do this with the 1070/1080 too, but AMD overclocks can still only be calculated from the out-of-the-box max boost speed? (Factory OC included, which you can't include when it's an Nvidia card.)
A 1080 can boost to like 1890MHz stock, yet we look at OCs from 1733MHz or even the base of 1600MHz; on average they run 2050MHz overclocked.
1890MHz to 2050MHz is an amazing overclock all of a sudden?
It's only about 7-8%.
The RX 480 base clock is 1120MHz, so why don't we calculate OC percentages based on that, like everyone does with Nvidia cards? (Not that I think you should; you should look at the max boost clock IMO.)
Boost out of the box is 1266MHz.
1266MHz to 1390MHz is how many %?
More than the 7-8% Nvidia cards do!
By the weird standards people are using by default, AMD cards will always be poor overclockers.
Of course everyone will say I'm wrong, but when I use the same method with Nvidia cards people say I'm wrong too (and wrong again when I use the Nvidia so-called method on AMD cards).
Why can't OC percentages be calculated the same way, no matter if it's Nvidia or AMD?"
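The arithmetic of that argument is easy to check. A quick sketch (clock figures are the ones quoted in this thread; `oc_percent` is just an illustrative helper, not from any review):

```python
# The same overclock looks very different depending on which
# baseline clock you divide by.

def oc_percent(baseline_mhz: float, oc_mhz: float) -> float:
    """Overclock gain as a percentage of the chosen baseline clock."""
    return (oc_mhz - baseline_mhz) / baseline_mhz * 100

# GTX 1080: ~1600MHz base, ~1890MHz out-of-box boost, ~2050MHz overclocked.
print(f"1080 vs base clock:        {oc_percent(1600, 2050):.1f}%")  # looks huge
print(f"1080 vs out-of-box boost:  {oc_percent(1890, 2050):.1f}%")  # ~8.5%

# RX 480: 1120MHz base, 1266MHz boost, ~1390MHz overclocked.
print(f"480 vs base clock:         {oc_percent(1120, 1390):.1f}%")
print(f"480 vs out-of-box boost:   {oc_percent(1266, 1390):.1f}%")  # ~9.8%
```

Measured against the out-of-box boost on both sides, the 480 actually gains slightly more than the 1080; measured against base clock on both sides, both look like 20-30% overclockers. The asymmetry only appears when the baselines are mixed.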