I think you may be mistaken. I was thinking this too, but the numbers out there are total SYSTEM power draw. I saw numbers of over 300 W and thought to myself, "Wow, this is the 290/290X all over again." But that doesn't make any sense, because even with an 8-pin connector the card should only draw about 225 W max (150 W from the connector plus 75 W from the PCIe slot), though many power supplies can easily deliver more than the rated figure.
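To make that 225 W figure concrete, here's a quick sanity-check sketch using the PCIe spec limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin). The function name and structure are just mine for illustration:

```python
# Rated PCIe power limits per source (spec figures, not measured draw).
CONNECTOR_LIMITS_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def spec_power_budget(connectors):
    """Sum the rated limits for a card's listed power sources."""
    return sum(CONNECTOR_LIMITS_W[c] for c in connectors)

# A card fed by the slot plus one 8-pin: 75 W + 150 W = 225 W by spec.
print(spec_power_budget(["slot", "8-pin"]))  # → 225
```

So a single-8-pin card pulling 300+ W would be way out of spec, which is why those numbers only make sense as whole-system readings taken at the wall.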
TBH, the power draw really doesn't matter except for people buying prebuilts (which often have bad power supplies with only a single 6-pin cable) or if your card is a furnace like the 290/390s were. It's nice that the 1060 runs cooler, but 175 W vs. 120 W is meh ... your room is not going to get that much hotter. It's not as drastic as the difference between the 970 and the 290.
And how exactly do you expect the display driver (from which the software reads the power draw) to know the total system power consumption? The figure displayed by GPU-Z is just for the GPU itself (not even for the whole card) and nothing else.
The reason the RX 480 is so inefficient, and shows colossal increases in power usage when pushed even further, is that AMD pushed it too far beyond the process's capabilities right out of the box. They probably greatly underestimated nVidia's Pascal, and once the actual figures were out they were forced to ramp up the clocks (destroying the efficiency) to be even remotely competitive performance-wise. According to certain press members who attended AMD's event in Macao, the default clocks were not disclosed at the time. Many of them say it felt like AMD hadn't really decided them at that point.
Now imagine if the GPU provided the same performance at 900 MHz that it now does at 1266 MHz. It would be a 90 W card at most (instead of 150 W).
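A rough back-of-the-envelope for why lower clocks help so much: dynamic power scales roughly with frequency times voltage squared (P ≈ C·f·V²), and lower clocks also allow lower voltage. The 10% voltage reduction below is a hypothetical number I picked for illustration; real V/f curves vary per chip:

```python
def scaled_power(p_base, f_base, f_new, v_ratio=1.0):
    """Scale dynamic power linearly with clock and quadratically with voltage."""
    return p_base * (f_new / f_base) * v_ratio ** 2

# 150 W at 1266 MHz, dropped to 900 MHz:
print(round(scaled_power(150, 1266, 900)))        # clock-only: ~107 W
print(round(scaled_power(150, 1266, 900, 0.9)))   # with a -10% voltage: ~86 W
```

So even a modest undervolt on top of the clock reduction would plausibly land the card under 90 W, which is the point: the last few hundred MHz are what wreck the efficiency.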