Originally Posted by Redeemer
Yeah, and Tegra was supposed to outperform the Core 2 Duo...
The last 3 or 4 generations alone have apparently matched, beaten and then once again matched the previous generation of consoles, too. Of those, I'd put the K1 as the only SoC that's truthfully faster than an Xbox 360 or PS3...
Originally Posted by fateswarm
OK it can beat it on cherry-picked benchmarks.
>complains about cherry-picked benchmarks
>uses a synthetic benchmark whose results clearly disagree with commonly known performance numbers (e.g. 780 > Titan? Really?)
I'm looking at getting either a Radeon R9 290 or a GTX 780 (or was; I'm thinking I'll hold out until Maxwell and then buy, especially if FO4 is announced between now and then), and they generally compete in most benchmarks, with the 780 being ever so slightly faster overall. The 290X competes with the Ti, but gets beaten when you compare OC to OC. In all honesty, for straight performance it's a near reversal of last generation: this generation is relatively equal, with nVidia gaining a slight lead by OCing better, whereas last gen it was the AMD cards that pulled ahead through better OCing, although admittedly by a larger margin.
Not to mention, going by Newegg prices the 780 is a bit more expensive than the R9 290; I know it's around $100 extra here in Australia. IMO, while it's a little faster, that's out-valued by the price increase, although this is the first time in generations I've been able to consider a new nVidia card from a price/performance standpoint. Typically you'd pay more than $100 extra for equal or ever so slightly better performance, or the same money for worse performance.

Take it from someone who has been reading everything they can comparing Hawaii and GK110: if you think it's a cut-and-dried difference, that's down to you specifically and the games you play/features you want to use/whatever. Sometimes Hawaii is faster within similar price points, sometimes GK110 is. Sometimes GK110 has features you want that Hawaii lacks, sometimes it doesn't, or even vice versa (e.g. Mantle, TrueAudio, PhysX, CUDA, etc, if you use any of those). In all honesty, this is one of the most balanced generations I care to remember, only really beaten by the GeForce 6x00/Radeon X800 generation. (Generally, the Radeons were faster, but they were only DX9.0b; the GeForces were a tad slower overall but more future-proof with DX9.0c. History proved the GeForce right in that particular generation, given that I was happily gaming on a 6800GS until 2007. An X850 Pro or the like would have shown its age a lot earlier.)
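To put that value argument in rough numbers, here's a quick Python sketch. The prices and the ~5% performance gap are hypothetical placeholders I've picked to match the gap described above, not quotes from any retailer:

```python
# Napkin price/performance numbers. The prices and the ~5% performance
# gap are hypothetical placeholders, not real retailer quotes.
cards = [
    ("R9 290",  400, 1.00),  # hypothetical baseline price and performance
    ("GTX 780", 500, 1.05),  # hypothetical: ~$100 more for ~5% more performance
]

for name, price, rel_perf in cards:
    print(f"{name}: {rel_perf / price * 1000:.2f} perf-units per $1000")
```

With those placeholder numbers the 290 lands at 2.50 perf-units per $1000 against the 780's 2.10, which is the "out-valued by the price increase" argument in a nutshell.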
Originally Posted by Imouto
Nvidia claimed nothing. I did the math there and may be wrong, but I'm pretty confident about it. In fact, I was aiming pretty low. The GTX 670 (which isn't a full GK104) at 294mm^2 is 183% faster (2.8x) than the GTX 650 at 118mm^2 (GK107, the segment the GM107 is replacing), and the GTX 750 Ti is 2.4x the performance of a GTX 650 with just a 25% larger die.
If the GM104's jump in performance per square mm is similar to Kepler's going from the 107 to the 104 chip, you may see a huge gain if it's ~300mm^2 again: enough to beat the crap out of the GTX 580 and get close to the GTX 780 Ti, which is 2.5x the performance of a GTX 750 Ti.
Remember that I'm always talking about Cycles performance, nothing else.
One thing I probably should note more often is that those numbers are very rough. Take Pitcairn vs Tahiti, for example: Tahiti is quite a bit bigger, uses quite a bit more power and puts out quite a bit more heat, but this doesn't translate into a big performance difference. A noticeable one, worth the price of a 280/280X over a 270/270X? IMO, yes (I say this as someone who owns a HD7950 and has easy access to a HD7850 TwinFrozr), but the gap is smaller than the difference in die size, power consumption and heat output would suggest. I can't speak to much beyond gaming and mining in that scenario, given that I prefer not to fold on my HD7950; it simply makes my room too warm.
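If anyone wants to sanity-check that kind of napkin math, here's a minimal Python sketch using the rough die sizes and relative performance figures quoted above (treat them all as ballpark, per the caveat, and the 148mm^2 GM107 figure is just 118mm^2 plus the ~25% mentioned):

```python
# Rough perf-per-mm^2 check using the ballpark figures quoted above.
# rel_perf is performance relative to a GTX 650 (= 1.0); all values approximate.
cards = [
    ("GTX 650 (GK107)",     118, 1.0),
    ("GTX 670 (cut GK104)", 294, 2.8),   # "183% faster", as quoted
    ("GTX 750 Ti (GM107)",  148, 2.4),   # ~25% larger die than GK107
]

base = 1.0 / 118  # the GTX 650's perf per mm^2, used as the baseline

for name, die_mm2, rel_perf in cards:
    ratio = (rel_perf / die_mm2) / base
    print(f"{name}: {die_mm2} mm^2, {rel_perf:.1f}x perf, "
          f"{ratio:.2f}x the perf/mm^2 of a GTX 650")
```

Run that and the GTX 670 comes out at only ~1.12x the perf/mm^2 of a GTX 650 while the 750 Ti lands at ~1.91x, which is exactly why a ~300mm^2 GM104 looks so promising on paper.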
Originally Posted by DVLux
I think that's the point. Besides, what consumer really needs six monitors? Probably only the people on OCN. XD
Heh, you'd be surprised: I set my girlfriend up with a second 1280x1024 screen at her request (she saw how much I love rocking the 1080p/900p combo) and she's using it a lot. My Mum's desktop died and she's now using the screen that came with it alongside her laptop, too. Teach users how to do it and they'll generally like it, even if it merely means leaving one screen for movies, chat, etc, and the other for whatever they're concentrating on.

I'd prefer DP over DVI/HDMI/VGA simply because it's easy to find an adapter from DP to anything, and DP is superior to the rest of them, especially given that it already supports 4K at 60Hz while HDMI 2.0 (the version that can also do that, as opposed to maxing out at 4K/30Hz) is still barely used. Personally, I'm starting to think about buying a 1440p screen to use as my main one, with the 1080p one used to watch movies while I play Civ V and wait for turns to progress/players to finish their turns.
You can, however, just run the extra screens off the iGPU if you have an APU, rather than driving more than a few screens from the one card. Not many people would want six-screen Eyefinity, and even an Intel iGPU can handle the kinds of tasks most people dedicate to their non-main screens. I actually prefer that setup overall, given that YouTube can sometimes stutter on my second screen when I'm playing certain GPU-bottlenecked games.
Originally Posted by fateswarm
I meant people may have all the screen real estate they can handle. Their neck can't go further. Their eyes are comfortable.
They use windows.
I find it amusing that people often imply each monitor must have one window.
I find it amusing that you're completely ignoring the fact that changing windows often, etc, means you'll be spending a lot of your time simply making sure each window is visible... Never mind the fact that some people likely couldn't afford to have their windows that small. I'd personally struggle to use more than 3 screens effectively beyond Eyefinity or the like (and even then...), but I'm not one to knock the way anyone uses their PC; people typically pick what works for them and, as I'm sure you know, everyone is unique.