The history of Nvidia's G92 graphics processor is a long one, as these things go. The first graphics card based on it was the GeForce 8800 GT, which debuted in October of 2007. The 8800 GT was a stripped-down version of the G92 with a few bits and pieces disabled. The fuller implementation of the G92 came in December '07 in the form of the GeForce 8800 GTS 512MB. This card initiated the G92's long history of brand confusion by overlapping with existing 320MB and 640MB versions of the GeForce 8800 GTS, which were based on an entirely different chip, the much larger (and older) G80. Those cards had arrived on the scene way back in November of 2006.

As the winter of '07 began to fade into spring, Nvidia had a change of heart and suddenly started renaming the later members of the GeForce 8 series as "new" 9-series cards. Thus the GeForce 8800 GTS 512 became the 9800 GTX. And thus things remained for nearly ten weeks.

Then, in response to the introduction of strong new competition, Nvidia shipped a new version of the G92 GPU with the same basic architecture but manufactured on a smaller 55nm fabrication process. This chip found its way to market aboard a slightly revised graphics card dubbed the GeForce 9800 GTX+. The base clock speeds on the GTX+ matched those of some "overclocked in the box" GeForce 9800 GTX cards, and the performance of the two was essentially identical, though the GTX+ did reduce power consumption by a handful of watts. Slowly, the GTX+ began replacing the 9800 GTX in the market, as the buying public scratched its collective head over the significance of that plus symbol. [...]
Test notes

You'll have to forgive us. Since Nvidia sprang this card on us in the middle of last week, and since we rather presumptuously had plans this past weekend, we were not about to go and formulate a revised test suite and produce an all-new set of benchmark results for this card and thirteen or so of its most direct competitors, with all-new drivers and new games. Instead, we chose a strategy that very much mirrors Nvidia's, recycling a past product for a new purpose. In our case, we decided to rely upon our review of the GeForce GTX 285 and 295, published way back on January 15, for most of our test data.

This unflinchingly lame, sad, and altogether too typical exercise in sheer laziness and feckless ridiculosity nets us several wholly insurmountable challenges in our weak attempt at evaluating this new product and its most direct competitor. First and foremost, of course, is the fact that video card drivers have changed one or two entire sub-point-release revisions since our last article. So although we tested the GeForce GTS 250 and Radeon HD 4850 1GB with recent drivers, the remainder of our results come from well-nigh ancient and unquestionably much slower and less capable driver software, because everyone knows that video card performance improves 15-20% with each driver release. Never mind the fact that the data you will see on the following pages will look, on the whole, entirely comparable across driver revisions. That is a sham, a mirage, and our other results are entirely useless even as a point of reference.

As if that outrage weren't sufficient to get our web site operator's license revoked, you may be aware that as many as one or two brand-new, triple-A PC game titles have been released since we chose the games in our test suite, and their omission will surely cripple our ability to assess this year-and-a-half-old GPU. This fact is inescapable, and we must be made to suffer for it.

Finally, in a coup de grace worthy of a Tarantino flick, two of the games we used were tested at a screen resolution of 2560x1600, clearly a higher resolution than anyone with a $150 graphics card would ever use for anything. Ever. Do not be swayed by the reasonable-sounding voice in your ear that points out that both games were playable at this resolution on this class of hardware. Do not be taken in by the argument that using a very high resolution serves to draw out the differences between 512MB and 1GB graphics cards, and answer not the siren song of the future-proofing appeal. Nothing about this test is in any way "real world," and no one who considers himself legitimate as a gamer or, nay, a human being should have any part in such a travesty. You may wish to close this tab in your browser now.
At this point in the review, Nvidia's marketing department would no doubt like for me to say a few words about some of its key points of emphasis of late, such as PhysX, CUDA, and GeForce 3D Vision. I will say a few words, but perhaps not the words that they might wish.

CUDA is Nvidia's umbrella term for accelerating non-graphics applications on the GPU, about which we've heard much lately. ATI Stream is AMD's term for the same thing, and although we've heard less about it, it is very similar in nature and capability, as are the underlying graphics chips. In both cases, the first consumer-level applications are only beginning to arrive, and they're mostly video encoders that face some daunting file format limitations. Both efforts show some promise, but I expect that if they are to succeed, they must succeed together by running the same programs via a common programming interface. (For the curious, a rough sketch of what this sort of GPU-compute code looks like follows at the end of this section.) In other words, I wouldn't buy one brand of GPU over the other expecting big advantages in the realm of GPU-compute capability, especially with a GPU as old as the G92 in the mix.

One exception to this rule may be PhysX, which is wholly owned by Nvidia and supported in games like Mirror's Edge and... well, let me get back to you on that. I suspect PhysX might offer Nvidia something of an incremental visual or performance advantage in certain upcoming games, just as DirectX 10.1 might for AMD in certain others.

As for GeForce 3D Vision, the GeForce GTS 250 is purportedly compatible with it, but based on my experience, I would strongly recommend getting a much more powerful graphics card (or two) for use with this stereoscopic display scheme. The performance hit would easily swallow up all the GTS 250 has to give, and then some.
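Since we've invoked CUDA a few times without ever showing it, here is a minimal sketch of the kind of data-parallel code these GPU-compute schemes run. To be clear, this is our own illustrative example, not code from any shipping application: the kernel name scale_add, the 256-thread block size, and the test values are all arbitrary choices on our part. The same basic pattern, a simple per-element kernel wrapped in a great deal of data-shuffling, is what ATI Stream and any eventual common interface would express, too.

    // Minimal CUDA sketch: scale one array and add another, one GPU thread
    // per element. All names and constants here are illustrative only.
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    __global__ void scale_add(const float *a, const float *b, float *out,
                              float s, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // global element index
        if (i < n)                                      // guard the final partial block
            out[i] = s * a[i] + b[i];
    }

    int main()
    {
        const int n = 1 << 20;                  // one million floats
        const size_t bytes = n * sizeof(float);

        // Host-side buffers with known contents so we can spot-check results.
        float *ha = (float *)malloc(bytes);
        float *hb = (float *)malloc(bytes);
        float *hout = (float *)malloc(bytes);
        for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; }

        // Device-side buffers; copy the inputs across the bus to the card.
        float *da, *db, *dout;
        cudaMalloc((void **)&da, bytes);
        cudaMalloc((void **)&db, bytes);
        cudaMalloc((void **)&dout, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements.
        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        scale_add<<<blocks, threads>>>(da, db, dout, 0.5f, n);

        // cudaMemcpy on the default stream waits for the kernel to finish.
        cudaMemcpy(hout, dout, bytes, cudaMemcpyDeviceToHost);
        printf("out[0] = %.2f\n", hout[0]);     // expect 0.5 * 1.0 + 2.0 = 2.50

        cudaFree(da); cudaFree(db); cudaFree(dout);
        free(ha); free(hb); free(hout);
        return 0;
    }

Note that the kernel itself is the easy part; moving data to and from the card, and maintaining a single code path that runs on both vendors' chips, is where the real work lies, which is why we expect a common programming interface, rather than either proprietary scheme, to decide this race.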