[vr-zone] HD 4850/4870 are not high-end-chips - Page 6

post #51 of 86
Quote:
Originally Posted by nathris View Post
Now all I have to do is figure out what I'm going to do with the extra $150 I'll be saving by getting a 4870 while maintaining the same real-world performance...

And who says AMD isn't spending money on R&D? Let's compare what's happening this generation...

ATI:
-Implementing DX 10.1
-Making advances in power consumption
-Making use of new technologies (GDDR5, DisplayPort)
-Perfecting smaller manufacturing processes (55nm)
-Making noticeable advances in Crossfire, including using the onboard GPU to boost the discrete card (nVidia is doing this too, but only after AMD announced it)

nVidia:
-Increasing size, cost, and heat for the sake of performance
-Using the same technology they've had since 2006
-Paying video game developers to optimize games for their cards
-Buying out smaller companies to try and make use of their technology

R&D isn't just about making things faster. I remember a quote from Civilization 4, when you research engineering:

All nVidia is doing is just adding more and more. Eventually everything will come crashing down, and when it does, the seemingly meaningless advances ATI is making now will put nVidia in a massive hole as they will have to start from square one researching things that ATI has already seamlessly integrated.
fanboy much? lol

it's one thing to root for ATI, but deciding on purchases before the performance is even known is pretty dense.
Edited by bionh - 5/30/08 at 3:32pm
post #52 of 86
Quote:
Originally Posted by nathris View Post
ATI:
-Implementing DX 10.1
-Making advances in power consumption
-Making use of new technologies (GDDR5, DisplayPort)
-Perfecting smaller manufacturing processes (55nm)
-Making noticeable advances in Crossfire, including using the onboard GPU to boost the discrete card (nVidia is doing this too, but only after AMD announced it)

nVidia:
-Increasing size, cost, and heat for the sake of performance
-Using the same technology they've had since 2006
-Paying video game developers to optimize games for their cards
-Buying out smaller companies to try and make use of their technology
I lol'd; looks like ATI's propaganda really works.

DX 10.1 has no real-world use as of yet, and nVidia, with "old" GDDR3 memory but a 512-bit bus, has more memory bandwidth. I can continue, but it won't make a difference anyway, since everyone believes what they want to believe, no matter how things turn out in the real world.
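
(For what it's worth, the bandwidth claim is easy to sanity-check with napkin math: peak memory bandwidth is just bus width times effective data rate. A rough sketch below, assuming the clocks rumoured at the time for these cards, which are not confirmed specs:)

Code:
# Peak theoretical memory bandwidth (GB/s) = bus width (bits) / 8 * effective data rate (MT/s) / 1000.
# The clocks below are rumoured/assumed figures for illustration, not confirmed specs.

def bandwidth_gb_s(bus_width_bits, effective_rate_mt_s):
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_rate_mt_s / 1000.0

print(bandwidth_gb_s(512, 2214))  # GTX 280: 512-bit GDDR3 @ ~1107 MHz (2214 MT/s) -> ~141.7 GB/s
print(bandwidth_gb_s(448, 1998))  # GTX 260: 448-bit GDDR3 @ ~999 MHz  (1998 MT/s) -> ~111.9 GB/s
print(bandwidth_gb_s(256, 3600))  # HD 4870: 256-bit GDDR5 @ ~900 MHz  (3600 MT/s) -> ~115.2 GB/s

So the 512-bit board does come out ahead on the 280, but a 256-bit GDDR5 card lands in the same ballpark as the GTX 260 with a much simpler PCB.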
post #53 of 86
Where's the GeForce 8 series?
Anyway, since when has there been a 9800GT?
post #54 of 86
Quote:
Originally Posted by Nowyn View Post
I lol'd; looks like ATI's propaganda really works.

DX 10.1 has no real-world use as of yet, and nVidia, with "old" GDDR3 memory but a 512-bit bus, has more memory bandwidth. I can continue, but it won't make a difference anyway, since everyone believes what they want to believe, no matter how things turn out in the real world.

DX10.1 (D3D10.1, whatever) does have its uses. It makes things faster and more efficient, but it needs to be carefully implemented because it doesn't seem to play that nicely with older hardware (that's the whole Ubisoft debacle).
As for the bandwidth thing, well, it's pretty obvious that it has more to do with manufacturing cost and design simplicity than anything else. Until we see real benchmarks we can't say which approach is better, but I bet at the end of the day both methods will perform similarly.
post #55 of 86
Quote:
Originally Posted by nathris View Post
Lol @ the people buying the GTX 260.

Now all I have to do is figure out what I'm going to do with the extra $150 I'll be saving by getting a 4870 while maintaining the same real-world performance...

And who says AMD isn't spending money on R&D? Let's compare what's happening this generation...

ATI:
-Implementing DX 10.1
-Making advances in power consumption
-Making use of new technologies (GDDR5, DisplayPort)
-Perfecting smaller manufacturing processes (55nm)
-Making noticeable advances in Crossfire, including using the onboard GPU to boost the discrete card (nVidia is doing this too, but only after AMD announced it)

nVidia:
-Increasing size, cost, and heat for the sake of performance
-Using the same technology they've had since 2006
-Paying video game developers to optimize games for their cards
-Buying out smaller companies to try and make use of their technology

R&D isn't just about making things faster. I remember a quote from Civilization 4, when you research engineering:

All nVidia is doing is just adding more and more. Eventually everything will come crashing down, and when it does, the seemingly meaningless advances ATI is making now will put nVidia in a massive hole as they will have to start from square one researching things that ATI has already seamlessly integrated.
Agreed with everything you just said.

I can't wait for another X1950 XTX to shut nVidia up. Their idiotic, immature bashing of AMD and Intel is ridiculous.
post #56 of 86
Quote:
Originally Posted by xHassassin View Post
Where's the GeForce 8 series?
Anyways, since when was there a 9800GT?
Who uses the 8 series nowadays?

The 9800GT will be the rebranded 8800GT, in both 65nm and 55nm versions.
post #57 of 86
Quote:
Originally Posted by Nowyn View Post
Well, in the 280 and 260 nVidia is using improved unified shaders (2nd gen or whatever you wanna call it), and they claim these perform better than the 1st gen, so it's really not just a beefed-up (by adding more SPs) version of G80/G92. Talking about ATI, I haven't seen any info on the architecture, so it might just be, as you said, "on way too many steroids".

As for the 3870X2, lol, I remember lots of benchies illustrating crappy driver support resulting in CFX not working properly. As for the 9800GX2, well, you don't have one and never had, so don't make conclusions on how it works; quite a few people here have them and are more than satisfied. Talking in general, nVidia made an official statement before the 9800GX2 release that they don't like the dual-chip concept, but they made one anyway and showed us all that they can beat ATI any day. I hope you won't argue that the 9800GX2 performs better than the 3870X2 by a more than noticeable margin.

Anyway, this isn't going anywhere. Those fanboy conclusions (of any brand) with no proof, sometimes full of a person's own stupidity (talking in general, not to offend anyone), are just sick to read. People wanna live their dream where their favorite brand is on top, and they are convincing themselves and everyone around them that it will be the case. I'd say wait for the new hardware to be released and then compare the offerings.
Prove it, man. You keep talking about how everyone else in the thread is a baseless, factless fanboy. But you are too, I'm sorry.
You have no proof, and all you are doing is ****ting on ATI's products.

You live in a fanboy world just like the people you describe; the only difference is yours is green instead of red.

And as for the 9800GX2 having no problems... good god, MHill (from this website) has two of these things, so I'm sure he knows what he's talking about, and even he has complained of stuttering.

Quote:
Originally Posted by Nowyn View Post
I lol'd; looks like ATI's propaganda really works.

DX 10.1 has no real-world use as of yet, and nVidia, with "old" GDDR3 memory but a 512-bit bus, has more memory bandwidth.
ATI doesn't have money for propaganda. ATI barely even has a marketing department.

Have fun with that 512-bit bus and your GDDR3 (LMAO!), both of which are major reasons why this card is going to be spitting burning plasma out of the back of your case.
Edited by HugeDink - 5/30/08 at 4:23pm
post #58 of 86
Quote:
Originally Posted by Nowyn View Post
I lol'd; looks like ATI's propaganda really works.

DX 10.1 has no real-world use as of yet, and nVidia, with "old" GDDR3 memory but a 512-bit bus, has more memory bandwidth. I can continue, but it won't make a difference anyway, since everyone believes what they want to believe, no matter how things turn out in the real world.
Remember that we're talking about the next generation of graphics cards. DX10.1 makes things easier and more efficient for programmers. That means that in the few upcoming games nVidia doesn't buy out, we might even see the 4870 outperforming the GTX 260.


And I knew I would get a lot of backlash from the nVidia fanboys (mostly due to the negative title), but like I said, those pros aren't revolutionary; they are things ATI is doing to set themselves apart. They realize they don't have the resources to compete with nVidia on the high end (other than the X2s), so they have decided to take over everything else. They own the integrated and mobile market with the 780G and Hybrid CrossFireX. I haven't seen the numbers yet, but I don't think nVidia will be able to compete with the price/performance ratio of the 4400 and 4600 series, and they for sure won't be able to beat the 4800s.

All nVidia has right now is the high-end GT200, which should be the best and will probably stay the best for another 1.5 years, like the G80. But despite the fact that a considerable number of OCN members will pay nearly $500 for the GTX 260, and maybe 5 or 6 people will pay nearly $700 for the GTX 280, in the rest of the market I wouldn't be surprised if the 4870 outsells the GTX 260 by 10:1. When you consider the rest of the market, which nVidia has all but forgotten about, it's going to be a good year for red.
post #59 of 86
Quote:
Originally Posted by bionh View Post
fanboy much? lol

it's one thing to root for ATI, but deciding on purchases before the performance is even known is pretty dense.
When we're pretty sure ATI will be able to compete and cost $200 less, it's not really dense, and decisions can be changed, remember.

I myself will go ATI even if it is slower; I can barely afford to buy a second-hand 6800GS, let alone a $459 GTX 260.
post #60 of 86
Quote:
Originally Posted by HugeDink View Post
Prove it, man. You keep talking about how everyone else in the thread is a baseless, factless fanboy. But you are too, I'm sorry.
You have no proof, and all you are doing is ****ting on ATI's products.

...

Have fun with that 512-bit bus and your GDDR3 (LMAO!), both of which are major reasons why this card is going to be spitting burning plasma out of the back of your case.
Well, there were quite a few reliable sources on the GTX 200 architecture (they appeared in the news on OCN), so before making statements like that, just take a look.

And again, lol @ the "GDDR3 fail" and all. This is the 2-PCB vs. 1-PCB story all over again; there's nothing reasonable behind your words. And if we're talking about card TDPs, the 4870 has a TDP of 157 watts, so the 4870X2 is going to be close to 300 watts, and that's not something you can call "ice cool", now can you?
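
(Rough napkin math on that estimate, taking the quoted 157 W single-GPU TDP at face value and assuming a dual-GPU board saves a little by sharing the PCB, fan, and other components; both figures are assumptions, not confirmed specs:)

Code:
# Napkin estimate of a dual-GPU board's TDP from a single-GPU figure.
# 157 W is the TDP quoted above; the ~10% saving for shared board
# components is an assumption for illustration, not a confirmed spec.

SINGLE_GPU_TDP_W = 157
SHARED_COMPONENT_SAVINGS = 0.10  # assumed fraction saved by sharing PCB, fan, VRM, etc.

dual_gpu_estimate = 2 * SINGLE_GPU_TDP_W * (1 - SHARED_COMPONENT_SAVINGS)
print(round(dual_gpu_estimate))  # ~283 W, i.e. close to 300 W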

As for me, yeah, it's no secret that I prefer nVidia products for quite a few reasons, but that's just a matter of taste. What's different is that I've never been biased against ATI's offerings, unlike you (you can check all my posts; you ain't gonna find it).

God, I hate when people speculate on something that isn't out yet just for the sake of speculating. Too bad the launch is 3 weeks away, so lots more "I hate nVidia and their products bite the dust. Have any proof? No, I just love ATI" posts will appear.
Edited by Nowyn - 5/31/08 at 1:30am