
AMD No longer a viable option for mid-high end? - Page 98  

post #971 of 1593
Quote:
Originally Posted by Thready View Post

maybe, but do you really think that there would be so much consumer standardization without the day demographic?

Well, what has happened so far is that whenever gaming was involved, standards were ignored, proprietary implementations were introduced, and documentation was either missing or conveniently left out the important bits.

Take the AT and ATX standards, for example: you can thank IBM and Intel for introducing them back when "gaming" on x86 was not yet a big thing.
Look around at today's "gaming" systems and what you see is that color codes are skipped, big companies put proprietary connectors on some of their hardware, and some boards don't fall into any standard size category, either to save manufacturing costs or to promote desktop cases that support their new, unneeded board size. Essentially, the standards are thrown out the window.

In the past, things were even worse with gaming PCs. There were a dozen or so "gaming computer" makers (Atari, Amiga, Amstrad, Sharp, NEC, Microsoft, Sinclair, etc.), and none of them maintained any sort of compatibility with the others, even when using the same hardware. Everything was proprietary, so that "gamers" would be forced to leave their games behind if they chose to get a computer from another maker.

Now, when new tech is introduced it makes sense for new standards to come with it, and of course whatever uses that new tech will also use those standards. But just because "gaming" systems are usually among the first to adopt new standards doesn't mean gaming systems were responsible for introducing them. It's just that "gaming" hardware is usually over the top, loaded with new features of all kinds (which are rarely useful in practice), making it a perfect candidate to host overpriced new features that budget systems can't carry due to cost.


In the end, when you leave it to the gamer, standards slowly vanish, proprietary stuff slowly appears, and you more or less end up with a situation similar to consoles and the like.
Edited by PsyM4n - 3/30/14 at 2:57am
post #972 of 1593
Quote:
Originally Posted by PsyM4n View Post

Quote:
Originally Posted by Thready View Post

maybe, but do you really think that there would be so much consumer standardization without the day demographic?

Well, what has happened so far is that whenever gaming was involved, standards were ignored, proprietary implementations were introduced, and documentation was either missing or conveniently left out the important bits.

Take the AT and ATX standards, for example: you can thank IBM and Intel for introducing them back when "gaming" on x86 was not yet a big thing.
Look around at today's "gaming" systems and what you see is that color codes are skipped, big companies put proprietary connectors on some of their hardware, and some boards don't fall into any standard size category, either to save manufacturing costs or to promote desktop cases that support their new, unneeded board size. Essentially, the standards are thrown out the window.

In the past, things were even worse with gaming PCs. There were a dozen or so "gaming computer" makers (Atari, Amiga, Amstrad, Sharp, NEC, Microsoft, Sinclair, etc.), and none of them maintained any sort of compatibility with the others, even when using the same hardware. Everything was proprietary, so that "gamers" would be forced to leave their games behind if they chose to get a computer from another maker.

Now, when new tech is introduced it makes sense for new standards to come with it, and of course whatever uses that new tech will also use those standards. But just because "gaming" systems are usually among the first to adopt new standards doesn't mean gaming systems were responsible for introducing them. It's just that "gaming" hardware is usually over the top, loaded with new features of all kinds (which are rarely useful in practice), making it a perfect candidate to host overpriced new features that budget systems can't carry due to cost.


In the end, when you leave it to the gamer, standards slowly vanish, proprietary stuff slowly appears, and you more or less end up with a situation similar to consoles and the like.

Not even close. Mobile, Console, Professional, Enterprise: they all use things long before we see them, and those things get passed down. "Gaming" and "Enthusiast" parts are all crippled hand-me-downs that inherit the features of other, much more demanding fields. It's not just PC parts, either; it applies to just about every field imaginable. Businesses both demand and can afford the best. You as a consumer may have it when it becomes cheap enough to make, or you can just make do with the scraps.

Every Intel chip is a potential Xeon or mobile. Every AMD one an Opteron or FirePro. Every nVidia one a Quadro or even Tesla.

Other standards are also passed down from our "friends" in other sectors: mSATA, SODIMM, mPCI-e. Even standard interface evolution like SATA, PCI-e, and DDR is done for servers first.

As for "unneeded" motherboard size, give an example. Most of the things riding on E-ATX use full-on server chipsets, and do in fact need that much room for that many traces in the board. There is not a single motherboard standard in use (BTX, ATX, mATX, E-ATX, mITX) that isn't used by Dell or HP or similar, assuming they didn't just start in the server/workstation world outright.
post #973 of 1593
Quote:
Originally Posted by KyadCK View Post

Not even close. Mobile, Console, Professional, Enterprise: they all use things long before we see them, and those things get passed down. "Gaming" and "Enthusiast" parts are all crippled hand-me-downs that inherit the features of other, much more demanding fields. It's not just PC parts, either; it applies to just about every field imaginable. Businesses both demand and can afford the best. You as a consumer may have it when it becomes cheap enough to make, or you can just make do with the scraps.

Every Intel chip is a potential Xeon or mobile. Every AMD one an Opteron or FirePro. Every nVidia one a Quadro or even Tesla.

Other standards are also passed down from our "friends" in other sectors: mSATA, SODIMM, mPCI-e. Even standard interface evolution like SATA, PCI-e, and DDR is done for servers first.

As for "unneeded" motherboard size, give an example. Most of the things riding on E-ATX use full-on server chipsets, and do in fact need that much room for that many traces in the board. There is not a single motherboard standard in use (BTX, ATX, mATX, E-ATX, mITX) that isn't used by Dell or HP or similar, assuming they didn't just start in the server/workstation world outright.

Yeah, the note I made was meant as a comparison to other desktop parts that don't fall into "gaming" territory. The whole post was meant to be about desktop PCs.

Examples of the board sizes I mentioned are the tall "XL-ATX" boards and the wider-than-ATX but narrower-than-E-ATX boards from some manufacturers, which don't even share the same dimensions with each other. Those aren't standards; they are the way they are just because their makers wanted them to be.
post #974 of 1593
I just don't understand the logic behind some guys using DX11 to compare CPU bottlenecks on a high-end GPU. And they're the same guys who talk about being viable for the next couple of years and being future-proof for at least some time. Is DX11 the future? Complete nonsense. AMD Mantle is taking off very quickly and has been adopted by some reputable gaming/graphics engines; that's the future. DX12 promises to match Mantle's performance levels, but it's going to take time to adopt since it's a completely new API and supporting it isn't as easy as patching a game, yet it's still the future of gaming. Not to mention that parallelization of games is proving effective and more and more games are going that way (see the sketch below). Now add all that up and tell me again that AMD is not VIABLE for MID-HIGH END!

Those who don't care about the above are either unaware of these things or don't want to digest them. We must consider them; as a techie, I can't overlook these things.

Using selected apps and some poorly optimized games to prove a point is like trying to convince yourself that "yeah, we're not that bad." I could do the same and "prove" that the FX-8350 is competitive with a 2700K! I consider that cheating yourself.
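
To make the "parallelization of games" point concrete, here is a minimal, hypothetical C++ sketch (not taken from any real engine) of what spreading per-frame work across every core looks like; an 8-core FX only pulls ahead when a game actually splits its work like this:
Code:
// Minimal, illustrative sketch only (not from any real engine): spread
// per-entity update work across every hardware thread for one frame.
#include <algorithm>
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Entity {
    float x = 0.0f, y = 0.0f;    // position
    float vx = 1.0f, vy = 1.0f;  // velocity
};

// Update a contiguous slice of entities; each worker thread gets its own slice.
void update_range(std::vector<Entity>& entities, std::size_t begin,
                  std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        entities[i].x += entities[i].vx * dt;
        entities[i].y += entities[i].vy * dt;
    }
}

// One frame's update, split across all cores (e.g. 8 on an FX-8350).
void update_all(std::vector<Entity>& entities, float dt) {
    const std::size_t workers = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (entities.size() + workers - 1) / workers;

    std::vector<std::thread> pool;
    for (std::size_t w = 0; w < workers; ++w) {
        const std::size_t begin = w * chunk;
        const std::size_t end = std::min(begin + chunk, entities.size());
        if (begin >= end) break;
        pool.emplace_back(update_range, std::ref(entities), begin, end, dt);
    }
    for (auto& t : pool) t.join();  // frame is done once every slice finishes
}

int main() {
    std::vector<Entity> world(100000);
    update_all(world, 1.0f / 60.0f);  // simulate one 60 fps frame step
}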
Edited by imran27 - 3/30/14 at 6:15am
    
post #975 of 1593
Quote:
Originally Posted by PachAz View Post

A lot of games don't even use multiple cores, and even if they do in the future I doubt they will use more than 4. We all know AMD's superduper-core already performs worse than or equal to Intel quads, which tells me something very important. By the time all those games use all those cores, the FX-8350 will be too old, since it's already behind the i5-3570K in pretty much any game despite having double the cores. And even if that's the case, we can always buy old Intel hexa-core CPUs and get very good performance. Why do you think old Xeon and X58 stuff has suddenly become so expensive, and not anything remotely associated with AMD in terms of CPUs? So no, when it comes to gaming AMD is not the gamer's choice, and even less so for future-proofness. Up to this point, Intel has made the most future-proof systems you can get. And yes, I'm looking at it from a gamer's perspective. Once my i5 at 4.8 GHz starts lacking due to only having 4 cores, I will get an old X58 or X79 system, use a high-end server hexa-core, OC it to 5 GHz, and blow away anything AMD will ever produce. Heck, I might even get a Sandy or Ivy i7 with HT for all I care. I wish you could say the same for AMD and be so optimistic....


"Future Proof" has to be one of the most ridiculous statements a person can make. There is no such thing. When that day comes the Intel processors of today will just be "less slow" then the AMD processors of today. They both will be obsolete.
post #976 of 1593
Quote:
Originally Posted by chrisjames61 View Post

"Future Proof" has to be one of the most ridiculous statements a person can make. There is no such thing. When that day comes the Intel processors of today will just be "less slow" then the AMD processors of today. They both will be obsolete.

I purchased my i7-2600K about 3 years ago, and the current best on the market at a similar price is the i7-4770K.
If I were to upgrade, what would I get in return? Roughly a 15% increase.

First-gen i7s got owned by the second gen, mostly due to drastic price reductions, while the second generation still holds up well two generations later and has held its value.
Sandy i7s were kind of future-proof, or as close as something can be, at least.

It's not so ridiculous when you think about it.
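
For what it's worth, that ~15% is roughly what you get just by compounding typical per-generation IPC bumps. A back-of-the-envelope sketch only; the per-generation percentages below are assumptions for illustration, not benchmark numbers:
Code:
#include <cstdio>

int main() {
    // Assumed, illustrative per-generation IPC gains (not measured results):
    const double ivy_over_sandy   = 1.05;  // Sandy Bridge -> Ivy Bridge, ~5%
    const double haswell_over_ivy = 1.08;  // Ivy Bridge -> Haswell, ~8%

    const double haswell_over_sandy = ivy_over_sandy * haswell_over_ivy;
    std::printf("4770K over 2600K at the same clock: ~%.0f%% faster\n",
                (haswell_over_sandy - 1.0) * 100.0);  // ~13%, in line with the ~15% above
}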
post #977 of 1593
Quote:
Originally Posted by Shadow11377 View Post

I purchased my i7-2600K about 3 years ago, and the current best on the market at a similar price is the i7-4770K.
If I were to upgrade, what would I get in return? Roughly a 15% increase.

First-gen i7s got owned by the second gen, mostly due to drastic price reductions, while the second generation still holds up well two generations later and has held its value.
Sandy i7s were kind of future-proof, or as close as something can be, at least.

It's not so ridiculous when you think about it.

Yeah, only because Intel has slowed down and has no need to increase performance. It's their whole tick-tock cycle.

Though it's probably going to stay like that from now on.
post #978 of 1593
Quote:
Originally Posted by Shadow11377 View Post

I purchased my i7-2600K about 3 years ago, and the current best on the market at a similar price is the i7-4770K.
If I were to upgrade, what would I get in return? Roughly a 15% increase.

First-gen i7s got owned by the second gen, mostly due to drastic price reductions, while the second generation still holds up well two generations later and has held its value.
Sandy i7s were kind of future-proof, or as close as something can be, at least.

It's not so ridiculous when you think about it.


You don't get it. That's not future-proof. People talk about "future-proof" like a slightly faster processor somehow won't become obsolete. Let's say the i7 is 25% faster than the 8350, which is about right. A new game comes out that is very taxing on the CPU. Chances are it will be unplayable on both the 8350 and the "future-proof" i7.
post #979 of 1593
Quote:
Originally Posted by chrisjames61 View Post

You don't get it. That's not future-proof. People talk about "future-proof" like a slightly faster processor somehow won't become obsolete. Let's say the i7 is 25% faster than the 8350, which is about right. A new game comes out that is very taxing on the CPU. Chances are it will be unplayable on both the 8350 and the "future-proof" i7.


Intel: Buy an i7-2600K in January 2011 (almost 3.5 years ago). Maybe give it a moderate overclock to 4.5 GHz or so (just about all 2600Ks should be able to do this easily on air cooling). Still a very fast CPU today, and faster than any of AMD's current offerings.

AMD: Buy an FX-8150 in late 2011. Slower than the 2600K when it came out. Upgrade to the Vishera-based FX-8320 in late 2012 (I know quite a few BD owners did this upgrade). Still slower than the 2600K, much less an overclocked one. Not to mention there is no future upgrade path past Vishera on the desktop.

See the pattern here? No need to upgrade an Intel CPU because the Sandy Bridge chips from over 3 years ago are still faster than AMD's current chips.

Think about it from the Intel owner's point of view. If you bought a Sandy Bridge CPU 3.5 years ago and it remained faster than AMD's newest offerings over 3 years later, I'm sure you would feel that the Intel chip is pretty "future-proof" as well.

I remember when I built my Athlon X2 rig back in 2005. When I built it, it was faster than anything Intel had at the time. I had it overclocked to 2.8 GHz, and it basically performed like the FX-62. But it didn't remain faster than Intel's offerings for anywhere close to 3 years, since the Core 2 Duo was released about a year after I built my rig. So for my current Sandy Bridge rig to be over 3 years old and still faster than AMD's newest offerings is plenty future-proof by my standards.
Edited by 996gt2 - 3/30/14 at 11:33am
post #980 of 1593
Quote:
Originally Posted by 996gt2 View Post

Intel: Buy an i7-2600K in January 2011 (almost 3.5 years ago). Maybe give it a moderate overclock to 4.5 GHz or so (just about all 2600Ks should be able to do this easily on air cooling). Still a very fast CPU today, and faster than any of AMD's current offerings.

AMD: Buy an FX-8150 in late 2011. Slower than the 2600K when it came out. Upgrade to the Vishera-based FX-8320 in late 2012 (I know quite a few BD owners did this upgrade). Still slower than the 2600K, much less an overclocked one. Not to mention there is no future upgrade path past Vishera on the desktop.

See the pattern here? No need to upgrade an Intel CPU because the Sandy Bridge chips from over 3 years ago are still faster than AMD's current chips.

Think about it from the Intel owner's point of view. If you bought a Sandy Bridge CPU 3.5 years ago and it remained faster than AMD's newest offerings over 3 years later, I'm sure you would feel that the Intel chip is pretty "future-proof" as well.

I remember when I built my Athlon X2 rig back in 2005. When I built it, it was faster than anything Intel had at the time. I had it overclocked to 2.8 GHz, and it basically performed like the FX-62. But it didn't remain faster than Intel's offerings for anywhere close to 3 years, since the Core 2 Duo was released about a year after I built my rig. So for my current Sandy Bridge rig to be over 3 years old and still faster than AMD's newest offerings is plenty future-proof by my standards.

http://www.newegg.com/Product/Product.aspx?Item=N82E16819115070

$340.

http://www.newegg.com/Product/Product.aspx?Item=N82E16819113285

$159.

You spent more than twice as much to future-proof your PC for 3 years. I do not think that is worth it. You might, and good for you; you saved money. But how does this specific example translate to the masses?

You are using a very specific example. In all of science and technology there is always one example that, if you look closely enough, goes against the grain. Sure, your $340 CPU is faster than AMD's top-end CPU, but what does that mean? How does that show that AMD is unable to compete? AMD has the talent and technology to make a very fast CPU. It doesn't do so as often as Intel does because it knows that people who consider themselves mid-high end don't think paying almost $400 for a CPU is reasonable.
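
Putting rough numbers on that, using only figures already quoted in this thread (~$340 vs ~$159, and the ~25% gap mentioned a few posts up); this is an illustration, not a fresh benchmark:
Code:
#include <cstdio>

int main() {
    // Prices and the ~25% performance gap are taken from this thread, not re-measured.
    const double intel_price = 340.0, amd_price = 159.0;   // i7 vs FX-8350, USD
    const double intel_perf  = 1.25,  amd_perf  = 1.00;    // FX-8350 as the baseline

    std::printf("Price ratio (Intel/AMD): %.2fx\n", intel_price / amd_price);  // ~2.14x
    std::printf("Perf per $100 spent: Intel %.2f, AMD %.2f\n",
                intel_perf / intel_price * 100.0,   // ~0.37
                amd_perf  / amd_price  * 100.0);    // ~0.63: the FX gives more per dollar
}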


This thread is locked  