Overclock.net - An Overclocking Community

Thread: [The Register] AMD agrees to cough up $35-a-chip payout over eight-core Bulldozer advertising fiasco

  Topic Review (Newest First)
09-26-2019 10:58 AM
Liranan
Quote: Originally Posted by cssorkinman View Post
Dual cores have been pretty much unbearable for everyday use for me since around the release of Windows 7.

There was a time when an 8320 would absolutely hand my DDR3-equipped Intel i7s, running as quads, their ass in BF1 multiplayer at 1080p - with the minimum and average FPS finally merging when I underclocked the Vishera to about 2.4 GHz. The quads weren't aging well by comparison.

The game now seems to have been updated to make it easier on the quads - which was advantageous to both the publisher and Intel. That's been a common theme in the Bulldozer story (Fire Strike gimped it, early versions of the CPU-Z bench, etc., and now it's playing out again with Ryzen and UserBenchmark) - such is the economic pressure on software companies to bow to Intel. "It's good to be the king."

I've always touted the Vishera 8-cores I have as being much quicker on the desktop than my i7s. After trying to figure out why this is the case, it appears that much of it is an unintended result of software being written to take full advantage of the processing power Intel chips offered, loading those CPUs fully much more often than the AMD ones. Watching CPU usage, about the only time a Vishera gets pushed to 95% usage or above is during stress tests or during load screens in games/apps; it's incredibly rare for it not to have plenty of resources available to start or switch apps on the desktop. I've even noticed this when running my X6 Thubans - they don't seem to "stall" for a few moments when opening or changing apps nearly as often as my 2nd, 3rd, and 4th gen i7s. Worth noting that I tend to run both overclocked with power-saving features disabled.

My son's i7 Omen laptop with the 7th gen chip, SSD, and DDR4 is the first Intel rig I've owned that feels as nimble.
While the 2500K was faster in games than the 83xx chips, it lagged behind in multitasking and in what I used my PC for: rendering. So each had its strengths.
09-25-2019 09:10 AM
cssorkinman
Quote: Originally Posted by tpi2007 View Post
Yeah, dual-core Pentiums stopped being usable for AAA gaming circa 2016/2017, but they're not in the same price bracket as an FX-8320. Anyway, Bulldozer was in no way "too ahead of its time in 2011". You got what you paid for: a lot of wimpy cores good for multithreading, but not so good for single-threaded work. On Intel's side, you could just spend more and get a 2600K or 3770K during the relevant 2011/2012 period and call it a day, with both CPUs standing the test of time better than Bulldozer/Piledriver and with a much more balanced performance profile across a wide range of applications over all these years.
Dual cores have been pretty much unbearable for everyday use for me since around the release of Windows 7.

There was a time when an 8320 would absolutely hand my DDR3-equipped Intel i7s, running as quads, their ass in BF1 multiplayer at 1080p - with the minimum and average FPS finally merging when I underclocked the Vishera to about 2.4 GHz. The quads weren't aging well by comparison.

The game now seems to have been updated to make it easier on the quads - which was advantageous to both the publisher and Intel. That's been a common theme in the Bulldozer story (Fire Strike gimped it, early versions of the CPU-Z bench, etc., and now it's playing out again with Ryzen and UserBenchmark) - such is the economic pressure on software companies to bow to Intel. "It's good to be the king."

I've always touted the Vishera 8-cores I have as being much quicker on the desktop than my i7s. After trying to figure out why this is the case, it appears that much of it is an unintended result of software being written to take full advantage of the processing power Intel chips offered, loading those CPUs fully much more often than the AMD ones. Watching CPU usage, about the only time a Vishera gets pushed to 95% usage or above is during stress tests or during load screens in games/apps; it's incredibly rare for it not to have plenty of resources available to start or switch apps on the desktop. I've even noticed this when running my X6 Thubans - they don't seem to "stall" for a few moments when opening or changing apps nearly as often as my 2nd, 3rd, and 4th gen i7s. Worth noting that I tend to run both overclocked with power-saving features disabled.

My son's i7 Omen laptop with the 7th gen chip, SSD, and DDR4 is the first Intel rig I've owned that feels as nimble.
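
For anyone who wants to watch the same behaviour on their own rig, here's a rough Python sketch of how I'd log it - assuming you have the psutil package installed, and the 95% threshold / one-second poll are just numbers I picked to match what I described above:

Code:
import time
import psutil  # third-party: pip install psutil

THRESHOLD = 95.0  # percent usage that counts as "pegged"

try:
    while True:
        # per-core utilisation sampled over the last second
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        pegged = [i for i, load in enumerate(per_core) if load >= THRESHOLD]
        stamp = time.strftime("%H:%M:%S")
        if pegged:
            print(f"{stamp}  cores pegged above {THRESHOLD}%: {pegged}")
        else:
            print(f"{stamp}  max core load: {max(per_core):.1f}%")
except KeyboardInterrupt:
    pass  # stop logging with Ctrl+C

On my FX rigs that "pegged" line almost never shows up outside of stress tests and load screens, which is the headroom I'm describing above.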
09-24-2019 10:35 PM
tpi2007
Quote: Originally Posted by dlee7283 View Post
There was a time people here were seriously pushing the dual-core Pentium G3258 over the 8320 because of how high it overclocked and its IPC, but I feel like Bulldozer probably outdoes it now in modern gaming, even with its flaws.

It was just too far ahead of its time in 2011, when the market was going through more of a stagnation cycle and wanted to stick with what worked and what it was familiar with from Intel - which, with Sandy Bridge, was already good enough for years to come.

Yeah, dual-core Pentiums stopped being usable for AAA gaming circa 2016/2017, but they're not in the same price bracket as an FX-8320. Anyway, Bulldozer was in no way "too ahead of its time in 2011". You got what you paid for: a lot of wimpy cores good for multithreading, but not so good for single-threaded work. On Intel's side, you could just spend more and get a 2600K or 3770K during the relevant 2011/2012 period and call it a day, with both CPUs standing the test of time better than Bulldozer/Piledriver and with a much more balanced performance profile across a wide range of applications over all these years.
09-24-2019 08:59 PM
dlee7283
There was a time people here were seriously pushing the dual-core Pentium G3258 over the 8320 because of how high it overclocked and its IPC, but I feel like Bulldozer probably outdoes it now in modern gaming, even with its flaws.

It was just too far ahead of its time in 2011, when the market was going through more of a stagnation cycle and wanted to stick with what worked and what it was familiar with from Intel - which, with Sandy Bridge, was already good enough for years to come.
09-15-2019 02:38 PM
CynicalUnicorn
Quote: Originally Posted by Redwoodz View Post
And yet what are Intel's latest tech efforts focusing on? MOAR CORES!!!! Bulldozer deserves credit. It was a GAME CHANGER in the industry.
Agreed, AMD's server market share dropped even lower as a direct result.
09-11-2019 04:57 PM
cssorkinman
Quote: Originally Posted by ToTheSun! View Post
Then you'd call every single card on the market right now disappointing.
Actually, I'm quite impressed with the ~$125 PowerColor Red Devil RX 570.
09-11-2019 04:26 PM
The Robot
Quote: Originally Posted by Alex132 View Post
The vast majority could be played
Fact is, Sony never had true backward compatibility prior to x86. Heck, there are some PS2 games that are bugged on some PS2 revisions (there were 19 of them!). The only ones who did were Nintendo with the Game Boy and DS lines, where each new generation could play any game from the older one. Also, the Wii U can play any GameCube game 100% natively, even though it's not officially supported.
Essentially, AMD gave Sony the first 100% backward-compatible platform that can increase its power over the generations and keep all the titles playable. As for the Iris Pro 5200, it could never match the PS4's GPU, which is roughly equivalent to a GTX 470, while Iris only matches a GTS 450. It also lacks any form of async compute. I'm not even talking about its cost - it would never have been viable for a $400 console that Sony and MS wanted to sell at a profit.
AMD on the PS4: We gave it the hardware Nvidia couldn't
09-11-2019 03:30 PM
WannaBeOCer
Quote: Originally Posted by KyadCK View Post
Last I checked, this is Overclock.net.

Current hardware ray tracing behind a black box is awful. You can't see the pretty reflections when there aren't enough pixels on the screen to give them any detail. Current software ray tracing that is not limited to one hardware vendor and does not require useless dedicated hardware is better. Not that your argument did anything to counter mine at all, it's just useless fluff.

Iris Pro 6200 is Broadwell, and from 2015. Good job buddy.

Some* PS1 games could be played. That list changed significantly between PS2 models. Heck, what PS2 games you could play changed between PS2 models.

PS3 had actual PS2 hardware in it for... one or two revisions. Then support was dropped. Something to do about it being too hard to emulate? Being on the same general architecture (x86) for more than one generation is a very big deal for consoles.

I don't recall saying anything was good or bad. I asked for the magical X79 8-core HEDT chip. Do you have one?
Which software ray tracing are you referring to? Last I checked, every ray tracing implementation was GPU accelerated. By your explanation, DXR counts as software ray tracing, since it "is not limited to one hardware vendor and does not require useless dedicated hardware." I'm sure AMD's Radeon Rays would benefit from Nvidia's RT cores. I'm going to suggest they add support - it's a pretty sweet SDK and runs well on Nvidia's Turing already.
09-11-2019 03:24 PM
Alex132
Quote: Originally Posted by KyadCK View Post
Last I checked, this is Overclock.net.
Oh yeah, it is! Neat.

Quote: Originally Posted by KyadCK View Post
Current hardware ray tracing behind a black box is awful.
Well, best AMD catch up to Nvidia then; sadly they haven't.

Quote: Originally Posted by KyadCK View Post
You can't see the pretty reflections when there aren't enough pixels on the screen to give them any detail. Current software ray tracing that is not limited to one hardware vendor and does not require useless dedicated hardware is better. Not that your argument did anything to counter mine at all, it's just useless fluff.
You do know you can enable DXR ray tracing without an RTX card, right? RT cores just accelerate it. Also, "useless dedicated hardware"? Calm down there, eh - AMD will catch up.

Quote: Originally Posted by KyadCK View Post
Iris Pro 6200 is Broadwell, and from 2015. Good job buddy.
The butter fingers hit 6 instead of 5; I meant to post this: https://www.notebookcheck.net/Intel-...0.90965.0.html

Quote: Originally Posted by KyadCK View Post
Some* PS1 games could be played. That list changed significantly between PS2 models. Heck, what PS2 games you could play changed between PS2 models.


PS3 had actual PS2 hardware in it for... one or two revisions. Then support was dropped. Something to do about it being too hard to emulate? Being on the same general architecture (x86) for more than one generation is a very big deal for consoles.
The vast majority could be played, hence why the Wiki article lists incompatibilities rather than compatible titles.

That's neither here nor there, though; the point is that it isn't AMD hardware that suddenly gives them backwards compatibility. It's due to, as you said, being on an x86-compatible platform.

Quote: Originally Posted by KyadCK View Post
I don't recall saying anything was good or bad. I asked for the magical X79 8-core HEDT chip. Do you have one?
1) / means or.
2) Xeons could be considered prosumer back then, as they were more within reach of the average consumer compared to now.

Example: https://www.overclock.net/forum/5-in...you-cared.html
09-11-2019 03:02 PM
KyadCK
Quote: Originally Posted by Alex132 View Post
So did Celerons back in the day for frequency; it's a meaningless feat for real-world anything.

Ah yes, hardware-accelerated ray tracing is awful! It just makes it so much easier for game devs to make beautifully lit environments if done properly. It's not like 90% of photography and cinematography is about lighting. I've never been a fan of things looking good if they come from the leatherjacketman company.

https://www.notebookcheck.net/Intel-....125593.0.html

https://www.quora.com/Can-you-play-PS1-games-on-PS2
https://whirlpool.net.au/wiki/ps2_faq_compatibility

Early PS3 models supported PS2 games as well. It's nothing really to do with it being AMD hardware; that was a very nice reach.

Being the first doesn't make it good; it's only a marketing-timing-dependent gimmick.
Last I checked, this is Overclock.net.

Current hardware ray tracing behind a black box is awful. You can't see the pretty reflections when there aren't enough pixels on the screen to give them any detail. Current software ray tracing that is not limited to one hardware vendor and does not require useless dedicated hardware is better. Not that your argument did anything to counter mine at all, it's just useless fluff.

Iris Pro 6200 is Broadwell, and from 2015. Good job buddy.

Some* PS1 games could be played. That list changed significantly between PS2 models. Heck, what PS2 games you could play changed between PS2 models.

PS3 had actual PS2 hardware in it for... one or two revisions. Then support was dropped. Something to do about it being too hard to emulate? Being on the same general architecture (x86) for more than one generation is a very big deal for consoles.

I don't recall saying anything was good or bad. I asked for the magical X79 8-core HEDT chip. Do you have one?
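
For anyone wondering what "software ray tracing" boils down to in practice, here's a toy sketch of my own (not pulled from any vendor SDK) - a single ray/sphere intersection test in plain Python. Every tracer, whether it runs on a CPU, in compute shaders, or on RT cores, is doing this same math at enormous scale; dedicated hardware just does it faster:

Code:
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest hit, or None if the ray misses."""
    oc = tuple(o - c for o, c in zip(origin, center))   # ray origin relative to sphere center
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = b * b - 4.0 * a * c                  # quadratic for |origin + t*direction - center| = radius
    if discriminant < 0.0:
        return None                                     # no intersection
    t = (-b - math.sqrt(discriminant)) / (2.0 * a)      # nearer of the two roots
    return t if t > 0.0 else None

# Ray fired from the origin straight down -Z at a unit sphere 5 units away:
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # prints 4.0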