[The Register] AMD agrees to cough up $35-a-chip payout over eight-core Bulldozer advertising fiasco - Page 17 - Overclock.net - An Overclocking Community

post #161 of 170 (permalink) Old 09-11-2019, 02:02 PM
KyadCK
Quote: Originally Posted by Alex132 View Post
So did Celerons back in the day for frequency; it's a meaningless feat for real-world anything.



Ah yes, hardware-accelerated ray tracing is awful! It just makes it so much easier for game devs to build beautifully lit environments when done properly. It's not like 90% of photography and cinematography is about lighting. I've never been a fan of things looking good if they come from the leatherjacketman company.



https://www.notebookcheck.net/Intel-....125593.0.html






https://www.quora.com/Can-you-play-PS1-games-on-PS2
https://whirlpool.net.au/wiki/ps2_faq_compatibility


Early PS3s supported PS2 games as well. It has nothing really to do with the hardware being AMD; that was quite a reach.


Being first doesn't make it good; it's just a gimmick of marketing timing.
Last I checked, this is Overclock.net.

Current hardware ray tracing behind a black box is awful. You can't see the pretty reflections when there aren't enough pixels on the screen to give them any detail. Current software ray tracing that is not limited to one hardware vendor and does not require useless dedicated hardware is better. Not that your argument did anything to counter mine at all, it's just useless fluff.

Iris Pro 6200 is Broadwell, and from 2015. Good job buddy.

Some* PS1 games could be played. That list changed significantly between PS2 models. Heck, what PS2 games you could play changed between PS2 models.

PS3 had actual PS2 hardware in it for... one or two revisions. Then support was dropped. Something to do with it being too hard to emulate? Being on the same general architecture (x86) for more than one generation is a very big deal for consoles.

I don't recall saying anything was good or bad. I asked for the magical X79 8-core HEDT chip. Do you have one?

post #162 of 170 (permalink) Old 09-11-2019, 02:24 PM
Alex132
Quote: Originally Posted by KyadCK View Post
Last I checked, this is Overclock.net.
Oh yeah, it is! Neat.





Quote: Originally Posted by KyadCK View Post
Current hardware ray tracing behind a black box is awful.
Well, AMD had best catch up to Nvidia then; sadly, they haven't.



Quote: Originally Posted by KyadCK View Post
You can't see the pretty reflections when there aren't enough pixels on the screen to give them any detail. Current software ray tracing that is not limited to one hardware vendor and does not require useless dedicated hardware is better. Not that your argument did anything to counter mine at all, it's just useless fluff.
You do know you can enable DXR ray tracing without an RTX card, right? RT cores just accelerate it. Also, "useless dedicated hardware"? Calm down there, eh; AMD will catch up.


Quote: Originally Posted by KyadCK View Post
Iris Pro 6200 is Broadwell, and from 2015. Good job buddy.
Butter fingers: I hit 6 instead of 5. I meant to post this: https://www.notebookcheck.net/Intel-...0.90965.0.html




Quote: Originally Posted by KyadCK View Post
Some* PS1 games could be played. That list changed significantly between PS2 models. Heck, what PS2 games you could play changed between PS2 models.


PS3 had actual PS2 hardware in it for... one or two revisions. Then support was dropped. Something to do about it being too hard to emulate? Being on the same general architecture (x86) for more than one generation is a very big deal for consoles.
The vast majority could be played, hence the wiki article lists incompatibilities rather than compatible titles.



That's neither here nor there, though; the point is that it's not AMD hardware that suddenly gives them backwards compatibility. It's due to, as you said, being on an x86-compatible platform.





Quote: Originally Posted by KyadCK View Post
I don't recall saying anything was good or bad. I asked for the magical X79 8-core HEDT chip. Do you have one?
1) / means or.
2) Xeons could be considered prosumer back then, as they were more within reach of the average consumer than they are now.



Example: https://www.overclock.net/forum/5-in...you-cared.html

post #163 of 170 (permalink) Old 09-11-2019, 02:30 PM
WannaBeOCer
Quote: Originally Posted by KyadCK View Post
Last I checked, this is Overclock.net.

Current hardware ray tracing behind a black box is awful. You can't see the pretty reflections when there aren't enough pixels on the screen to give them any detail. Current software ray tracing that is not limited to one hardware vendor and does not require useless dedicated hardware is better. Not that your argument did anything to counter mine at all, it's just useless fluff.

Iris Pro 6200 is Broadwell, and from 2015. Good job buddy.

Some* PS1 games could be played. That list changed significantly between PS2 models. Heck, what PS2 games you could play changed between PS2 models.

PS3 had actual PS2 hardware in it for... one or two revisions. Then support was dropped. Something to do with it being too hard to emulate? Being on the same general architecture (x86) for more than one generation is a very big deal for consoles.

I don't recall saying anything was good or bad. I asked for the magical X79 8-core HEDT chip. Do you have one?
Which software ray tracing are you referring to? Last I checked, every ray tracing implementation was GPU-accelerated. By your description, DXR counts as software ray tracing, since it "is not limited to one hardware vendor and does not require useless dedicated hardware." I'm sure AMD's Radeon Rays would benefit from Nvidia's RT cores. I'm going to suggest they add support; it's a pretty sweet SDK and already runs well on Nvidia's Turing.
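For anyone unclear on what vendor-neutral "software" ray tracing means in this argument, here's a minimal sketch (illustrative code only, not any shipping implementation): at its core, tracing a ray is just geometry math that runs on any vendor's hardware; dedicated RT hardware only makes the same intersection tests faster.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance to the nearest ray-sphere intersection, or None.

    Solves |o + t*d - c|^2 = r^2, a quadratic in t. This is the innermost
    primitive test every ray tracer (hardware- or software-based) performs.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None  # only hits in front of the ray count

# A ray fired down +z from the origin hits a unit sphere centered at (0, 0, 5).
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # → 4.0
```

A real renderer fires millions of these per frame against a BVH of triangles, which is exactly the loop RT cores accelerate.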

post #164 of 170 (permalink) Old 09-11-2019, 03:26 PM
The Robot
Quote: Originally Posted by Alex132 View Post
The vast majority could be played
Fact is, Sony never had true backwards compatibility prior to x86. Heck, there are some PS2 games that are bugged on some PS2 revisions (there were 19 of them!). The only ones who did were Nintendo with the Game Boy and DS: each new generation could play any game from the older one. Also, the Wii U can play any GC game 100% natively, even though it's not officially supported.
Essentially, AMD gave Sony the first 100% backward-compatible platform that can increase its power over the generations and keep all the titles playable. As for the Iris 5200, it could never match the PS4 GPU, which is roughly equivalent to a GTX 470, while the Iris only matches a GTS 450. It also lacks any form of async compute. I'm not even talking about its cost; it would never have been viable for a $400 console that Sony and MS wanted to sell at a profit.
AMD on the PS4: We gave it the hardware Nvidia couldn't

Last edited by The Robot; 09-11-2019 at 03:40 PM.
post #165 of 170 (permalink) Old 09-11-2019, 03:57 PM
cssorkinman
Quote: Originally Posted by ToTheSun! View Post
Then you'd call every single card on the market right now disappointing.
Actually, I'm quite impressed with the ~$125 PowerColor Red Devil RX 570.
post #166 of 170 (permalink) Old 09-15-2019, 01:38 PM
CynicalUnicorn
Quote: Originally Posted by Redwoodz View Post
And yet what is Intel's latest tech efforts focusing on? MOAR CORES!!!! Bulldozer deserves credit. It was a GAME CHANGER in the industry.
Agreed, AMD's server market share dropped even lower as a direct result.

post #167 of 170 (permalink) Old 09-24-2019, 07:59 PM
dlee7283
There was a time when people here were seriously pushing the dual-core Pentium G3258 over the 8320 because of how high it overclocked and its IPC, but I feel like Bulldozer probably outdoes it now in modern gaming, even with its flaws.

It was just too far ahead of its time in 2011, when the market was going through more of a stagnation cycle and wanted to stick with what worked and was familiar, which Intel already provided and which was good enough for years to come with Sandy Bridge.

post #168 of 170 (permalink) Old 09-24-2019, 09:35 PM
tpi2007
Quote: Originally Posted by dlee7283 View Post
There was a time when people here were seriously pushing the dual-core Pentium G3258 over the 8320 because of how high it overclocked and its IPC, but I feel like Bulldozer probably outdoes it now in modern gaming, even with its flaws.

It was just too far ahead of its time in 2011, when the market was going through more of a stagnation cycle and wanted to stick with what worked and was familiar, which Intel already provided and which was good enough for years to come with Sandy Bridge.

Yeah, dual-core Pentiums stopped being usable for AAA gaming circa 2016/2017, but the price bracket compared to an FX-8320 is not the same. Anyway, Bulldozer was in no way "too ahead of its time in 2011". You got what you paid for: a lot of wimpy cores, good for multithreading but not so good for single-threaded work. On Intel's side, you could just spend more and get a 2600K or 3770K during the relevant 2011/2012 period and call it a day, with both CPUs standing the test of time better than Bulldozer/Piledriver and with a much more balanced performance profile across a wide range of applications over all these years.
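The "lots of wimpy cores vs. fewer fast cores" trade-off can be sketched with Amdahl's law. The 0.6x per-core figure below is purely illustrative, not a measured FX-vs-i7 number; the point is only that the winner flips with how parallel the workload is.

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: speedup over one core when only a fraction of the work scales."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Hypothetical numbers: give each "wimpy" core 0.6x the per-core throughput
# of a "fast" core, then compare 8 wimpy cores against 4 fast ones.
for p in (0.5, 0.99):
    wimpy = 0.6 * amdahl_speedup(p, 8)
    fast = 1.0 * amdahl_speedup(p, 4)
    print(f"parallel={p:.0%}  8 wimpy cores: {wimpy:.2f}x  4 fast cores: {fast:.2f}x")
```

With these assumed ratios, the fast quad wins at 50% parallelism while the wimpy eight-core pulls ahead near 99%, which is roughly the dynamic people saw between Bulldozer and Sandy Bridge depending on workload.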


post #169 of 170 (permalink) Old 09-25-2019, 08:10 AM
cssorkinman
Quote: Originally Posted by tpi2007 View Post
Yeah, dual-core Pentiums stopped being usable for AAA gaming circa 2016/2017, but the price bracket compared to an FX-8320 is not the same. Anyway, Bulldozer was in no way "too ahead of its time in 2011". You got what you paid for: a lot of wimpy cores, good for multithreading but not so good for single-threaded work. On Intel's side, you could just spend more and get a 2600K or 3770K during the relevant 2011/2012 period and call it a day, with both CPUs standing the test of time better than Bulldozer/Piledriver and with a much more balanced performance profile across a wide range of applications over all these years.
Dual cores have been pretty much unbearable for everyday use since around the release of Windows 7, for me.

There was a time when an 8320 would absolutely hand my DDR3-equipped Intel i7s, running as quads, their ass in BF1 multiplayer at 1080p; the minimum and average fps only merged when I underclocked the Vishera to about 2.4 GHz. The quads weren't aging well by comparison.

The game now seems to have been updated to make it easier on the quads, which was advantageous to both the publisher and Intel. That has been a common theme in the Bulldozer story (Firestrike gimped it, early versions of the CPU-Z bench, etc., and now it's playing out with Ryzen and UserBenchmark); such is the economic pressure on software companies to bow to Intel. "It's good to be the king."

I've always touted the Vishera 8-cores I have as being much quicker on the desktop than my i7s. After trying to figure out why, it appears much of it is an unintended result of software being written to take full advantage of the processing power Intel chips offered, loading the CPU fully much more often than the AMDs. Watching CPU usage, about the only time a Vishera gets pushed to 95% usage or above is during stress tests or load screens in games/apps; it's incredibly rare for them not to have plenty of resources available to start or switch apps on the desktop. I've even noticed this when running my X6 Thubans: they don't seem to "stall" for a few moments when opening or changing apps nearly as often as my 2nd-, 3rd- and 4th-gen i7s. Worth noting that I tend to run both overclocked with power-saving features disabled.

My son's i7 Omen laptop with the 7th-gen CPU, SSD and DDR4 is the first Intel rig I've owned that feels as nimble.
post #170 of 170 (permalink) Old 09-26-2019, 09:58 AM
Liranan
Quote: Originally Posted by cssorkinman View Post
Dual cores have been pretty much unbearable for everyday use since around the release of Windows 7, for me.

There was a time when an 8320 would absolutely hand my DDR3-equipped Intel i7s, running as quads, their ass in BF1 multiplayer at 1080p; the minimum and average fps only merged when I underclocked the Vishera to about 2.4 GHz. The quads weren't aging well by comparison.

The game now seems to have been updated to make it easier on the quads, which was advantageous to both the publisher and Intel. That has been a common theme in the Bulldozer story (Firestrike gimped it, early versions of the CPU-Z bench, etc., and now it's playing out with Ryzen and UserBenchmark); such is the economic pressure on software companies to bow to Intel. "It's good to be the king."

I've always touted the Vishera 8-cores I have as being much quicker on the desktop than my i7s. After trying to figure out why, it appears much of it is an unintended result of software being written to take full advantage of the processing power Intel chips offered, loading the CPU fully much more often than the AMDs. Watching CPU usage, about the only time a Vishera gets pushed to 95% usage or above is during stress tests or load screens in games/apps; it's incredibly rare for them not to have plenty of resources available to start or switch apps on the desktop. I've even noticed this when running my X6 Thubans: they don't seem to "stall" for a few moments when opening or changing apps nearly as often as my 2nd-, 3rd- and 4th-gen i7s. Worth noting that I tend to run both overclocked with power-saving features disabled.

My son's i7 Omen laptop with the 7th-gen CPU, SSD and DDR4 is the first Intel rig I've owned that feels as nimble.
While the 2500K was faster in games than the 83xx chips, it lagged behind in multitasking and in what I used my PC for: rendering. So they each had their strengths.
