The AMD rig uses an RTX 3080 while the Intel rig has a 5070 Ti?
I know it's a lot of extra work, but I think it would be better if both had the same GFX card when testing at 1080p (y)
I plan to. I know I have to redo the benchmark, but I wanted an early indicator as I'm impatient. I'm using Windows LTSC 2019 on the AMD system, which cannot run anything beyond basic DX12, as the newer DX12 features are a Windows 21H2 thing. Windows 2019 was just for suicide latency runs.
 
Update on refresh:

It makes sense... Intel is struggling right now with a deep restructuring, following error after error from management and the silicon business (it started with the 10nm drama, though they still had the architecture back then).

A refresh of a failed generation (deserved or not) on a dead socket would benefit... who? Just do a small refresh to keep marketing happy and call it a day. Focus on the next big thing.

They have the OEM market, since AMD doesn't have the volumes Intel still has, so they will sell anyway in such a high-demand world starved of CPUs...

The AMD rig uses an RTX 3080 while the Intel rig has a 5070 Ti?
I know it's a lot of extra work, but I think it would be better if both had the same GFX card when testing at 1080p (y)
It completely falsifies the results... I mean, why not compare two different games at that point? I read a lot of criticism of YouTubers, but if the answer is such methodology... jeez.

I see some streamers hitting high refresh rates in PvP FPS games on X3Ds... either they're lying or they're happy with them. Then there is always that engine or that driver set that could be broken for a particular configuration, but in the end the real values came out, and AMD is ahead nowadays; no reptilian conspiracy in sight.
 
I plan to. I know I have to redo the benchmark, but I wanted an early indicator as I'm impatient. I'm using Windows LTSC 2019 on the AMD system, which cannot run anything beyond basic DX12, as the newer DX12 features are a Windows 21H2 thing. Windows 2019 was just for suicide latency runs.
"Unequal comparisons" should only be done by AMD users
You should have learned your place by now 🤭

On another note:

Another example of someone who has the ability to collect interesting data, but whose poor testing methodology yields uninteresting or borderline false results.
At least the TL;DR is what we always knew: as long as something fits inside the cache everything seems fine; the moment you start going outside it, things get pretty messy.
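On the cache point, here's a quick sketch of how anyone can see that cliff for themselves. This is just a generic Python/numpy illustration with sizes I picked arbitrarily, not the methodology from that test, and the absolute nanosecond figures will be inflated by interpreter overhead; what matters is the jump once the working set outgrows the cache.

import time
import numpy as np

def ns_per_access(working_set_bytes, reads=2_000_000):
    n = working_set_bytes // 8                     # 8-byte elements
    data = np.arange(n, dtype=np.int64)
    idx = np.random.randint(0, n, size=reads)      # random access pattern
    t0 = time.perf_counter()
    data[idx].sum()                                # the gather forces the reads
    return (time.perf_counter() - t0) / reads * 1e9

for mb in (1, 8, 32, 64, 128, 256, 512):
    print(f"{mb:4d} MB working set: {ns_per_access(mb * 2**20):6.1f} ns/read")

On a 9800X3D the step up should show somewhere past the ~96 MB of L3, on a 285K past ~36 MB; once you're out of cache, you're at the mercy of the memory subsystem.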
 
It completely falsifies the results... I mean, why not compare two different games at that point? I read a lot of criticism of YouTubers, but if the answer is such methodology... jeez.

I see some streamers hitting high refresh rates in PvP FPS games on X3Ds... either they're lying or they're happy with them. Then there is always that engine or that driver set that could be broken for a particular configuration, but in the end the real values came out, and AMD is ahead nowadays; no reptilian conspiracy in sight.
Not so much. Results will most likely be the same. Nvidia's DX11 threading hasn't changed from Ampere to Blackwell. The OSes are different as well, so I'm amazed you aren't pointing that out either. Both use Nvidia's previous driver, ReBAR off. Windows 10 LTSC 2019 is also the fastest OS for DX11. Texture loads are minimal, and if anything they are larger on the 285K system, as I ran 1080p while the AMD system had 720p. I'm aware texture compression varies between architectures, but hopefully the posted data aren't affected.

I encourage anyone who has WH3 to run the Mirrors of Madness benchmark to help add more data.

My personal critique is that the Intel result could be even better: I only ran CL30-44-38-28 at 8000 MT/s or so, while the AMD 9800X3D was at -30 PBO, 5425 MHz, 8400 CL32-47-40-XX, GDM off, sync.

Anyone is free to critique; feel free to even beat the performance.

I already know the outcome of 285K vs 9800X3D. ;)
 
Not so much. Results will most likely be the same. Nvidia's DX11 threading hasn't changed from Ampere to Blackwell. The OSes are different as well, so I'm amazed you aren't pointing that out either. Both use Nvidia's previous driver, ReBAR off. Windows 10 LTSC 2019 is also the fastest OS for DX11. Texture loads are minimal, and if anything they are larger on the 285K system, as I ran 1080p while the AMD system had 720p. I'm aware texture compression varies between architectures, but hopefully the posted data aren't affected.
"The entire methodology is a mess" is what I thought would sum everything up, but I didn't want to seem rude lol.

Really, if criticism of famous YouTubers/channels starts from tests like these, well... Don't get me wrong, I love user forums and testing driven by passion... but come on... apples to oranges is always apples to oranges, no matter whether it comes from Intel-, AMD-, or Cyrix-dedicated threads, as someone is implying. Fanboys are fanboys, and that's fine too, but at least show some honesty to each other. Cheers.
 
"The entire methodology is a mess" is what I thought would sum everything up, but I didn't want to seem rude lol.

Really, if criticism of famous YouTubers/channels starts from tests like these, well... Don't get me wrong, I love user forums and testing driven by passion... but come on... apples to oranges is always apples to oranges, no matter whether it comes from Intel-, AMD-, or Cyrix-dedicated threads, as someone is implying. Fanboys are fanboys, and that's fine too, but at least show some honesty to each other. Cheers.
Dude, relax, it's just a preliminary test. The 285K vs 9800X3D run will be done on 23H2 Spectre for Intel and 22H2 LTSC for AMD. AMD's performance might go up or down. HAGS will be a factor.

You are kidding yourself if you think AMD's lows will improve magically. That low point is AMD's prime bottleneck, whereas Intel's is just latency everywhere.
 
Dude, relax, it's just a preliminary test. The 285K vs 9800X3D run will be done on 23H2 Spectre for Intel and 22H2 LTSC for AMD. AMD's performance might go up or down. HAGS will be a factor.

You are kidding yourself if you think AMD's lows will improve magically. That low point is AMD's prime bottleneck, whereas Intel's is just latency everywhere.
Stop this at once!!!!
 
A refresh of a failed generation (deserved or not) on a dead socket would benefit... who? Just do a small refresh to keep marketing happy and call it a day. Focus on the next big thing.

They have the OEM market, since AMD doesn't have the volumes Intel still has, so they will sell anyway in such a high-demand world starved of CPUs...
The "next big thing" will still be low-volume, high-cost chips rushed out the door. Arrow Lake's only real problem is price. If $100 USD chips don't exist, it will be a repeat of ARL.

OEM PC makers rejected Arrow Lake LGA due to price, so they will reject the refresh. OEMs now have nothing to launch until 1H2027 if they are lucky. OEMs want Meteor Lake-PS, Lunar Lake-PS, and Panther Lake-PS, and they aren't getting them. My guess is they dump LGA or exit the PC market.
 
Dude, relax, it's just a preliminary test. The 285K vs 9800X3D run will be done on 23H2 Spectre for Intel and 22H2 LTSC for AMD. AMD's performance might go up or down. HAGS will be a factor.

You are kidding yourself if you think AMD's lows will improve magically. That low point is AMD's prime bottleneck, whereas Intel's is just latency everywhere.
I'm more than relaxed, I'm 41 years old... I'm not getting mad over a post on an enthusiast PC forum.

I don't even mind if AMD's lows improve; I was just pointing out a flawed testing methodology...

The "next big thing" will still be low-volume, high-cost chips rushed out the door. Arrow Lake's only real problem is price. If $100 USD chips don't exist, it will be a repeat of ARL.

OEM PC makers rejected Arrow Lake LGA due to price, so they will reject the refresh. OEMs now have nothing to launch until 1H2027 if they are lucky. OEMs want Meteor Lake-PS, Lunar Lake-PS, and Panther Lake-PS, and they aren't getting them. My guess is they dump LGA or exit the PC market.
I work for a foreign company and I'm based in Italy... We have Dell as our client provider and there is no AMD option in the whole portfolio... Even our HP server provider offers me only Intel-based platforms.

This has been the case for the last 10 years.

Volume can make things cheaper, and as I said, Intel is well ahead of AMD on the volume side, and there is a market for both, I would say.
 
I’m more than relaxed, i’m 41ys … i’m not getting mad for a post on a enthusiast pc forum

I don’t even mind if amds lows will increase, i was just pointing out a flawed testing methodology…
You tried to point out irony as if it had any use. It's less flawed than the source that prompted me to run the benchmark in the first place. Some of Blackbird's numbers are very misleading.

At best, maybe 15 fps more with PCIe Gen 5 for averages and highs, but the lows will remain (making dips feel worse). I'm suspicious of even that gain being possible, as the benchmark was made for Intel Alder Lake/Raptor Lake. But it's just simple math for the Y frame rate you can produce with X PCIe bandwidth.
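To make the "simple math" concrete, here's a rough back-of-the-envelope sketch. The usable-bandwidth figures and the 50 MB-per-frame traffic number are my own illustrative assumptions, not measurements from this benchmark.

GEN4_X16_BYTES_PER_S = 32e9   # ~32 GB/s usable on PCIe 4.0 x16
GEN5_X16_BYTES_PER_S = 64e9   # ~64 GB/s usable on PCIe 5.0 x16

def fps_ceiling(bytes_per_frame, bus_bytes_per_s):
    # If every frame has to move this much data across the bus,
    # the bus alone caps the frame rate at bandwidth / bytes per frame.
    return bus_bytes_per_s / bytes_per_frame

per_frame = 50 * 2**20        # hypothetical 50 MB of per-frame PCIe traffic
print(f"Gen4 ceiling: {fps_ceiling(per_frame, GEN4_X16_BYTES_PER_S):.0f} fps")
print(f"Gen5 ceiling: {fps_ceiling(per_frame, GEN5_X16_BYTES_PER_S):.0f} fps")

If the game only pushes a few MB per frame, both ceilings sit far above the actual frame rate, and the Gen 4 to Gen 5 jump buys essentially nothing; that's part of why I'm suspicious of even the 15 fps.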

This isn't even a good use of time, since Zen 6 goes up to 12 cores per CCD and would smoke the lows; I'm 6 months too late.
 

1.1% increase for the 4090 at 1080p (unknown settings) using Gen 4. 1.22% at 1440p, 1.82% at 4K.

That is a trend going UP with resolution. The 50% resolution scale of 720p that I ran on the 3080 is very likely gaining nothing from Gen 4 to Gen 5.

Better hope there are gains elsewhere.
 
I work for a foreign company and I'm based in Italy... We have Dell as our client provider and there is no AMD option in the whole portfolio... Even our HP server provider offers me only Intel-based platforms.

This has been the case for the last 10 years.

Volume can make things cheaper, and as I said, Intel is well ahead of AMD on the volume side, and there is a market for both, I would say.
The new 2025 "Dell Pro" line now features AMD alongside Intel models. The Intel models are Raptor Lake, but if you pay ridiculous prices, you can "upgrade" the CPU to get Arrow Lake. If you say AMD doesn't exist there, then neither does Arrow Lake. HP and Lenovo are the same.

OEMs said "no" to Arrow Lake because the price did not go down on volume orders.

"What we're really seeing is much greater demand from our customers for n-1 and n-2 products so that they can continue to deliver system price points that consumers are really demanding," explained Intel's Michelle Johnston Holthaus. "As we've all talked about, the macroeconomic concerns and tariffs have everybody kind of hedging their bets and what they need to have from an inventory perspective. And Raptor Lake is a great part. Meteor Lake and Lunar Lake are great as well, but come with a much higher cost structure, not only for us, but at the system ASP price points for our OEMs as well."
If Intel can't deliver cheap chips like they used to with Nova Lake, LGA1954 will be yet another one-and-done socket. OEMs will be pushing Raptor Lake until Titan Lake in 2028, which would be pure insanity.
 
Okay, wow. X3D actually does work well in WH3.

The "main" minimum is still better on Intel though, even though there's an Intel dip at the very start lol...

AMD:

View attachment 2721628

Intel:

View attachment 2721635
I encourage anyone who has WH3 to run the Mirrors of Madness benchmark to help add more data.

Anyone is free to critique; feel free to even beat the performance.

I already know the outcome of 285K vs 9800X3D. ;)
I actually bought this game to take a look at this made-up AMDip since it was only ~$20.
But yeah, I found a drop, a drop to 480 fps minimum.
View attachment 2721780


But seeing that Arrow Lake has the same drop-off at ~25 sec to 446 fps, I'm inclined to say that the "problem" doesn't lie with the platform we are using.
The game menus were acting really wonky in Win 24H2 and would sometimes stop responding or crash, so I think it's more likely this is simply a badly optimized game/benchmark and/or a conflict with the Nvidia drivers (I had to use my Win10 install to get any meaningful data).

Anyway, just my 2 cents; I would not recommend this game as a benchmark.
 
I actually bought this game to take a look at this so-called AMDip since it was only ~$20.
But yeah, I found a drop, a drop to 480 fps minimum.
View attachment 2721780

But seeing that Arrow Lake has the same drop-off at ~25 sec to 446 fps, I'm inclined to say that the "problem" doesn't lie with the platform we are using.
The game menus were acting really wonky in Win 24H2 and would sometimes stop responding or crash; I think it's more likely this is simply a badly optimized game/benchmark and/or a conflict with the Nvidia drivers (I had to use my Win10 install to get any meaningful data).

Anyway, just my 2 cents; I would not recommend this game as a benchmark.
I'm down to remove it from the list, and I think you'll find it's using more than 8 cores, hence your better performance, or it's making use of the dual-CCD bandwidth. It's one of the few games where the 9950X3D beats the 9800X3D in dual-CCD mode.
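If anyone wants to check the more-than-8-cores theory on their own system, a crude way is to log per-core load while the benchmark runs. This is a generic psutil sketch with an arbitrary 25% busy cutoff, not something from my actual runs.

import psutil

# Sample per-core load once a second for ~60 s while the benchmark runs,
# then report how many logical cores stayed meaningfully busy.
samples = []
for _ in range(60):
    samples.append(psutil.cpu_percent(interval=1.0, percpu=True))

avg_per_core = [sum(core) / len(samples) for core in zip(*samples)]
busy = sum(1 for load in avg_per_core if load > 25)
print(f"{busy} of {len(avg_per_core)} logical cores averaged >25% load")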

It was released in 2022, so it's fairly old and doesn't rely on any special graphics technologies. I only used it as an example because it was the first game benchmark used by Blackbird with suspicious results.

Your fps is very, very high, but the average is spread even further from the lows. Were you running very high CPU clocks?

Now, I want to make it super clear that I do not care which platform is better. It simply doesn't matter to me at all. I'm just interested in optimizing and exploring the echoes of thoughts that I tend to find interesting. Not everyone can be right and not everyone can be wrong, so there has to be something more worth discovering.
 
COD Warzone, Space Marine 2, Cyberpunk 2077, Stellar Blade, Horizon Zero Dawn and Forbidden West, FF16, Clair Obscur: Expedition 33, AC titles and many more, even CS2
Warzone, yes (free)
SM2, don't have it
CP2077, yes
Stellar Blade, don't have it
Horizon Zero Dawn Remastered, yes
FF16, don't have it
Clair Obscur: Expedition 33, I would buy it in a heartbeat, but is it even easy to bench?
I have some of the older AC titles, but I hate the Ubi launcher
CS2 I have, but c'mon, X3D wins that >_> 350+ P1
 
Warzone, yes (free)
SM2, don't have it
CP2077, yes
Stellar Blade, don't have it
Horizon Zero Dawn Remastered, yes
FF16, don't have it
Clair Obscur: Expedition 33, I would buy it in a heartbeat, but is it even easy to bench?
I have some of the older AC titles, but I hate the Ubi launcher
CS2 I have, but c'mon, X3D wins that >_> 350+ P1
The latest game, Wuchang: Fallen Feathers, is UE5. Steam reviews seem quite negative with regard to performance, and pretty much everyone who lists their system has an AMD chip. I don't have the game, but in the few reviews I saw, no one mentioned performance. Bad UE5 optimization, or another AMD dip issue?
 
The latest game, Wuchang: Fallen Feathers, is UE5. Steam reviews seem quite negative with regard to performance, and pretty much everyone who lists their system has an AMD chip. I don't have the game, but in the few reviews I saw, no one mentioned performance. Bad UE5 optimization, or another AMD dip issue?
I think there are different kinds of dips; the real, original AMDip is extremely difficult to measure and would take too much time and effort to understand.

The purpose of the bench-off is to get a rough proxy of AMD's lows relative to averages, as well as the top-5% fps relative to averages, but the main point is to prove that the 285K (and 265K) is WAY better than YouTubers lead one to believe. Which system is better will always depend on needs, but overall I prefer the 285K. If I wasn't so lazy, and the ASRock not so full of bugs, I'd put in the effort for an 8800 profile.
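For anyone who wants to compute the same rough proxy from their own runs, here's a minimal sketch. The formulas and the example numbers are mine for illustration, assuming you have a per-frame frametime log in milliseconds (e.g. exported from CapFrameX or PresentMon).

import numpy as np

def bench_summary(frametimes_ms):
    fps = 1000.0 / np.asarray(frametimes_ms, dtype=float)
    avg = fps.mean()
    p1_low = np.percentile(fps, 1)                    # "1% low" fps
    top5 = fps[fps >= np.percentile(fps, 95)].mean()  # mean of the best 5% of frames
    return {
        "avg_fps": round(float(avg), 1),
        "p1_low_fps": round(float(p1_low), 1),
        "low_vs_avg": round(float(p1_low / avg), 3),  # how far the lows sag below the average
        "top5_vs_avg": round(float(top5 / avg), 3),   # how far the highs stretch above it
    }

# Example with a made-up frametime capture:
print(bench_summary([4.2, 4.0, 4.1, 9.5, 4.3, 4.0, 12.0, 4.1]))

Lower low_vs_avg means the lows sag further below the average; higher top5_vs_avg means the peaks are more stretched above it.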
 