
The first Win7 scheduler benches are IN + x264HD + WinRAR + 7-Zip + file link + in-game Batman + Crysis 1

#1 ·
Link (I use the 64-bit version): http://www.mediafire.com/?y3ba4xcs50w6ki7

BEFORE/AFTER results for the Windows scheduler patch.

My rig is in the sig. The only thing not listed there is my CPU clock: 4.6 GHz (multiplier only). I know there are more efficient ways of doing it, but what matters here is the difference before and after the "magic patch".

To do this, I uninstalled the patch, then ran Cinebench, Batman: Arkham City and Saints Row: The Third. After that, I reinstalled the patch and, without any other changes, ran the same tests in the same order.

If you guys need more info, just ask. Glad to help. I have no real experience with this kind of testing, so feel free to point out anything I did wrong.

PS: Please, can we NOT turn this into BD bashing? I'm so tired of that. This is more of an info thread for those interested in the gains. I know I'm likely wasting my time… but…

CINEBENCH:
Very little gain… it's there, though; I ran it more than once. The CPU is now recognized as 4c/8t.

cinebeforeafter.jpg

CINEBENCH with 1 thread:
cine1corebeforeafter.jpg

The one showing 4c/8t is the after. A gain of 0.01… wow… world-shattering. :P ahahah.

CINEBENCH 6 threads:

cine6cores.jpg
CINEBENCH 4 threads:

cine4cores.jpg

BATMAN ARKHAM CITY
… no visible gain. I did get +3 fps on minimum frames, comparing 2 passes of each configuration… if you consider that a palpable gain.
It felt like it ran better, but that COULD be wishful thinking. Since I don't have numbers to back it up, I won't make a case here.
batman_BEFOREAFTER.jpg

BEFORE:
batmanac_2011_12_15_15_19_26_424.jpg

AFTER:
batmanac_2011_12_15_15_43_54_336.jpg

NEW STUFF: In-game Batman: Arkham City with hottie Catwoman.
Well, I felt perhaps I'd see better performance in-game, instead of just running the benchmark feature. So I loaded the game and, without moving the character from her spot, rotated the camera until I got the lowest FPS. Again, the change is very small... it's there, though. I ran it twice, let the screen rest, moved here and there... these are the best results.

CONFIG:
batmanconfigs_IN.jpg

BEFORE:
batmanbefore_IN.jpg

AFTER:
batmanafter_IN.jpg

Again, I kept track of the fps fluctuation and all, and I can say, without fear of being wrong, that there is a difference. It is, however, still minimal. Do NOT expect miracles, or enough of a change to take a game from unplayable to playable.

SAINTS ROW the 3rd:
Wow… there is a BIG difference here. The game plays much better after the patch. It's more fluid, no stuttering…

To take this picture, I ran around and found a spot where my fps suffered most… somewhere easy to mark. I then waited 5 minutes or so, taking pics whenever my fps was at its lowest. Really, there's nothing more to say here than… GREAT!
Even being aware there's a problem with AMD drivers and low FPS, it's perfectly playable now (it wasn't before, at least for me). Here I saw the magic happening :D


EDIT: the low graphics card usage is more of a CrossFire thing atm, I think. From the Steam forums:
But there is still hope, since Volition posted a new thread on the official SR3 PC forums stating:

"Our team is well aware of the issues some users are seeing with regard to performance when meeting our minimum / recommended PC specs. Since the official release date of Saints Row: The Third, we've been working on optimizations to the game, which included a small patch on November 24th. We've also been in talks with AMD and Nvidia to look for ways to increase performance across the board both before, and after the game has been released, and we hope to have either a driver update, a game patch, or a combination of both in the near future to correct these issues. "

Hopefully it will be fixed eventually.

saintsrowthethird_dx11_2011_12_15_15_37_15_939.jpg

BEFORE:
sr3_BEFORE.jpg

AFTER:
saintsrowthethird_dx11_2011_12_15_15_36_59_649.jpg

EDIT:
SERIOUS SAM 3:
Also solid improvements. Settings: everything on ultra, 1080p. I found the spot where my fps took the greatest hit, right at the beginning of the game. Saved… then loaded and took pics without moving. The pics were taken over a period of 2-3 minutes; whenever the fps hit a new low, I'd screenshot it. As with Saints Row, the game now uses my xfire config better, allowing for greater fps:
BEFORE:
sam3before.jpg

AFTER:
sam3_AFTER.jpg

CRYSIS 1: The king of benches, I suppose... or it used to be. No matter, let's see how our patch deals with this one:

crysis_beforeafter.jpg

A palpable difference. So far, all games show some gain, no matter how small. Good news, huh? =D

EDIT: 12/16/2011

WINRAR: Here, folks... I have to break the news... worse results with the patch. I ran this test twice, but I don't know if I did it correctly; I made the method up... so please have patience with the noob.

I ran WinRAR on a 1.18 GB x264 .mp4 file. I let the compression reach exactly 80%, paused, and took a pic. These are my best results over 2 passes:

rarbeforeafter.jpg

As you can see, it's clearly better without the patch.

So there it is, the 1st official bad news :D
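For anyone who wants a more repeatable version of this compression test than pausing at 80% and eyeballing the window, timing a full command-line run works. A rough sketch, assuming Python is installed, WinRAR sits at its default path, and `C:\bench\video.mp4` stands in for the test file (all placeholders, not what was actually used here):

```python
# Time two full WinRAR command-line compression passes instead of
# pausing at 80%. Paths and file names below are assumptions.
import subprocess
import time
from pathlib import Path

RAR = r"C:\Program Files\WinRAR\rar.exe"   # assumed install path
SRC = r"C:\bench\video.mp4"                # placeholder for the 1.18 GB test file
OUT = Path(r"C:\bench\out.rar")

for run in range(2):                       # 2 passes, matching the test above
    OUT.unlink(missing_ok=True)            # start each pass from scratch
    start = time.perf_counter()
    subprocess.run([RAR, "a", "-m3", str(OUT), SRC], check=True)
    print(f"pass {run + 1}: {time.perf_counter() - start:.1f} s")
```

Run it once without the patch and once with it, and the elapsed times compare directly.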


7ZIP: as requested, here's the benchmark. Same file used as with WinRAR, 2 passes each, paused at 85%. No difference at all. I'm keeping 7-Zip to use with my patch, thank you very much =D

7zipbench.jpg

Looks like MS pulled the plug on the patch. I'll keep it, since gaming is where I need the performance most. I guess it's everyone's own call.

http://www.brightsideofnews.com/news/2011/12/16/microsoft-releases-amd-bulldozer-patch-by-mistake2c-incomplete-download.aspx (Thanks radaja for the link)

x264 HD benches: for these, I used a fresh boot each time.

x264 HD BENCHMARK 3.0 RESULTS WITHOUT PATCH

Please do not compare it with older versions of the benchmark!
Please copy/paste everything below the line to to report your data

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Results for x264.exe r1342

encoded 1442 frames, 88.17 fps, 3899.02 kb/s
encoded 1442 frames, 89.51 fps, 3899.02 kb/s
encoded 1442 frames, 89.01 fps, 3899.02 kb/s
encoded 1442 frames, 87.98 fps, 3899.02 kb/s
encoded 1442 frames, 44.70 fps, 3966.07 kb/s
encoded 1442 frames, 44.41 fps, 3947.72 kb/s
encoded 1442 frames, 44.64 fps, 3959.38 kb/s
encoded 1442 frames, 44.81 fps, 3952.28 kb/s

x264 HD BENCHMARK 3.0 RESULTS WITH PATCH
Please do not compare it with older versions of the benchmark!
Please copy/paste everything below the line to to report your data

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Results for x264.exe r1342

encoded 1442 frames, 91.44 fps, 3899.02 kb/s
encoded 1442 frames, 91.67 fps, 3899.02 kb/s
encoded 1442 frames, 90.46 fps, 3899.02 kb/s
encoded 1442 frames, 90.63 fps, 3899.02 kb/s
encoded 1442 frames, 43.90 fps, 3961.70 kb/s
encoded 1442 frames, 44.56 fps, 3960.99 kb/s
encoded 1442 frames, 44.59 fps, 3961.46 kb/s
encoded 1442 frames, 44.60 fps, 3958.33 kb/s

Not sure what to make of this: averaging the runs, the first (lighter) pass goes from roughly 88.7 to 91.1 fps with the patch, about a 2.7% gain, while the second pass is essentially flat (roughly 44.6 vs 44.4 fps). This bench was run at a user's request.

CONCLUSION (edited):
Great for gaming. It's a free patch, and the results range from barely noticeable to solid gains. For those of you who have a BD, it's a bloody MUST.

EDIT: It's a tradeoff. Some things turned out slower, like WinRAR.

For those of you who don't have a BD… well, it's a cool patch, but it shouldn't have enough of an impact to sway a decision on getting one or not, I believe.

Pax,
Alex.
 
#4 ·
awesome.. thanks for the info. Just installed the hotfix and rebooted. Won't be able to test games till I get home, but here's hoping for software that makes purchasing the BD worth it.. And here I was debating spending Christmas money on an Intel rig (I've been running AMD for the past 12 years solid).
 
#8 ·
Run a multithread benchmark with 4-6 threads and monitor core affinity....
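For example, something along these lines would do (just a sketch, assuming Python is installed; any multithreaded benchmark works the same way):

```python
# Minimal load generator: spawn N busy workers (processes, so Python's
# GIL doesn't serialize them) and watch in Task Manager / Resource
# Monitor which logical cores the Windows scheduler places them on.
import multiprocessing
import sys
import time

def spin(seconds: float) -> None:
    """Burn CPU until the deadline passes."""
    deadline = time.perf_counter() + seconds
    x = 0
    while time.perf_counter() < deadline:
        x += 1  # trivial busy work

if __name__ == "__main__":
    n = int(sys.argv[1]) if len(sys.argv) > 1 else 4  # try 4, 5 and 6 workers
    workers = [multiprocessing.Process(target=spin, args=(30.0,)) for _ in range(n)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```

With the patch, the scheduler should spread the workers one per module before doubling up.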
 
#9 ·
Quote:
Originally Posted by djriful View Post

Got a question: is it now recognized as 4 cores? Is the FX-81x0 really 4 cores with 8 threads?
The module architecture points that way, so yes, sort of.
 
#10 ·
Your GPU usage is 50% on each GPU in the "BEFORE" Saints Row screenshot and 56% in the "AFTER" screenshot. Does Saints Row have any known issues with CFX/SLI, is that game normally CPU-bottlenecked, or is it an issue with Bulldozer CPUs when running multiple GPUs?
 
#11 ·
Quote:
Originally Posted by Kand View Post

I like how your GPU usage is only 50% on both chips. CPU Bottlenecks ftw.
I'm curious.
What if you ran those games with Crossfire disabled?
I post no lies :D

It is what it is. No bashing, but no turning a blind eye either.

And I don't think I can disable CrossFire on the 6990. I believe only the GTX 590 has that option.
 
#15 ·
Quote:
Originally Posted by alexmaia_br View Post

teach me how and I'll do it gladly.
I am on an XP machine at work... but I believe you can check core affinity under Resource Monitor -> CPU. Just put a check next to your benchmark's .exe and it should give you thread-level information. You may have to enable the Core field.
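If anyone prefers a scriptable check over clicking through Resource Monitor, something like this works too (a sketch assuming the third-party psutil package; the process name is only a placeholder):

```python
# Scriptable version of the Resource Monitor check: print which logical
# CPUs a process is allowed to run on, plus a one-second per-core load
# sample. Requires psutil (pip install psutil).
import psutil

TARGET = "CB.exe"  # placeholder for the benchmark's executable name

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        print("allowed logical CPUs:", proc.cpu_affinity())

for cpu, load in enumerate(psutil.cpu_percent(interval=1, percpu=True)):
    print(f"core {cpu}: {load:.0f}%")
```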
 
#16 ·
How about GTA IV? That's a CPU-eatin' game. :D
If you have any modern games and the time, just go ahead and bench away; we'd appreciate it.
Quote:
Originally Posted by djriful View Post

Got a question: is it now recognized as 4 cores? Is the FX-81x0 really 4 cores with 8 threads?
That's part of the update. Whether the proc is actually 8-core is up to your definition. 8 communist cores.
 
#17 ·
Quote:
Originally Posted by DuckieHo View Post

I am on a XP machine at work... but I believe you can check core affinity under Resource Monitor -> CPU. Just put a check on your benchmark .exe and it should give you specific thread level information. You may have to enable the Core field.
Just a note: to reach Resource Monitor, you can open Task Manager (right-click the taskbar and select Task Manager); there's a button near the bottom that opens Resource Monitor, IIRC. I think that's probably the easiest way to get there.
 
#18 ·
Quote:
Originally Posted by Frosty88 View Post

Your GPU usage is 50% on each core for the "BEFORE" Saints Row screenshot and 56% for the "AFTER" screenshot. Does Saints Row have any known issues with CFX/SLI, is that game normally CPU bottlenecked, or is it an issue with Bulldozer CPUs when running multiple GPUs?
There's a problem with CrossFire, I think. A lot of users complaining on the Steam forums. It doesn't seem specific to the BD CPU.
 
#19 ·
Quote:
Originally Posted by djriful View Post

Then I wonder why AMD claims the first true 8 cores...
I highly doubt AMD made that statement.

Intel's Beckton (Nehalem-EX) already had eight cores, and it was released in March 2010.
 
#20 ·
Quote:
Originally Posted by BlackVenom View Post

How about GTA IV? That's a CPU-eatin' game. :D
If you have any modern games and the time, just go ahead and bench away; we'd appreciate it.
I can do a bunch more, but I have to finish grading my students' tests.
A bunch of law students on my neck because I missed the deadline for returning final exams due to extreme geekiness is no fun ;D
 
#22 ·
So 14 to 17 min, and 36.4 to 43.3...

That's about a 21% gain in the minimum and a 19% gain in the screenshot.

Nice gains, but Bulldozer was clearly so far behind that you're wasting at least $350 with that 6990.

I wouldn't get caught up in Windows calling it a 4c/8t chip; it just makes sense, since performance-wise, skipping every other core produces better results, just like it does with Intel's chips.
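If anyone wants to actually test the "skip every other core" idea instead of eyeballing it, affinity can be set on a running process. A sketch, assuming psutil and a placeholder PID (grab the real one from Task Manager):

```python
# Pin a running game/benchmark to one logical CPU per module (every
# other logical CPU), then compare fps against the default scheduling.
import psutil

PID = 1234  # placeholder process id
proc = psutil.Process(PID)
every_other = list(range(0, psutil.cpu_count(logical=True), 2))  # e.g. [0, 2, 4, 6]
proc.cpu_affinity(every_other)
print("now restricted to logical CPUs:", proc.cpu_affinity())
```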

HT provides about a 5% increase per core, while AMD's design gives the second core in a module about 70% of the performance of the first. It's not a 4c/8t chip.

AMD called it the first DESKTOP eight-core chip, and it is. AMD had 12-core server chips to match Intel's eight-core chips last gen.
 
#25 ·
Black Ops was a seriously CPU-bound game.. might have to install that and see the difference in fps there.
And then quickly uninstall it again.. lol
 
#26 ·
You have to wonder how AMD is going to react now that this update shows BD for what it is (I'm not bashing): their attempt at an HT-style CPU. Will this cause a backlash as many people notice/realize that they don't actually have an 8-core CPU (by the old/accepted definition of a core, IIRC)? I defended BD before it was released because I bought into the marketing, thinking AMD couldn't weather a fallout. But to be honest, I should have understood what it was. The thing I want to know is why they made pretty much the same mistakes Intel made with NetBurst (massive pipeline length, etc.).

Either way, I'm hoping this helps in some way for those with BD. Maybe the OP should investigate whether Core Parking will now also show up as an issue, since Windows seems to be treating BD as a Hyper-Threaded CPU.
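One way to check would be to dump the active power plan including hidden settings and look for the parking entries. A sketch, assuming Python is available (Resource Monitor also labels parked cores as "Parked" on its CPU graphs):

```python
# Rough core-parking check: powercfg /qh dumps the active power plan
# including hidden settings on Win7; print any lines mentioning parking.
# May need an elevated prompt if the dump comes back empty.
import subprocess

dump = subprocess.run(["powercfg", "/qh"], capture_output=True, text=True).stdout
for line in dump.splitlines():
    if "parking" in line.lower():
        print(line.strip())
```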