Overclock.net - An Overclocking Community (https://www.overclock.net/forum/)
-   Benchmarking Software and Discussion (https://www.overclock.net/forum/21-benchmarking-software-discussion/)
-   -   Excel Benchmark (https://www.overclock.net/forum/21-benchmarking-software-discussion/1642077-excel-benchmark.html)

ir88ed 11-15-2017 06:58 AM

Hey benchmarkers! For my job I do a lot of work with pretty huge Excel spreadsheets, and I have seen Ryzen videos that appear to show a performance edge for the 1700X over the 7700K (my current workstation CPU). I would like to tap the hive mind here to see if this holds up in a real-world experiment.

In my spreadsheets I am loath to convert my formulas into values, as I can't go back later to verify that the calculations were performed correctly. But this creates huge issues with sorting, as sorting with formulas can take a really long time. I have put together a modestly complex spreadsheet of random data and formulas. I would be curious to see how the Intel chips with faster cores (OC is fine) stack up against the AMD chips with more cores (again, OC is fine) doing a simple sort on one of the columns. Excel can take advantage of multiple cores, so it should be an interesting battle between the Intel and AMD chips.

If you are interested, here is a link to my spreadsheet on googledocs, which now includes a benchmark button and timer courtesy of huzzug:

https://drive.google.com/open?id=1tVTAdP6WczJvlEAJNGhZXX87UJXH6vAV

Original non-timer sheet is here:
https://drive.google.com/open?id=1UTgkRg4yfCcoSZDxhIUD1Q99e7-26Tqx

You will need to download the spreadsheet and run the benchmark in your local MS Excel (not in Google Sheets). Just see how long it takes your PC to complete the sort. Note that you can only run the benchmark once per worksheet. Once the sheet is sorted, the calc times drop to less than 30 seconds if you re-bench. Just close and reopen it if you want to do a second run. Please do no more than two runs unless you reboot. For some reason the sort times seem to drop significantly the third time the benchmark is run. Don't blame me, I didn't make Excel.

It would be helpful to have the following information:
CPU:
#Cores:
CPU clock speed:
Total system memory:
memory speed:
Excel version:
32/64 bit:
time to sort (in seconds):

Here is what I have compiled so far:

The Pook 11-15-2017 08:45 AM

Quote:
CPU: 6400 Skylake
#CPU cores: 4
CPU clock speed: 2.7GHz
Total system memory: 8GB
Excel ver: 2016
Time to sort (in seconds): 131 seconds

ir88ed 11-15-2017 09:02 AM

Nice! Any chance you will do it again with your 4.58GHz OC, or whatever will stay stable for the sort? I will make some graphs if I get enough data, but I am guessing your 6400 is pretty close to the 7700K in my workstation at the same frequency.

The Pook 11-15-2017 09:10 AM

I bought a 7700K, had to flash the BIOS off the modded one, killed the 7700K, and now I can't flash back. Stuck at stock. frown.gif

ir88ed 11-15-2017 09:14 AM

Quote:
Originally Posted by The Pook View Post

I bought a 7700K, had to flash the BIOS off the modded one, killed the 7700K, and now I can't flash back. Stuck at stock. frown.gif
That sucks.

mllrkllr88 11-15-2017 12:12 PM

Nice man!! This is interesting for sure biggrin.gif

It's difficult to tell, and I am guessing there is some error in the way you are timing it, but it appears that core count is affecting the bench result. It would be interesting to see if the 7700K produced the same result with 2/2 vs 4/8. Do you have the ability to turn off HT and reduce the core count on your 7700K computer? I might join in the fun when I get some free time; I can run an i9 7940X (14c) and see how core scaling holds up.

OK, it's a bit of a pipe dream, but it would be cool to see this worked into an actual benchmark program haha, with a start button and auto-timer function, though requiring MS Office to be installed is a huge drawback for benchers.

I can't wait to see more thumb.gif

bajer29 11-15-2017 12:25 PM

CPU: i5 6500
#CPU cores: 4
CPU clock speed: 3.6GHz
Total system memory: 8GB
Excel ver: 2016
Time to sort (in seconds): 154 seconds

daffy.duck 11-15-2017 01:09 PM

CPU: Ryzen R5 1600
#CPU cores: 6
CPU clock speed: 3.90GHz
Total system memory: 16GB
Excel ver: 2016
Time to sort (in seconds): 106 seconds

ir88ed 11-15-2017 01:25 PM

Quote:
Originally Posted by daffy.duck View Post

CPU: Ryzen R5 1600
#CPU cores: 6
CPU clock speed: 3.90GHz
Total system memory: 16GB
Excel ver: 2016
Time to sort (in seconds): 106 seconds

Here come the Ryzens. I suspected that core count was going to be a big factor and the R5 1600 does a good job of making that point. It is going toe to toe with a 7700K @ 4.5GHz and my 6 core 5930K @ 3.7GHz.

ir88ed 11-15-2017 07:40 PM

Quote:
Originally Posted by mllrkllr88 View Post

It's difficult to tell, and I am guessing there is some error in the way you are timing it, but it appears that core count is affecting the bench result.
Agreed; timing by hand is error-prone and a poor approach.

I have looked at some of the Visual Basic code in Excel spreadsheets that have process timers (like http://exceltrader.net/excel-benchmark/ ), but it would take me quite a while to figure out how to adapt it. I will do some looking to see if there isn't a clearer example somewhere.
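
In the meantime, the core of what I'm after is probably just a macro that wraps the sort in VBA's Timer function. A minimal sketch of the idea (the sheet name, data range, and key column here are placeholders, not the actual layout of my workbook):

Code:
Sub TimedSort()
    ' Sketch only: adjust "Data" and the key column to match the real workbook
    Dim rng As Range
    Set rng = Worksheets("Data").Range("A1").CurrentRegion

    Dim tStart As Single
    tStart = Timer                      ' seconds since midnight, ~10 ms resolution

    rng.Sort Key1:=rng.Columns(1), Order1:=xlAscending, Header:=xlYes
    Application.Calculate               ' make sure dependent formulas finish recalculating

    MsgBox "Sort + recalc took " & Format(Timer - tStart, "0.00") & " seconds"
End Sub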

spinFX 11-15-2017 08:03 PM

Quote:
Originally Posted by mllrkllr88 View Post

Nice man!! This is interesting for sure biggrin.gif

It's difficult to tell, and I am guessing there is some error in the way you are timing it, but it appears that core count is affecting the bench result. It would be interesting to see if the 7700K produced the same result with 2/2 vs 4/8. Do you have the ability to turn off HT and reduce the core count on your 7700K computer? I might join in the fun when I get some free time; I can run an i9 7940X (14c) and see how core scaling holds up.

OK, it's a bit of a pipe dream, but it would be cool to see this worked into an actual benchmark program haha, with a start button and auto-timer function, though requiring MS Office to be installed is a huge drawback for benchers.

I can't wait to see more thumb.gif

I think if you have an Excel spreadsheet that has performance issues you are using the wrong program, though. Most people would move to a proper database if they ran into issues with Excel, rather than upgrading all their systems.

I wonder if using the Apache POI libraries to interact with the Excel files would be a similar comparison to using Excel itself. You could probably set up a benchmark without MS Excel and try to emulate the way Excel would do things.

ir88ed 11-15-2017 10:47 PM

Quote:
Originally Posted by spinFX View Post

I think if you have an Excel spreadsheet that has performance issues you are using the wrong program, though. Most people would move to a proper database if they ran into issues with Excel, rather than upgrading all their systems.

I wonder if using the Apache POI libraries to interact with the Excel files would be a similar comparison to using Excel itself. You could probably set up a benchmark without MS Excel and try to emulate the way Excel would do things.

Part of me agrees that there could be a better solution than Excel, and I am pretty ignorant of the capabilities of an actual database. I get new datasets frequently; would I set up a database for each of them? That seems like a lot of infrastructure for one-off analyses. Excel lets me visually see the data structure, which is really helpful when building the formulas for calculations on complex data. I also do a lot of sorting/formatting/mining of the results, and need flexibility for analysis. For instance, with a database could I easily do new calculations to pull out all the genes that pass a pairwise t-test between two conditions and are negatively correlated with a given phenotype? That is a two-minute job in Excel.

huzzug 11-16-2017 02:20 AM

Alrighty, I bit and made a little update for the workbook. I basically automated the entire process so that no one has to manually sort or time the run on a stopwatch. Just click the "Benchmark" button. I'll want to make a few more updates to the sheet because I do not know whether it tests all areas of the CPU or only the cache (it doesn't look like it, but the data is still in the file).

Linky

moustang 11-16-2017 03:02 PM

CPU: 8700K
#CPU cores: 6
CPU clock speed: 5.1GHz
Total system memory: 16GB
Excel ver: 2010
Time to sort (in seconds): 91.19 seconds

ir88ed 11-16-2017 08:16 PM

Quote:
Originally Posted by moustang View Post

CPU: 8700K
#CPU cores: 6
CPU clock speed: 5.1GHz
Total system memory: 16GB
Excel ver: 2010
Time to sort (in seconds): 91.19 seconds

Added. A bit surprised that your system scored lower than my 5930K @ 4.8GHz. Nice OC, btw.
I reran the updated benchmark (with huzzug's timer), and got the same result as before, but with more significant figures this time smile.gif
I wonder if the difference is excel 2010 vs 2016? If we end up with a bunch of data, it will be interesting to see if this is a factor.

ir88ed 11-16-2017 08:17 PM

Quote:
Originally Posted by huzzug View Post

Alrighty, I bit and made a little update for the workbook. I basically automated the entire process so that no one has to manually sort or time the run on a stopwatch. Just click the "Benchmark" button. I'll want to make a few more updates to the sheet because I do not know whether it tests all areas of the CPU or only the cache (it doesn't look like it, but the data is still in the file).

Linky

Huge shout out to huzzug for putting in a timer into the spreadsheet! This is a huge improvement. Thanks huzzug!

huzzug 11-16-2017 10:17 PM

Can you change the OP file to the newer link in my post? I made a minor change to how the range is picked for sorting the data.
Edit: Disabled the ability to save the worksheet, as saving the sheet in the same order that the code sorts into gives incorrect results.

moustang 11-17-2017 06:06 AM

Quote:
Originally Posted by ir88ed View Post

Added. A bit surprised that your system scored lower than my 5930K @ 4.8GHz. Nice OC, btw.
I reran the updated benchmark (with huzzug's timer), and got the same result as before, but with more significant figures this time smile.gif
I wonder if the difference is excel 2010 vs 2016? If we end up with a bunch of data, it will be interesting to see if this is a factor.

It's a very memory intensive process. I suspect that the difference is down to memory architecture more than CPU speed. Your 4 channel mesh being faster than the 2 channel I have.

But Excel versions could have something to do with it as well. I guess it's time for me to break down and upgrade my Office software.

killerhz 11-17-2017 06:16 AM

CPU: 4790
#CPU cores: 4
CPU clock speed: 4.7GHz
Total system memory: 32GB
Excel ver: 2016
Time to sort (in seconds): 94.91 seconds (had to edit )

The Pook 11-17-2017 08:51 AM

Probably should include RAM speed in the results?

And no idea if I was supposed to submit my actual clocks or my turbo clocks, so I just submitted my base clocks. Just kind of confused why my i5 6400 is outperforming an i5 6500, unless it's down to RAM speed.

Panther Al 11-17-2017 10:26 AM

Gonna try this when I get home from the office. Kinda curious now myself, and wonder how much high core counts help (running a 6950X).

ir88ed 11-17-2017 11:13 AM

Quote:
Originally Posted by Panther Al View Post

Gonna try this when I get home from the office. Kinda curious now myself, and wonder how much high core counts help (running a 6950X).

Please do! Enterprise had mentioned he was going to give his 16-core 1950x threadripper a go, so the two of these will be a fascinating comparison (non-geeks will roll their eyes at this point).

ir88ed 11-17-2017 11:35 AM

Quote:
Originally Posted by The Pook View Post

Probably should include RAM speed in the results?

And no idea if I was supposed to submit my actual clocks or my turbo clocks, so I just submitted my base clocks. Just kind of confused why my i5 6400 is outperforming an i5 6500, unless it's down to RAM speed.

huzzug has done a few upgrades to the sheet, including a built-in timer. The numbers may bounce around a bit while the benchmark is finalized, but it is easy to run so just download the newest version and see where you fall. Also, there may be many reasons why an individual PC runs faster or slower than a similar machine, so I don't see any reason not to include mem speed in the chart.

The Pook 11-17-2017 12:18 PM

RAM is only running at DDR4-3066 since I was having problems with crashing in PUBG at XMP DDR4-3600. Why am I beating 7700Ks at much higher clocks? confused.gif


Quote:
CPU: i5 6400 Skylake
#CPU cores: 4
CPU clock speed: 2.7GHz
Total system memory: 8GB
RAM Speed: DDR4-3066
Excel ver: 2016
Time to sort (in seconds): 98.45 seconds

Panther Al 11-17-2017 03:11 PM

Huh.

6950X, 10c/20t, at 4.0GHz, 32GB RAM at 2666, Excel 2007, 167s. Will have to see what might not be working right, else it might be the ancient Excel I am running.

*edit*

Did another run, this time with the original file; no real difference in time. However, after a few false starts, I loaded up HWMonitor and noticed that Excel 2007 only seems to load two cores. Interesting to know.

huzzug 11-17-2017 10:04 PM

I think MS added the ability to use more than 2 cores to its Office products with Office 2010. Also @ OP, if you're looking to do work with data, you should try to get the PowerPivot add-on onto your Excel 2010 or later. It's attuned to some of the tasks data scientists do with other software.

With the current Excel results, can some of you try adjusting the overclocks on your systems to see how the results vary?

Update: Excel (2010 and later) uses multiple cores for formulas, but when running macros it's limited to one core. On my system it pegs core 3 at ~90% whereas the others hover around 25-30%.

This bench could show IPC differences between CPUs (I don't know why Pook is getting the result he's getting), but I'd like to know how the scores vary when changing CPU speed and RAM speed.
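
If anyone wants to see the split for themselves, a throwaway macro like the one below makes it fairly obvious (just a sketch, not code from the bench file): on a workbook with plenty of formulas, the worksheet recalculation half can spread across cores via the multithreaded calc engine, while the plain VBA loop half stays parked on one core. Watch Task Manager while each half runs.

Code:
Sub CompareEngineVsVBA()
    ' Sketch: worksheet recalculation uses the multithreaded calc engine,
    ' a plain VBA loop always runs on a single thread.
    Dim t As Single, i As Long, x As Double

    Application.Calculation = xlCalculationManual

    t = Timer
    Application.CalculateFull           ' formulas on the sheets, spread across cores
    Debug.Print "Engine recalc: " & Format(Timer - t, "0.00") & " s"

    t = Timer
    For i = 1 To 50000000               ' arbitrary busy work, single core only
        x = x + Sqr(i)
    Next i
    Debug.Print "VBA loop:      " & Format(Timer - t, "0.00") & " s"

    Application.Calculation = xlCalculationAutomatic
End Sub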

moustang 11-19-2017 07:38 AM

OK, I've done some memory timings tweaking on my system. Now running 3600 speed memory with 15-15-15-35 timings. Some minor tweaks to the advance timings.

Downloaded the latest file and this is what I got....

CPU: 8700K
#CPU cores: 6
CPU clock speed: 5.1GHz
Total system memory: 16GB
RAM Speed: 3600 @ 15-15-15-35 timings
Excel ver: 2010
Time to sort (in seconds): 86.84 seconds





**** EDIT ****

I just noticed that on the first page I'm listed as having an 8600K. That's incorrect. I have an 8700K.

moustang 11-19-2017 07:58 AM

Quote:
Originally Posted by Panther Al View Post

Huh.

6950X, 10c/20t, at 4.0GHz, 32GB RAM at 2666, Excel 2007, 167s. Will have to see what might not be working right, else it might be the ancient Excel I am running.

*edit*

Did another run, this time with the original file; no real difference in time. However, after a few false starts, I loaded up HWMonitor and noticed that Excel 2007 only seems to load two cores. Interesting to know.

I've now run the test 3 times with HWMonitor running. A few things I've noted from it.

#1. This Excel benchmark never uses more than 2 cores.
#2. Even on the two cores it's using, the CPU load is very small. My individual core temperatures never exceeded 45C while running this benchmark, but they routinely hit 55C when running something like 3DMark or gaming, and have hit 63C under Prime95 stress testing. This benchmark is making very little use of the CPU.
#3. Based on my first result compared with my latest result, I'm convinced that this Excel test is primarily influenced by RAM speed. I got a 5-second drop in processing time without changing a single thing other than RAM timings in my system.

huzzug 11-19-2017 09:38 AM

I made a slightly bigger change to the way the file benchmarks. I've changed how the final data is being sorted, since that column depends on the data in the columns behind it. The changes that I've made:

1. Truncated the data from the original 60,000-odd rows to just 1,000 rows where every cell is being calculated.
2. You no longer need to close the file and restart it to get a correct score. The file should now be consistent whether you're benchmarking for the first time or the tenth (a rough sketch of one way to do this is at the end of this post).

With just the above changes, the file takes about 3 times as long as it took to bench the first time, while also pegging the main thread at ~70-90%. Currently the cells are only doing additions, divisions, and averages across a small range of cells. I'd welcome any more suggestions you guys would like to see implemented.

P.S. I'm not an advanced user of VBA, so I may not be able to accomplish everything that you may have in mind, but do let me know and I'll try to incorporate it.

Data1000.zip 1282k .zip file

I let the original file run on my work system the entire night with the changes that I made. My core 3 seems to be pegged at more than 70% and the rest of the cores hover around 30%.
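
For the curious, one way to get that kind of repeatability is to put the rows back in their original order before every timed pass. A rough sketch of the idea (the sheet name, the hidden index column position, and the key column are made up here, and this is not necessarily the exact code in the file):

Code:
Sub RunBench()
    ' Sketch: assumes the last column is a hidden index (pre-filled 1..N)
    ' that records the original row order.
    Dim rng As Range, t As Single
    Set rng = Worksheets("Data").Range("A1").CurrentRegion

    ' Restore the original order so every run starts from the same state
    rng.Sort Key1:=rng.Columns(rng.Columns.Count), Order1:=xlAscending, Header:=xlYes

    ' Timed portion: the actual sort plus the recalculation it triggers
    t = Timer
    rng.Sort Key1:=rng.Columns(1), Order1:=xlAscending, Header:=xlYes
    Application.Calculate
    MsgBox "Benchmark: " & Format(Timer - t, "0.00") & " seconds"
End Sub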


ir88ed 11-19-2017 07:10 PM

Quote:
Originally Posted by moustang View Post


I just noticed that on the first page I'm listed as having an 8600K. That's incorrect. I have an 8700K.

Updated

AlphaC 11-19-2017 07:25 PM

Have you seen these for Monte Carlo?
https://www.techspot.com/review/1345-amd-ryzen-7-1800x-1700x/page3.html
https://www.techspot.com/review/1497-intel-core-i7-8700k/
https://us.hardware.info/reviews/7602/11/intel-core-i7-8700k--i5-8600k--i5-8400-coffee-lake-review-affordable-six-cores-benchmarks-web-browsing-and-microsoft-office-word-and-excel-2016n
https://www.hardwareluxx.ru/index.php/artikel/hardware/prozessoren/43220-coffee-lake-intel-core-i7-8700k-i5-8600k-i5-8400-test.html?start=6
https://www.ocaholic.ch/modules/smartsection/item.php?itemid=3990&page=3
https://www.benchmark.rs/artikal/test_intel_coffee_lake_-_core_i7_8700k_i_core_i5_8600k-4439/9

"Big Number crunch"
https://pctuning.tyden.cz/hardware/procesory-pameti/48751?start=9

Black Scholes
https://www.overclockersclub.com/reviews/intel_core_i7_8700k__core_i5_8400/5.htm

Would be interesting to compare Excel 2010 and 2016. Excel 2010 has less bloat.

edit:
first run of the original sheet without timer on Sandy Bridge i5-2500K @ 4.6GHz is just under 3 min 20 s (< 200 seconds), 2x4GB DDR3 1600 CL9, with a ton of other stuff open & running, Excel 2016 on Win 7 Pro

huzzug's test spat out 183.25 seconds on that machine when I closed Firefox and other stuff (I still had all the background apps such as antivirus).

There's something horribly wrong with this test's multi-threading if I can score nearly as high as any i7 newer than 2nd gen.

Running huzzug's test in Excel 2016 on a Ryzen 7 1700X set to stock clocks results in 30.28 seconds, 2nd run 28.97 seconds. I'll have to retest it a few times; the CPU usage is really low.
2x8GB DDR4 3200 CL16 (Hynix with manual timings)
Excel 2016, Windows 7 Pro



The original sheet on the same setup resulted in about 3 min 29 seconds (=209 seconds), with Process Monitor showing on average 6% CPU usage (so horribly single-threaded, as 1/16 ~ 6% of the CPU). A subsequent run on a different column resulted in 3 min 19 seconds (=199 seconds), so it may be sensitive to how quickly XFR kicks in.

ir88ed 11-19-2017 07:26 PM

Quote:
Originally Posted by huzzug View Post

I made a slightly bigger change to the way the file benchmarks. I've changed how the final data is being sorted, since that column depends on the data in the columns behind it. The changes that I've made:

1. Truncated the data from the original 60,000-odd rows to just 1,000 rows where every cell is being calculated.
2. You no longer need to close the file and restart it to get a correct score. The file should now be consistent whether you're benchmarking for the first time or the tenth.

With just the above changes, the file takes about 3 times as long as it took to bench the first time, while also pegging the main thread at ~70-90%. Currently the cells are only doing additions, divisions, and averages across a small range of cells. I'd welcome any more suggestions you guys would like to see implemented.

P.S. I'm not an advanced user of VBA, so I may not be able to accomplish everything that you may have in mind, but do let me know and I'll try to incorporate it.

Data1000.zip 1282k .zip file

I let the original file run on my work system the entire night with the changes that I made. My core 3 seems to be pegged at more than 70% and the rest of the cores hover around 30%.

I am hesitant to replace the benchmark after people have already made runs and data collection has started, but I could include it on the top post as a better threaded option with a second spreadsheet when we are sure it is what we want.

I am doing a graph of benchmark time by CPU speed (3.0 --> 4.8GHz) now, and hope to have data up soon. The short answer is that CPU speed does affect sort time (at least on my machine), and clearly all cores are not being pegged to 100%. This is an Excel benchmark after all, and I was hoping it would reflect real-world Excel performance and allow users to make informed decisions about build specs.

huzzug 11-19-2017 08:10 PM

Currently trying to research more options to spread the load across multiple threads for better utilization. What I've gathered these past few days is that Excel cannot utilize more than one core/thread while running VBA (mine does use VBA), but general calculations within the sheets with formulas can run in parallel (which is why you see ~25% utilization on the other cores with my sheets).

My file seems to be a mix of both, but I'm still looking to incorporate lookups to tie RAM usage in with processor usage as well.

It will take time, since I only find a couple of hours in the evening to spend on this, so I'll be a little slow with updates.

My next goal is to find what gives better utilization across threads as well as consistent loads on each core.

AlphaC 11-20-2017 08:46 AM

My observation: the original file, with simply clicking the column filter (column heading) and hitting "sort by smallest to largest", results in a horribly single-threaded test. There's no way an i5 is faster than an i7-6950X.

ir88ed: I'm unsure of the purpose of sorting the columns, but if you need to sort multiple times you might consider using a Pivot Table, or using MATLAB code to process the data in parallel and then spitting it back out into the Excel sheet as a new sheet altogether.

huzzug 11-20-2017 08:51 AM

Quote:
Originally Posted by AlphaC View Post

My observation: the original file, with simply clicking the column filter (column heading) and hitting "sort by smallest to largest", results in a horribly single-threaded test. There's no way an i5 is faster than an i7-6950X.

ir88ed: I'm unsure of the purpose of sorting the columns, but if you need to sort multiple times you might consider using a Pivot Table, or using MATLAB code to process the data in parallel and then spitting it back out into the Excel sheet as a new sheet altogether.

I like this suggestion, but wouldn't that basically be benchmarking MATLAB and not Excel?

AlphaC 11-20-2017 08:54 AM

My point is that Excel installed normally (i.e. no custom install) defaults to 32-bit.

huzzug 11-20-2017 09:02 AM

But how would that affect benching Excel?

AlphaC 11-20-2017 09:06 AM

Quote:
Originally Posted by huzzug View Post

But how would that affect benching Excel?

64 bit might be better tongue.gif

Especially for over 4 GB of virtual RAM...

https://msdn.microsoft.com/en-us/vba/excel-vba/articles/excel-performance-and-limit-improvements
Quote:
Large data sets and the 64-bit version of Excel

The 64-bit version of Excel 2010 is not constrained to 2 GB of RAM like the 32-bit version applications. Therefore, the 64-bit version of Excel 2010 enables users to create much larger workbooks. The 64-bit version of Windows enables a larger addressable memory capacity, and Excel is designed to take advantage of that capacity. For example, users are able to fill more of the grid with data than was possible in previous versions of Excel. As more RAM is added to the computer, Excel uses that additional memory, allows larger and larger workbooks, and scales with the amount of RAM available.

In addition, because the 64-bit version of Excel enables larger data sets, both the 32-bit and 64-bit versions of Excel 2010 introduce improvements to common large data set tasks such as entering and filling down data, sorting, filtering, and copying and pasting data. Memory usage is also optimized to be more efficient in both the 32-bit and 64-bit versions of Excel.
Quote:
Calculation improvements

Starting in Excel 2007, multithreaded calculation improved calculation performance.

Starting in Excel 2010, additional performance improvements were made to further increase calculation speed. Excel 2010 can call user-defined functions asynchronously. Calling functions asynchronously improves performance by allowing several calculations to run at the same time. When you run user-defined functions on a compute cluster, calling functions asynchronously enables several computers to be used to complete the calculations. For more information, see Asynchronous User-Defined Functions.


Multi-core processing

Excel 2010 made additional investments to take advantage of multi-core processors and increase performance for routine tasks. Starting in Excel 2010, the following features use multi-core processors: saving a file, opening a file, refreshing a PivotTable (for external data sources, except OLAP and SharePoint), sorting a cell table, sorting a PivotTable, and auto-sizing a column.

For operations that involve reading and loading or writing data, such as opening a file, saving a file, or refreshing data, splitting the operation into two processes increases performance speed. The first process gets the data, and the second process loads the data into the appropriate structure in memory or writes the data to a file. In this way, as soon as the first process begins reading a portion of data, the second process can immediately start loading or writing that data, while the first process continues to read the next portion of data. Previously, the first process had to finish reading all the data in a certain section before the second process could load that section of the data into memory or write the data to a file.

https://msdn.microsoft.com/en-us/vba/excel-vba/articles/excel-improving-calcuation-performance
Quote:
Drill-down approach to finding obstructions

The drill-down approach starts by timing the calculation of the workbook, the calculation of each worksheet, and the blocks of formulas on slow-calculating sheets. Do each step in order and note the calculation times.
To find obstructions using the drill-down approach

Ensure that you have only one workbook open and no other tasks are running.

Set calculation to manual.

Make a backup copy of the workbook.

Open the workbook that contains the Calculation Timers macros, or add them to the workbook.

Check the used range by pressing Ctrl+End on each worksheet in turn.

This shows where the last used cell is. If this is beyond where you expect it to be, consider deleting the excess columns and rows and saving the workbook. For more information, see the "Minimizing the used range" section in Excel performance: Tips for optimizing performance obstructions.

Run the FullCalcTimer macro.

The time to calculate all the formulas in the workbook is usually the worst-case time.

Run the RecalcTimer macro.

A recalculation immediately after a full calculation usually gives you the best-case time.

Calculate workbook volatility as the ratio of recalculation time to full calculation time.

This measures the extent to which volatile formulas and the evaluation of the calculation chain are obstructions.

Activate each sheet and run the SheetTimer macro in turn.

Because you just recalculated the workbook, this gives you the recalculate time for each worksheet. This should enable you to determine which ones are the problem worksheets.

Run the RangeTimer macro on selected blocks of formulas.

For each problem worksheet, divide the columns or rows into a small number of blocks.

Select each block in turn, and then run the RangeTimer macro on the block.

If necessary, drill down further by subdividing each block into a smaller number of blocks.

Prioritize the obstructions.


Speeding up calculations and reducing obstructions

It is not the number of formulas or the size of a workbook that consumes the calculation time. It is the number of cell references and calculation operations, and the efficiency of the functions being used.


Because most worksheets are constructed by copying formulas that contain a mixture of absolute and relative references, they usually contain a large number of formulas that contain repeated or duplicated calculations and references.

Avoid complex mega-formulas and array formulas. In general, it is better to have more rows and columns and fewer complex calculations. This gives both the smart recalculation and the multithreaded calculation in Excel a better opportunity to optimize the calculations. It is also easier to understand and debug. The following are a few rules to help you speed up workbook calculations.

In particular
Quote:
Avoid single-threaded functions:

PHONETIC
CELL when either the "format" or "address" argument is used
INDIRECT
GETPIVOTDATA
CUBEMEMBER
CUBEVALUE
CUBEMEMBERPROPERTY
CUBESET
CUBERANKEDMEMBER
CUBEKPIMEMBER
CUBESETCOUNT
ADDRESS where the fifth parameter (the sheet_name) is given
Any database function (DSUM, DAVERAGE, and so on) that refers to a pivot table
ERROR.TYPE
HYPERLINK
VBA and COM add-in user defined functions
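
For reference, the FullCalcTimer/RangeTimer macros mentioned above boil down to a high-resolution timer wrapped around a Calculate call, roughly along these lines (a sketch following the pattern in the linked MSDN article, not code from the bench workbook):

Code:
#If VBA7 Then
    Private Declare PtrSafe Function getFrequency Lib "kernel32" _
        Alias "QueryPerformanceFrequency" (cyFrequency As Currency) As Long
    Private Declare PtrSafe Function getTickCount Lib "kernel32" _
        Alias "QueryPerformanceCounter" (cyTickCount As Currency) As Long
#Else
    Private Declare Function getFrequency Lib "kernel32" _
        Alias "QueryPerformanceFrequency" (cyFrequency As Currency) As Long
    Private Declare Function getTickCount Lib "kernel32" _
        Alias "QueryPerformanceCounter" (cyTickCount As Currency) As Long
#End If

Function MicroTimer() As Double
    ' High-resolution timer in seconds, built on QueryPerformanceCounter
    Dim cyTicks As Currency
    Static cyFrequency As Currency
    MicroTimer = 0
    If cyFrequency = 0 Then getFrequency cyFrequency
    getTickCount cyTicks
    If cyFrequency Then MicroTimer = cyTicks / cyFrequency
End Function

Sub RangeTimer()
    ' Times recalculation of the currently selected block of formulas
    Dim dTime As Double
    dTime = MicroTimer
    Selection.Calculate
    dTime = MicroTimer - dTime
    MsgBox "Calculate on " & Selection.Address & ": " & Format(dTime, "0.000") & " s"
End Sub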

huzzug 11-20-2017 09:17 AM

Quote:
Originally Posted by AlphaC View Post

Quote:
Originally Posted by huzzug View Post

But how would that affect benching Excel?

64 bit might be better tongue.gif

Especially for over 4 GB of virtual RAM...

*snip: MSDN excerpts quoted in full in the post above*

I've seen the advantages of using 64-bit, but VBA still works on one thread and seems to be 32-bit. The general functions within Excel are what can utilize more cores. My question is more to do with the current Excel bench, because I doubt your Excel is occupying more than 200MB of RAM even with 32-bit.

NightAntilli 11-20-2017 10:13 AM

Just for fun, I tried it with my FX-8320, but it is using only one core in Excel 2013, resulting in a time of 218.68 seconds:

CPU: FX-8320
#CPU cores: 8
CPU clock speed: 4.5 GHz
Total system memory: 16GB
RAM speed: 1866 MHz
Excel ver: 2013
Time to sort (in seconds): 218.68 seconds

stealth83 11-20-2017 03:46 PM

CPU: i7 6700K
#CPU cores: 4
CPU clock speed: 4.0/4.2GHz
Total system memory: 16GB
RAM Speed: DDR4-3000
Excel ver: 2010
Time to sort (in seconds): 109.12 seconds


airisom2 11-20-2017 04:09 PM

I tried the benchmark in LibreOffice Calc and got 62 seconds on my rig.

huzzug 11-20-2017 06:12 PM

Does Libre support VBA? Also, what are your system specs?

ir88ed 11-20-2017 06:16 PM

Quote:
Originally Posted by airisom2 View Post

I tried the benchmark in LibreOffice Calc and got 62 seconds on my rig.

I tried the same thing before I started this thread. LibreOffice was really fast on the sort. Then it immediately crashed. Not slinging mud, just my experience. Even if that was a fluke, it seemed a bit clunky, kind of a middle ground between Google Sheets and Excel.

ir88ed 11-20-2017 06:22 PM

Quote:
Originally Posted by NightAntilli View Post

Just for fun, I tried it with my FX-8320, but it is using only one core in Excel 2013, resulting in a time of 218.68 seconds:

CPU: FX-8320
#CPU cores: 8
CPU clock speed: 4.5 GHz
Total system memory: 16GB
RAM speed: 1866 MHz
Excel ver: 2013
Time to sort (in seconds): 218.68 seconds
Added

ir88ed 11-20-2017 06:22 PM

Quote:
Originally Posted by stealth83 View Post

CPU: i7 6700K
#CPU cores: 4
CPU clock speed: 4.0/4.2GHz
Total system memory: 16GB
RAM Speed: DDR4-3000
Excel ver: 2010
Time to sort (in seconds): 109.12 seconds

Added

spinFX 11-20-2017 09:24 PM

Quote:
Originally Posted by ir88ed View Post

Part of me agrees that there could be a better solution than Excel, and I am pretty ignorant of the capabilities of an actual database. I get new datasets frequently; would I set up a database for each of them? That seems like a lot of infrastructure for one-off analyses. Excel lets me visually see the data structure, which is really helpful when building the formulas for calculations on complex data. I also do a lot of sorting/formatting/mining of the results, and need flexibility for analysis. For instance, with a database could I easily do new calculations to pull out all the genes that pass a pairwise t-test between two conditions and are negatively correlated with a given phenotype? That is a two-minute job in Excel.

Yeah, well, if you want the ease of use for quick, once-off databases you could use Access, but it's probably the worst of all database software haha.
There are ways to get a database (e.g. some flavour of SQL) going quite quickly, and then the sorts and calcs etc. would be done in SQL. Obviously this would require learning that language, but once you did you would have far more power at your fingertips for your calcs and analysis.

But hey, if you can get it done in Excel in 2 minutes, I guess that is the right program tongue.gif

airisom2 11-21-2017 02:45 AM

Quote:
Originally Posted by huzzug View Post

Does Libre support VBA? Also, what are your system specs?

It seems to run the macro fine; all I had to do was disable macro security. I don't think LO is 100% compatible with VBA code, though. The rig is in my sig; 4.3GHz 4930K, 16GB 2400MHz. Your 1,000-row benchmark was abysmally slow on LO, like two seconds per row slow.
Quote:
Originally Posted by ir88ed View Post

I tried the same thing before I started this thread. LibreOffice was really fast on the sort. Then it immediately crashed. Not slinging mud, just my experience. Even if that was a fluke, it seemed a bit clunky, kind of a middle ground between Google Sheets and Excel.

Strange. The spreadsheet was really unresponsive with the zoom at 55%, and changing it to 100% made things normal. That might help.

ir88ed 11-21-2017 06:33 AM

Quote:
Originally Posted by huzzug View Post

With the current Excel results, can some of you try adjusting the overclocks on your systems to see how the results vary?

This bench could show IPC differences between CPUs (I don't know why Pook is getting the result he's getting), but I'd like to know how the scores vary when changing CPU speed and RAM speed.

Below is a graph of sort times in Excel 2016 by CPU speed. It is clear from the graph that processor speed plays a large role in this benchmark. I will repeat one or more of the frequencies with slower RAM speed to see if that has an effect.

Here are some notes:
- Speeds between 3.0 and 4.8GHz were tested with at least two replicates (except the 3GHz which has only one)
- Each frequency tested was done with a freshly booted system with no other windows open
- Replicates were done with a freshly opened spreadsheet
- Replicates beyond the second one showed significantly lower sort times than the first two, and were not included. For frequencies with more than two replicates, the system was rebooted and a second round of testing was done.



Also, data if you are interested:
GHz    Sort time (s), replicates
3.0    124.35
3.1    121.25, 120.9, 121.82, 121.15
3.2    118.87, 118.73
3.4    111.75, 114.3
3.6    105.77, 109.29
3.8    105.89, 107.44
4.0    103.24, 103.46
4.2    100.53, 101.16
4.4    98.66, 98.84
4.6    96.73, 95.02
4.8    92.5, 90.14

ir88ed 11-21-2017 06:39 AM

Quote:
Originally Posted by moustang View Post

Downloaded the latest file and this is what I got....

CPU: 8700K
#CPU cores: 6
CPU clock speed: 5.1GHz
Total system memory: 16GB
RAM Speed: 3600 @ 15-15-15-35 timings
Excel ver: 2010
Time to sort (in seconds): 86.84 seconds

I reran my sorts in a more controlled fashion and ended with slower overall times. This moves you to first place... for now thumb.gif
Not surprising given the blistering 5.1GHz OC. Respect.

japau 11-21-2017 06:54 AM

I ran the benchmark on my 8700K. It seems to only run on one core, so no multi-threading for Excel? I checked thread usage with HWInfo64 while the benchmark ran.

Happy Hepo 11-21-2017 06:58 AM

CPU: i5 4670K
#Cores: 4 (no HT)
CPU clock speed: 4.3 GHz
Total system memory: 16 GB
memory speed: 1866MHz
Excel ver.: 2013
time to sort in seconds: 130.43

ir88ed 11-21-2017 07:17 AM

Quote:
Originally Posted by japau View Post

I ran the benchmark on my 8700K. It seems to only run on one core, so no multi-threading for Excel? I checked thread usage with HWInfo64 while the benchmark ran.
Which Excel version? Sounds like multi-threading came in with 2010.

japau 11-21-2017 07:53 AM

Newest Office 2016 Professional (Windows 10 Pro 64-bit)

I managed a 95-second run, so it's in line with the other posters; I don't think anyone has seen 100% usage on all threads.

ir88ed 11-21-2017 08:37 AM

Quote:
Originally Posted by japau View Post

Newest Office 2016 Professional (Windows 10 Pro 64-bit)

I managed a 95-second run, so it's in line with the other posters; I don't think anyone has seen 100% usage on all threads.
To be fair, this isn't meant as a CPU benchmark, but rather a way to see which systems perform best for Excel usage.

I wanted people who are building a workstation around Excel and were on the fence between a 7700k and a 1700x (like me) to have data to back up their build decisions.

LostParticle 11-21-2017 09:11 AM



ENTERPRISE 11-21-2017 03:33 PM

Great idea, will test my system tomorrow.


mllrkllr88 11-21-2017 08:03 PM

Firstly, awesome project! I am so impressed at how far this project has come in just a week. I want to thank @ir88ed and @huzzug, good work guys!

I did some testing with my 14-core chip today. I tested a few different variables to see what affects the total run time. The results are surprising, to say the least.

Here is my test setup:
  • New install of W8.1 64
  • Office Pro 2010


Core/Thread Testing i9 7940X
14 Cores / 28 Threads = 93.93 seconds
14 Cores / 14 Threads = 93.26 seconds
2 Cores / 2 Threads = 93.29 seconds

In the pictures below you can see there is basically no scaling potential with core/thread count. If you watch the cores while the bench is running, you will see that only 50% of one core is being utilized. The score difference is negligible.
Core Test Screenshots: 14/28, 14/14, 2/2

Amount of Memory
31.8 Gb usable by OS = 93.93 seconds
2.4 Gb usable by OS = 93.68 seconds

I am using quad channel (8x4) for both tests. To test less memory I used a console command to limit the amount of memory the OS can use. I reused the 32GB score from my core testing since it's the same 14/28 configuration as the baseline. As you can see, there is basically no scaling potential with the physical amount of memory. The score difference is negligible.
Amount of Memory Screenshots: 31.8GB and 2.41GB system memory




Memory Timings
CL16-16-16-36 2T = 93.68 seconds
CL12-11-11-24 1T = 91.4 seconds
For this test I will compare the timings only. The DRAM frequency will be 3600 for both tests. The test only compares super-tight timings vs XMP timings. In order to run super-tight memory at high frequency in a 64-bit OS, the amount of memory used by the OS needs to be less than 4GB. For this test I will reuse my result from the memory quantity testing as the baseline.

Finally we see a gain!! This is logical; we expect performance to increase with tight timings.
Memory Timings Screenshots: 3600 CL16-16-16-36 2T and 3600 CL12-11-11-24 1T



That's all for now, but I have more tests planned for the future. Next time I run some LN2 on the 7740X, I will try to run this bench at high frequencies and see what happens.
cheers.gif

huzzug 11-21-2017 11:41 PM

I still have the test with the entire 60K rows of data, and 10K and 5K versions as well. Let me know if any of you want to run those too, for giggles.

AlphaC 11-22-2017 09:36 AM

Tested Ryzen 7 1700X at stock again.

There's some variance on XFR kicking in and also with antivirus.

Original sheet without timer is roughly 3 min 24 seconds. My original run the other day had 3 min 29 seconds on first run.

Huzzug's variation with a fresh boot obtained 212.46 seconds. I noticed the antivirus icon kick in on the first run. Closing Excel and running again resulted in 199.29 seconds.

I really don't know what to make of it because I definitely obtained 30 seconds the other day.

ir88ed 11-22-2017 09:45 AM

Quote:
Originally Posted by AlphaC View Post

Tested Ryzen 7 1700X at stock again.

There's some variance on XFR kicking in and also with antivirus.

Original sheet without timer is roughly 3 min 24 seconds. My original run the other day had 3 min 29 seconds on first run.

Huzzug's variation with a fresh boot obtained 212.46 seconds. I noticed the antivirus icon kick in on the first run. Closing Excel and running again resulted in 199.29 seconds.

I really don't know what to make of it because I definitely obtained 30 seconds the other day.


Hmmm... We have a 1600 Ryzen that did a 106 on the original sheet.
The 30 second run sounds like the results you get when you hit the button a second time and the sheet is already presorted.
1700X should beat a 1600 in clock speed alone, so >200 seconds makes me think something is amiss. I would be expecting a number in the 90's or low 100's.

AlphaC 11-22-2017 09:46 AM

I did a full redownload of the file and rebooted and ran the benchmark straight off the overclock.net site.

I obtained 199.18 seconds.

Less than 100 seconds would suggest multi-threading kicked in. Their Ryzen 5 was overclocked to 3.9 (which is the XFR speed of the R7 1700X).

edit: for all intents and purposes, you should base your conclusions around 200 seconds, since when I timed it manually I obtained roughly 3 min 25 seconds or so. We know Ryzen IPC is around Haswell level, and my Sandy Bridge system was also around 200 seconds when overclocked to 4.6.

edit2: you should adjust OP to note that you cannot run the benchmark more than once without reopening it.

ir88ed 11-22-2017 09:54 AM

Quote:
Originally Posted by AlphaC View Post

I did a full redownload of the file and rebooted and ran the benchmark straight off the overclock.net site.

I obtained 199.18 seconds.

less than 100 seconds suggests multi-thread kicked in.
Any chance that "enable multi-threaded" option is not selected? Excel Options -> advanced -> formulas -> enable multi-threaded calculation

I think that is set by default, but grasping at straws here.
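
If anyone wants to check it without clicking through the menus, a quick macro run from the VBA editor should show the same settings (just a sketch):

Code:
Sub ShowThreadSettings()
    ' Print the multithreaded-calculation settings to the Immediate window
    With Application.MultiThreadedCalculation
        Debug.Print "MTC enabled:  " & .Enabled
        Debug.Print "Thread mode:  " & .ThreadMode
        Debug.Print "Thread count: " & .ThreadCount
    End With
End Sub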

Quote:
Originally Posted by AlphaC View Post

you should adjust OP to note that you cannot run the benchmark more than once without reopening it.
Good point. I will make it clearer.

AlphaC 11-22-2017 09:59 AM

I checked, it is not the case. It just had 5-6% CPU usage (1 thread) even though 16 were available.

edit: He's on Windows 10 x 64.

japau 11-22-2017 10:47 AM




CPU: i7 8700k
#CPU cores: 6c/12t
CPU clock speed: 5000 / 4500 Cache
Total system memory: 16GB
RAM Speed: DDR4-4000-17-17-17-2T
Excel ver: 2016 Pro
Time to sort (in seconds): 95.42 seconds

Cheers!

mllrkllr88 11-22-2017 11:44 AM

Quote:
Originally Posted by ir88ed View Post

Any chance that "enable multi-threaded" option is not selected? Excel Options -> advanced -> formulas -> enable multi-threaded calculation

I will check this out on my test OS too. As it stands right now, it looks like most of the scaling is coming from core/cache clocks, and none from core count/HT, for Excel 2010.

NightAntilli 11-22-2017 12:38 PM

Quote:
Originally Posted by AlphaC View Post

Tested Ryzen 7 1700X at stock again.

There's some variance on XFR kicking in and also with antivirus.

Original sheet without timer is roughly 3 min 24 seconds. My original run the other day had 3 min 29 seconds on first run.

Huzzug's variation with a fresh boot obtained 212.46 seconds. I noticed the antivirus icon kick in on the first run. Closing Excel and running again resulted in 199.29 seconds.

I really don't know what to make of it because I definitely obtained 30 seconds the other day.
Something must be wrong. My FX-8320 got 218 seconds. There's no way a 1700X is equally as slow. Throttling?

AlphaC 11-22-2017 03:46 PM

Quote:
Originally Posted by NightAntilli View Post

Something must be wrong. My FX-8320 got 218 seconds. There's no way a 1700X is equally as slow. Throttling?

Nope. It is AIDA64 AVX / Prime 95 AVX stable, there's no way it is throttling with only one core used.

The CPU wasn't overclocked in these tests. All that shows is that 3.5-3.9GHz (depending on XFR kicking in) is somewhat faster than a 4.6GHz FX-8320 for this type of workload.

The biggest difference is I'm running Windows 7 Pro x64 on a fully loaded system with antivirus, not a clean one. Pausing antivirus (without a clean restart) results in a score of around 199 seconds. I'd say 200 seconds is about right; even with antivirus on and a 2nd run on the same boot I had ~200 seconds.

edit: ran ExcelTrader benchmark


(http://exceltrader.net/984/benchmark_et-xls-an-excel-benchmark-for-traders/)
The top result is from back when the system was clean.

cssorkinman 11-22-2017 04:55 PM

Quote:
Originally Posted by ir88ed View Post

Quote:
Originally Posted by AlphaC View Post

Tested Ryzen 7 1700X at stock again.

There's some variance on XFR kicking in and also with antivirus.

Original sheet without timer is roughly 3 min 24 seconds. My original run the other day had 3 min 29 seconds on first run.

Huzzug's variation with a fresh boot obtained 212.46 seconds. I noticed the antivirus icon kick in on the first run. Closing Excel and running again resulted in 199.29 seconds.

I really don't know what to make of it because I definitely obtained 30 seconds the other day.


Hmmm... We have a 1600 Ryzen that did a 106 on the original sheet.
The 30 second run sounds like the results you get when you hit the button a second time and the sheet is already presorted.
1700X should beat a 1600 in clock speed alone, so >200 seconds makes me think something is amiss. I would be expecting a number in the 90's or low 100's.

Like this?

ir88ed 11-22-2017 05:35 PM

Note that you can only run the benchmark once in the worksheet. Once the sheet is sorted the calc times drop to less than 30 seconds if you re-bench.

Best way is to reboot the system, open the sheet and sort only one time.

cssorkinman 11-22-2017 05:58 PM

Quote:
Originally Posted by ir88ed View Post

Note that you can only run the benchmark once in the worksheet. Once the sheet is sorted the calc times drop to less than 30 seconds if you re-bench.

Best way is to reboot the system, open the sheet and sort only one time.

I get roughly the same numbers as AlphaC does on the benches he is referencing, using my 1800X, when running them properly.

AlphaC 11-22-2017 06:02 PM

At least we confirmed it is not Windows 7 that's making a difference tongue.gif

cssorkinman 11-22-2017 07:45 PM

Quote:
Originally Posted by AlphaC View Post

At least we confirmed it is not Windows 7 that's making a difference tongue.gif

Piqued my curiosity anyhow. 188 seconds on the first run, 30 seconds on the second - Win 10, 64-bit. 69 seconds on the bench you provided.

mllrkllr88 11-22-2017 09:19 PM

Quote:
Originally Posted by ir88ed View Post

Any chance that "enable multi-threaded" option is not selected? Excel Options -> advanced -> formulas -> enable multi-threaded calculation

In my copy of 2010 it was not enabled by default. However, once enabled the score didn't really change much. The score changed from 91.4 > 90.92

CPU: i9 7940X
#CPU cores: 14c/28t
CPU clock speed: 5g
Total system memory: 2.4GB
RAM Speed: 3600c12
Excel ver: 2010 Pro
Time to sort (in seconds): 90.92 seconds


cssorkinman 11-23-2017 07:56 AM

Forgot to mention that it was with excel 2016.

ir88ed 11-23-2017 06:41 PM

Quote:
Originally Posted by cssorkinman View Post

I get roughly the same numbers as AlphaC does on the benches he is referencing, using my 1800X, when running them properly.
Quote:
Originally Posted by AlphaC View Post

At least we confirmed it is not Windows 7 that's making a difference tongue.gif
Quote:
Originally Posted by mllrkllr88 View Post

In my copy of 2010 it was not enabled by default. However, once enabled the score didn't really change much. The score changed from 91.4 > 90.92

CPU: i9 7940X
#CPU cores: 14c/28t
CPU clock speed: 5g
Total system memory: 2.4GB
RAM Speed: 3600c12
Excel ver: 2010 Pro
Time to sort (in seconds): 90.92 seconds


Updated. Really 2.4GB RAM?
Edit: NM, I see 32GB in your screenshot.

LostParticle 11-23-2017 10:41 PM

Quote:
Originally Posted by LostParticle View Post



@ir88ed, I submitted my benchmark approx. two days ago (post #56). Why have I not been added to the chart?

i7-4790K, 4c 8t, per core OC: x48, x49, x49, x50, cache x44
16 GB DDR3, 2400 MHz, 10-11-12-24, 1T
Office 2016 Pro Plus 64 bit
Win 10 Pro
Benchmark time: 82.59 seconds

huzzug 11-24-2017 05:27 AM

I made a few further changes to the workbook. Data1000.zip 193k .zip file

Also, you no longer need to close the workbook to re-run the bench. It should provide a consistent score.

LostParticle 11-24-2017 05:45 AM

Quote:
Originally Posted by huzzug View Post

I made a few further changes to the workbook. Data1000.zip 0k .zip file

Also, you no longer need to close the workbook to re-run the bench. It should provide a consistent score.

I cannot extract your zipped file. Here's what I get:


LostParticle 11-24-2017 05:46 AM

I've loaded my all-core x47 OC profile, cache x44. I reran the benchmark once, right after rebooting.




Win 10 Pro, Office 2016 Pro.
16 GB DDR3, 2400 MHz, 10-11-12-24, 1T

huzzug 11-24-2017 06:03 AM

Quote:
Originally Posted by LostParticle View Post

Quote:
Originally Posted by huzzug View Post

I made a few further changes to the workbook. Data1000.zip 0k .zip file

Also, you no longer need to close the workbook to re-run the bench. It should provide a consistent score.

I cannot extract your zipped file. Here's what I get:


Goofed up. You can try the new link now

LostParticle 11-24-2017 07:26 AM

Quote:
Originally Posted by huzzug View Post

Goofed up. You can try the new link now

Okay, now it works.

My results (configuration given in my previous post):


huzzug 11-24-2017 12:04 PM

Is that bench with the latest file? Something seems amiss. It takes my system about an hour to complete the bench. How are you able to finish it in that time?

ENTERPRISE 11-24-2017 12:27 PM

CPU: Threadripper 1950X
#CPU cores: 16
CPU clock speed: 4.1GHz
Total system memory: 32GB
Excel ver: 2016
Time to sort (in seconds): 95.44

Looks like Excel favours core speed over core count.

AlphaC 11-24-2017 01:21 PM

ENTERPRISE, how is it possible that a Threadripper at 4.1GHz is 2x faster than a Ryzen 7 (assuming multi-threading has no effect)?

There has to be some sort of memory-channel and latency dependence, as IPC and clock speeds are similar.

This thread definitely needs more datapoints.

ENTERPRISE 11-24-2017 02:30 PM

Quote:
Originally Posted by AlphaC View Post

ENTERPRISE, how is that possible that a Threadripper at 4.1GHz is 2x faster than Ryzen 7? (assuming multi-threading has no effects)

There has to be some sort of memory channel and latency dependence , as IPC and clockspeeds are similar.

This thread definitely needs more datapoints.

 

I am unsure. I am also currently running my memory at 3200MHz (quad channel) with timings of 14-14-14-28, combined with my 1950X @ 4.1GHz. Other than that I can see no other particular factors. I ran the bench multiple times; this is definitely a correct score.


ir88ed 11-24-2017 07:10 PM

Quote:
Originally Posted by ENTERPRISE View Post

I am unsure, I am also currently running my Memory at 3200Mhz (Quad Channel) with timings of 14-14-14-28 combined with my 1950x @ 4.1Ghz. Other than that I can see no other particular factors. I ran the bench multiple times, this is definitely a correct score.

Added.

Very interesting: Excel clearly gains from core speed, yet the 1950X, with one of the lower clock speeds so far, still knocks out a pretty fast bench.
Quote:
Originally Posted by AlphaC View Post


This thread definitely needs more datapoints.
Yep, more data will help.

mllrkllr88 11-24-2017 07:15 PM

Quote:
Originally Posted by LostParticle View Post

@ir88ed, I've submitted my benchmark approx. two days ago (post #56). Why am I not added in the chart?

i7-4790K, 4c 8t, per core OC: x48, x49, x49, x50, cache x44
16 GB DDR3, 2400 MHz, 10-11-12-24, 1T
Office 2016 Pro Plus 64 bit
Win 10 Pro
Benchmark time: 82:59 seconds

This score seems a little bit off. Based on those speeds I think you should be in the 95+ second zone.

huzzug 11-24-2017 07:22 PM

Are you guys running the file that I updated recently? My results are way off if you are in fact running that file. I'll incorporate a version number into it so we can tell which file is being run.
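
Something as simple as a constant baked into the macro and echoed with the result would do it. A hypothetical sketch (the constant name and message format are mine, not from the actual workbook):

Code:

' Hypothetical version stamp so submitted times can be matched to a workbook revision.
Public Const BENCH_VERSION As String = "DATA1000 v3"   ' made-up label for illustration

Sub ReportResult(ByVal elapsedSeconds As Double)
    MsgBox BENCH_VERSION & ": " & Format(elapsedSeconds, "0.00") & _
           " s, Excel " & Application.Version
End Sub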

ir88ed 11-24-2017 07:22 PM

Quote:
Originally Posted by mllrkllr88 View Post

This score seems a little bit off. Based on those speeds I think you should be in the 95+ second zone.

Maybe, but he does have the second-highest core speed so far. Machine-to-machine variance could account for the extra couple of seconds.

I may have to push my 5930K to 5.0 when it gets cold out and see if it can make it through a bench. It definitely can't make it through Time Spy at that speed; the CPU test kills it every time. Curse those growing metal crystals!

mllrkllr88 11-24-2017 07:34 PM

I'm starting to wonder about Excel 2010 vs 2016 variance too. We need to test both versions and maybe keep two lists of submissions if the results differ too much.

Hopefully this weekend I can get set up with the 7740X; I should be able to pass this bench at 5.6GHz or so. For me, core scaling is almost non-existent. It looks like CPU core speed and memory scaling (both timings and frequency) account for most of it.

Oh, and yeah, I am running 32GB of system memory, but I set up the OS to only use 2.4GB (I posted earlier about why I am doing this).

LostParticle 11-24-2017 10:55 PM

Quote:
Originally Posted by huzzug View Post

Is that bench with the latest file? Something seems amiss. It takes my system ~an hour to complete the bench. How are you able to finish it within that time?

Isn't it obvious from the screenshot in my post #82 that I was benchmarking the DATA1000 file you provided? I thought it was clear from the context, and the file name is also clearly visible up top, where Excel displays it...

LostParticle 11-24-2017 11:07 PM

This morning I ran five (5) rounds of this benchmark again. I benchmarked the "benchmarked_randomized_data2" file, 62,762 rows, 39,722 KB in my file explorer. After each run I rebooted my computer. On my last attempt I also left the computer idle for approx. five (5) minutes with the spreadsheet loaded in Excel 2016 before hitting Benchmark.


i7-4790K, 4c 8t, per core OC: x48, x49, x49, x50, cache x44
16 GB DDR3, 2400 MHz, 10-11-12-24, 1T
Office 2016 Pro Plus 64 bit
Win 10 Pro


My results:

Round 1: 82.52 seconds (sorry, I forgot to switch CPU-Z tabs! redface.gif)

Round 2: 82.09 seconds

Round 3: 82.28 seconds. In this one I have included HWiNFO64, which loads on Windows startup, so you can get an idea of what's going on while the bench is running.

Round 4: 82.57 seconds. In this screenshot my version of Excel is clearly visible.

Round 5: 81.75 seconds (approx. 5 minutes idle with the spreadsheet loaded before benchmarking). My best time.


Remember: before each round I was rebooting my computer.

Perhaps the best way to prove I am legit would be a video, though I do not own a video camera or a smartphone (because I do not like them) to record the entire process.

Thank you.


PS: In my opinion, my system is not running at its full potential at the moment, because I have not done a clean install of the Windows 10 Fall Creators Update yet.

huzzug 11-24-2017 11:56 PM

Well, my question wasn't because I doubted any of you, but because I'm getting ~2500 seconds on the benchmark, which is a pretty significant difference. Mine is Excel 2010. Maybe my system is doing something weird. It's a Sandy Bridge (I don't know the exact model because it's a work machine), but it's running stock.

Edit: Seems I found something. Our IT had 32-bit Office installed on our systems. It seems that's what's causing these variations. Does anybody have a formal request I can submit to our corporate IT dept to get Office upgraded to 64-bit?
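
For anyone who wants to double-check what their IT department actually installed before posting numbers, VBA's conditional compilation constants will tell you. Quick sketch:

Code:

' Report whether the Office host running this macro is 32-bit or 64-bit.
Sub ReportOfficeBitness()
    #If Win64 Then
        Debug.Print "64-bit Office, Excel " & Application.Version
    #Else
        Debug.Print "32-bit Office, Excel " & Application.Version
    #End If
End Sub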

NightAntilli 11-25-2017 04:56 AM

Quote:
Originally Posted by huzzug View Post

I made a few further changes to the workbook. Data1000.zip 193k .zip file

Also, you need not close the workbook to re-run the bench. It should provide consistent score.
OK. With this one I just got 26.68 seconds on my FX-8320... I thought it was a fluke, so I closed and re-opened it and ran it again: 28.04 seconds. I checked whether multi-threading was enabled; it wasn't. I enabled it, and then I got 52.15 and 46.68 seconds. With multi-threading disabled again, I get 29.96 seconds. What gives? CPU usage with MT disabled is ~30%; with it enabled it's ~50%.
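
If you want to narrow that down, you can pin the calculation thread count manually instead of letting Excel grab every logical core. A sketch using the standard object model (untested on an FX, so treat the suggested count as a guess):

Code:

' Pin Excel's calculation thread count instead of using automatic mode.
Sub SetCalcThreads(ByVal n As Long)
    With Application.MultiThreadedCalculation
        .Enabled = True
        .ThreadMode = xlThreadModeManual
        .ThreadCount = n    ' e.g. SetCalcThreads 4 on an FX-8320, one thread per module
    End With
End Sub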

ir88ed 11-25-2017 09:25 AM

Quote:
Originally Posted by LostParticle View Post



i7-4790K, 4c 8t, per core OC: x48, x49, x49, x50, cache x44
16 GB DDR3, 2400 MHz, 10-11-12-24, 1T
Office 2016 Pro Plus 64 bit
Win 10 Pro


Perhaps the best way to prove I am legit would be a video. I do not own a video camera or a smartphone (because I do not like them), to record in video the entire process though.

Thank you.

No need. It is clear to me that you are on the level. smile.gif

Your system, despite being a 4700 series chip, is running very fast with one core at 5.0 and the cache running at 4.4. This is a very interesting observation, IMO.

mllrkllr88 11-25-2017 09:25 AM

Quote:
Originally Posted by LostParticle View Post

This morning I have run five (5) rounds of this benchmark again.

Nice testing man, looks good!

I don't think anyone doubted your scores were legitimate and genuinely produced. It's just that your scores are a little out of range from what we have seen (albeit with a very limited sample size). We have also seen scores out of range on the other end, so it's obvious there are quite a few user variables with this bench. Keep pushing it thumb.gif

LostParticle 11-25-2017 10:51 AM

Thank you, mllrkllr88 and ir88ed, it is always nice to get some encouragement. smile.gif

@ir88ed, if you consider it appropriate, please update my submission with my best time, 81.75 seconds, as shown in my post #93.


I don't know if it interests any of you, but I've run the same benchmark on my Linux installation. As before, I rebooted before each run. I just raised my OC a bit to x48 x49 x50 x50, cache x44, with the same RAM timings.

Here are my results...




I would love it if someone would/could provide something similar for Microsoft Word and Access.

Thank you!

AlphaC 11-25-2017 11:47 AM

Tempted to walk into a computer store with a whole bunch of random systems with this file on a USB stick.

Would be bonkers to run it on every i7/i5/Ryzen system that has Excel installed. They'd probably kick me out though.
Quote:
Originally Posted by huzzug View Post

Well my question wasn't because I doubted any of you but because I'm getting ~2500secs on the benchmark which is pretty significant. Mine is Excel 2010. Maybe my system is doing something weird. It's an sandy bridge (don't know the model because work) but it's running stock.

Edit: Seems I found something. Our IT had Office 32-bit installed on our systems. Seems that's what causing these variations. Well, anybody have a formal request to submit their corporate IT dept for upgrading the versions of Office to 64-bit ?

Most people are running 32-bit because that is what gets installed by default. 64-bit Office breaks compatibility with some older plug-ins and add-ins.

----

DATA1000: I'm getting 15 seconds on an i5-2500K @ 4.6GHz; second run after closing and reopening Excel = 13.89 seconds.


Office 2016 full version number, stable release: 1710 (16.0.8625.2121)

The Ryzen 7 system still takes about 200 seconds for the original benchmark and a bit under 20 seconds for DATA1000.



ir88ed 11-25-2017 12:27 PM

Quote:
Originally Posted by LostParticle View Post

@ir88ed, if you consider it appropriate, please update my submission with my best time, 81.75 seconds, as shown in my post #93.

Updated.

