Multi workbook and Memory Consumption

Hengjia Chen A+ 0

I ran into a problem this week with a multi-workbook setup: 14 workbooks across three tiers (parent, child and grandchild), 168MB in total.

In the worst attempt, exporting a custom historical income statement module took 6 hours, and duplicating it took an hour and a half.

IT ran diagnostics, cited high memory utilization, and ordered and installed an extra 16GB of RAM.

So I now have an exorbitant 32GB of RAM. But when I tried the module export again there seemed to be an invisible cap: memory consumption never went above 50%, i.e. 16GB.

In a second test with the same collection of workbooks, while also loading 30 tabs in Chrome, memory consumption shot up to the 80% range.
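For reference, the arithmetic behind those two observations, assuming the 32GB total described above, can be sketched as:

```python
# Quick sanity check of the two memory tests, assuming 32GB of
# installed RAM as described above.

def used_gb(total_gb: float, percent: float) -> float:
    """Convert a Task Manager usage percentage into gigabytes."""
    return total_gb * percent / 100.0

# Test 1: workbooks alone plateau at 50% of 32GB.
print(used_gb(32, 50))  # 16.0 -- the apparent "invisible cap"

# Test 2: workbooks plus ~30 Chrome tabs reach the 80% range.
print(used_gb(32, 80))  # 25.6 -- so the OS itself can allocate past 16GB
```

The second test suggests the machine has no trouble using more than 16GB overall; whatever is capping things appears specific to the Excel session rather than the hardware.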

I wonder if anyone has experienced a similar issue and found a potential fix?

Michael Hutchens A+ 188

Hi Hengjia,

While it's pleasantly surprising that you've been able to build a multi-linked workbook Modano project with a combined size of 168MB, this is way outside of the recommended use case for both Excel and Modano.

Put another way, Modano is not designed to replace an ERP system or be used to build enormous models with huge data sets. It is limited by the memory and performance limitations of Microsoft Excel itself, which can theoretically handle enormous files, but only when building spreadsheets based on data without a lot of formulas.

As a general rule, we've always tried to keep our Excel workbooks under 10MB in the *.xlsb file format, which equates to ~30MB in the *.xlsx file format, and we find we experience performance issues once the total combined file size of a multi-linked workbook project exceeds 50MB - 80MB, depending on the content within the files.

Your current total file size is 2x anything we've ever built.
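The guideline above can be turned into a quick audit. A minimal Python sketch, where the folder path is up to you and the thresholds simply restate the figures mentioned (adjust to taste):

```python
# Rough audit of a multi-workbook project against the size guidelines
# discussed above. Thresholds restate the rules of thumb: ~10MB per
# *.xlsb workbook, ~30MB per *.xlsx, ~50MB combined before trouble.
from pathlib import Path

XLSB_LIMIT_MB = 10
XLSX_LIMIT_MB = 30
PROJECT_LIMIT_MB = 50

def audit_project(folder: str) -> float:
    """Print each workbook's size and return the combined total in MB."""
    total = 0.0
    for path in sorted(Path(folder).glob("*.xls[bx]")):
        mb = path.stat().st_size / 1024 / 1024
        limit = XLSB_LIMIT_MB if path.suffix == ".xlsb" else XLSX_LIMIT_MB
        flag = "  <-- over per-file guideline" if mb > limit else ""
        print(f"{path.name}: {mb:.1f}MB{flag}")
        total += mb
    if total > PROJECT_LIMIT_MB:
        print(f"Combined {total:.1f}MB exceeds ~{PROJECT_LIMIT_MB}MB guideline")
    return total
```

Running this over the project folder makes it obvious at a glance which workbooks carry most of the 168MB.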

From my experience, you either build a proper 3-way financial model, scoped with discipline to limit file size and complexity, if you're analysing a large business, or you move out of Excel into a platform designed for huge data analysis.

You may be able to hang in there building enormous files but it's simply not what Excel or Modano are designed to be used for, and you can expect impracticality, volatility, file corruption, and data loss.

Basically it's a dangerous world to be in.

I'm sorry we can't give you a silver bullet, but until something like quantum physics leads to a major leap in the performance capabilities of Microsoft Excel, you really need to respect its limits.

Regards, M.

Michael Hutchens A+ 188

As a P.S. to the above, we usually find there are ways to dramatically reduce workbook size and improve performance, but it's a very complex process of identifying where the size and memory are being used and working out how things can be stored and done differently.

This is outside the scope of this forum post but something you could discuss with our Services team with a view to mandating them to review your model and see if they can assist.


Hengjia Chen A+ 0

Hi Michael,

It's not my original work; I'm only working on some extensions. The workbooks were built by my genius colleagues back in the version 10 days, based on CoA 6 & 7.

And in a conversation today it was mentioned that the workbooks ran smoothly back then.

If, say, I check the size of each worksheet, would that give a rough indication to pinpoint the one(s) that have made the file size unduly large?
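One way to attempt this, since an *.xlsx (or *.xlsb) workbook is a ZIP archive with each worksheet stored as its own part, is to list the archive's part sizes. A rough Python sketch (the workbook filename is hypothetical, and mapping part names like sheet1.xml back to tab names would require also parsing xl/workbook.xml, which is omitted here):

```python
# List each worksheet part inside a workbook, largest first.
# Works because *.xlsx/*.xlsb files are ZIP archives in which every
# worksheet is a separate part under xl/worksheets/.
import zipfile

def sheet_sizes(workbook_path):
    """Return (part_name, uncompressed_bytes) per worksheet part,
    sorted largest first."""
    with zipfile.ZipFile(workbook_path) as zf:
        parts = [(i.filename, i.file_size) for i in zf.infolist()
                 if i.filename.startswith("xl/worksheets/")]
    return sorted(parts, key=lambda p: p[1], reverse=True)

# Example usage (filename is hypothetical):
# for name, size in sheet_sizes("Parent.xlsx"):
#     print(f"{name}: {size / 1024:.0f}KB")
```

The largest parts point straight at the worksheets responsible for most of the file size.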