Memory Usage Issues - StatsByCube
- chewza
- Regular Participant
- Posts: 156
- Joined: Tue Aug 17, 2010 11:51 am
- OLAP Product: TM1
- Version: 9.5
- Excel Version: 7
Memory Usage Issues - StatsByCube
Hi there
Our RAM is sitting at 98%, which is causing big issues with the model.
The StatsByCube cube shows that TM1 is consuming 32GB, but Windows Task Manager shows it sitting at 78GB.
Why would this be the case?
Also, we previously had the same issue where RAM was at 98%. We added an extra 16GB of RAM, and it just used it all up and went back to 98%.
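(For reference, the 32GB figure above comes from totalling the StatsByCube numbers. A minimal TI sketch of that check is below; it assumes Performance Monitor is running and the usual control cube layout with a 'Memory Used' measure and a 'LATEST' time element - those names and the dimension order can differ between versions, so treat it as illustrative only.)
Code: Select all
# TI Prolog sketch: total per-cube memory from }StatsByCube.
# Assumptions: Performance Monitor is on; dimension order is
# }PerfCubes, }TimeIntervals, }StatsStatsByCube and the measure is
# 'Memory Used' -- check your version's control cubes before relying on this.
nTotal = 0;
i = 1;
While( i <= DIMSIZ( '}PerfCubes' ) );
    sCube = DIMNM( '}PerfCubes', i );
    nTotal = nTotal + CellGetN( '}StatsByCube', sCube, 'LATEST', 'Memory Used' );
    i = i + 1;
End;
AsciiOutput( 'CubeMemoryTotal.txt', 'Memory used for cubes: ' | NumberToString( nTotal ) );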
Would greatly appreciate some advice!!!
Many thanks
Chris
- Steve Rowe
- Site Admin
- Posts: 2455
- Joined: Wed May 14, 2008 4:25 pm
- OLAP Product: TM1
- Version: TM1 v6,v7,v8,v9,v10,v11+PAW
- Excel Version: Nearly all of them
Re: Memory Usage Issues - StatsByCube
Hi,
The first place to check for where the rest of the memory has gone is the StatsForServer cube and, in particular, the Garbage Memory figure.
Garbage Memory is memory that TM1 has used in the past but is not using at the moment.
Common reasons for a large Garbage Memory are multi-threaded start-up options or very large views being generated in TI or by end-users.
With the numbers you give, it does sound like there is something very specific to your system that is causing the garbage to be high. The trigger event will be something that takes a long time because there is a lot of work happening, or something that would take a long time but is running multi-threaded.
If it's not Garbage Memory, then take a hard look at the cube stats and make sure the relative sizes make sense to you, as it is not unknown for the cube stats to be a bit off in some releases, though I don't think that is the case here given the size of the missing memory.
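If it helps, those two server-level counters can be dumped to a file with a couple of TI lines. A minimal sketch, assuming Performance Monitor is on and the usual }StatsForServer layout (a 'LATEST' interval plus 'Total Memory Used' and 'Memory In Garbage' measures); the exact names and dimension order may vary by release:
Code: Select all
# TI Prolog sketch: write the server-level memory counters to a text file.
# Assumptions: Performance Monitor is running; element/measure names and
# dimension order match your release -- adjust if your cube differs.
nUsed    = CellGetN( '}StatsForServer', 'LATEST', 'Total Memory Used' );
nGarbage = CellGetN( '}StatsForServer', 'LATEST', 'Memory In Garbage' );
AsciiOutput( 'ServerMemoryCheck.txt',
    'Total Memory Used: ' | NumberToString( nUsed ),
    'Memory In Garbage: ' | NumberToString( nGarbage ),
    'Used less Garbage: ' | NumberToString( nUsed - nGarbage ) );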
Technical Director
www.infocat.co.uk
- chewza
- Regular Participant
- Posts: 156
- Joined: Tue Aug 17, 2010 11:51 am
- OLAP Product: TM1
- Version: 9.5
- Excel Version: 7
Re: Memory Usage Issues - StatsByCube
Hi Steve
Thanks very much for your assistance. I have looked at the StatsForServer cube, and it tells me the following:
Memory Used = 85GB
Memory in Garbage = 49GB
Server Stats:
RAM = 84GB
TM1 Process Usage = 78GB
So the questions I have:
Are the above stats an issue? From what I gather, although RAM is at 98%, TM1 still has 49GB available for it to use? I'm pretty sure that multi-threaded start-up is enabled.
Regards
Chris
- chewza
- Regular Participant
- Posts: 156
- Joined: Tue Aug 17, 2010 11:51 am
- OLAP Product: TM1
- Version: 9.5
- Excel Version: 7
Re: Memory Usage Issues - StatsByCube
Hi Steve
Actually, this does look like a problem - I just read that you are supposed to ADD TOGETHER the "Memory Used" and the "Memory in Garbage".
Adding these together gives me around 135GB, but the OS only has 86GB in total, and the Windows process list says that TM1 is consuming 78GB.
Also, the StatsByCube cube says that it is consuming 32GB.
That doesn't seem to make sense?
Many thanks!!
Regards
Chris
- Elessar
- Community Contributor
- Posts: 412
- Joined: Mon Nov 21, 2011 12:33 pm
- OLAP Product: PA 2
- Version: 2.0.9
- Excel Version: 2016
- Contact:
Re: Memory Usage Issues - StatsByCube
Hi,
TM1 will not use "Memory in garbage". The only way to resolve this is to return the garbage memory to Windows, and the only way to return memory to Windows is to restart the TM1 service.
There is a recommended practice of restarting your TM1 server every Sunday.
- Steve Rowe
- Site Admin
- Posts: 2455
- Joined: Wed May 14, 2008 4:25 pm
- OLAP Product: TM1
- Version: TM1 v6,v7,v8,v9,v10,v11+PAW
- Excel Version: Nearly all of them
Re: Memory Usage Issues - StatsByCube
Hi,
It sounds like you have some problems...
From what you have said, the OS is using the swap file / virtual RAM on the disc to provide all the memory that TM1 requires. This _may_ not be hurting your performance, since the excess is all garbage, but it is far from ideal. It would also explain why adding more RAM didn't improve the RAM stat: the 98% was really 100%, and it still is.
Although a regular restart is sometimes recommended, it will not resolve the issue if the garbage is coming from a multi-threaded launch. I'd suggest you do some testing:
1. Check the StatsForServer figures immediately after start-up with your current set-up.
2. Repeat with multi-threaded start-up turned off and compare the stats. The launch may take a long while, so you may need to get comfortable with this...
I'm guessing that your DB is rule-heavy; this is a tricky area for multi-threaded start-up, and the costs can outweigh the benefits, though your case is extreme.
You might want to look at persistent feeders instead to shorten start-up times, though be prepared for a ramp-up in disc consumption (a tm1s.cfg sketch for both points follows below).
Obviously I don't know anything about your DB, but something about the design feels off to be triggering these issues; what you have is at least unusual...
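As a rough guide, the two tests and the persistent feeders option boil down to a couple of tm1s.cfg settings. The parameter names (MaximumCubeLoadThreads, PersistentFeeders) are taken from the 9.5.2-era documentation and are assumptions as far as your exact build is concerned, so verify them against your release before changing anything:
Code: Select all
# tm1s.cfg sketch -- parameter names assumed from the 9.5.2-era docs.

# Test 1: current behaviour, multi-threaded cube/feeder load at start-up
# (for example, one load thread per core):
MaximumCubeLoadThreads=12

# Test 2: multi-threaded start-up turned off:
MaximumCubeLoadThreads=0

# Optional: persistent feeders to shorten start-up, at the cost of
# .feeders files on disc:
PersistentFeeders=T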
Technical Director
www.infocat.co.uk
- lotsaram
- MVP
- Posts: 3698
- Joined: Fri Mar 13, 2009 11:14 am
- OLAP Product: TableManager1
- Version: PA 2.0.x
- Excel Version: Office 365
- Location: Switzerland
Re: Memory Usage Issues - StatsByCube
chewza wrote: ↑Wed May 13, 2020 9:16 am
... just read that you are supposed to ADD TOGETHER the "Memory Used" and the "Memory in Garbage".
So adding this together will give me around 135GB, but the OS only has 86GB in total, and the Windows process list says that TM1 is consuming 78GB.
Also, the StatsByCube cube says that it is consuming 32GB.
Doesn't seem to make sense?

Don't know where you read this, but it's wrong!
"Total Memory Used" INCLUDES "Memory in Garbage". You SUBTRACT "Memory in Garbage" from "Total Memory Used" to see how much memory your model is currently actively using. Unfortunately the }Stats measures don't include any counters for memory used by dimensions and indices, only cubes, so there will always be a bit of a gap between "Memory used for cubes" and Total Memory less Garbage Memory.
And yes, TM1's PerfMon does seem to be a little buggy, as the numbers seldom reconcile perfectly either internally or with the OS counters.
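To make the subtraction concrete, here it is applied roughly to the figures posted earlier in the thread (treat it as an approximate reconciliation only, given the counter quirks just mentioned):
Code: Select all
Total Memory Used                         ~85 GB
less Memory in Garbage                    ~49 GB
= actively used by the model              ~36 GB

Memory used for cubes (StatsByCube)       ~32 GB
unaccounted for (dims, indices, etc.)      ~4 GB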
TM1 will re-use "Memory in garbage". That's the whole point! TM1 may be reluctant to ever give memory it has reserved back to the OS. But it certainly does recycle the garbage memory internally.
The point about restarting the TM1 service weekly is certainly valid. The same goes for restarting the server as a whole!
Please place all requests for help in a public thread. I will not answer PMs requesting assistance.
- chewza
- Regular Participant
- Posts: 156
- Joined: Tue Aug 17, 2010 11:51 am
- OLAP Product: TM1
- Version: 9.5
- Excel Version: 7
Re: Memory Usage Issues - StatsByCube
Hi Guys
Really appreciate the responses!! Will follow your advice, and revert.
Regards
Chris
- ykud
Re: Memory Usage Issues - StatsByCube
I'm pretty sure you know this trick, but, just in case, this will give you more info on how memory is used:
1. In tm1s.cfg, set the following parameter:
   TrackMemoryPools=T
2. Restart the server.
3. Once the server is up, create a new TI process with the following line in the Prolog:
   DebugUtility( 134, -1, 0, 'MemoryPool_Dump.txt', '', '' );
This will create a report on the allocated memory pools, by category. The syntax is as follows:
DebugUtility( 134, <pool category number>, 0, <output dump file path>, '', '' );
If <pool category number> = -1, you get overall counts for each pool category (categories are in groups of 100).
From the resulting report, if there are one or two categories that are particularly large, you can run the process again, but this time specify the pool category group, e.g. "800".
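For example, following that step, re-running the dump for a single large group would look like the line below; 800 is just the illustrative group number mentioned above, so substitute whichever group stands out in your own overall dump:
Code: Select all
# Prolog: dump detail for one pool category group only.
# 800 is the example group from the post above -- use whichever is large.
DebugUtility( 134, 800, 0, 'MemoryPool_Dump_800.txt', '', '' );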
- chewza
- Regular Participant
- Posts: 156
- Joined: Tue Aug 17, 2010 11:51 am
- OLAP Product: TM1
- Version: 9.5
- Excel Version: 7
Re: Memory Usage Issues - StatsByCube
Hi
Thanks very much for all the advice.
In summary, the garbage accounts for more than half the memory. I disabled multi-threaded startup, and the memory usage almost halved.
We have 12 virtual cores, and all of them were previously allocated for start-up. I played around and found that 6 is a workable number. Start-up takes a fair bit longer (but is still workable), and memory usage is no longer high enough to cause problems for the OS and the other server.
It is a real pity that you cannot clear the garbage memory, as that would allow you to allocate maximum resources for quick start-ups of the model. I assume this is not possible?
Thanks guys!!!
Regards
Chris
- Wim Gielis
- MVP
- Posts: 3222
- Joined: Mon Dec 29, 2008 6:26 pm
- OLAP Product: TM1, Jedox
- Version: PAL 2.1.5
- Excel Version: Microsoft 365
- Location: Brussels, Belgium
- Contact:
Re: Memory Usage Issues - StatsByCube
Maybe through a secret number with the DebugUtility command that only ykud seems to know.
Well, I knew 134 should do something useful, but I did not know that you have to use -1 (or multiples of 100), and certainly not that an entry in the tm1s.cfg file is needed.
Best regards,
Wim Gielis
IBM Champion 2024-2025
Excel Most Valuable Professional, 2011-2014
https://www.wimgielis.com ==> 121 TM1 articles and a lot of custom code
Newest blog article: Deleting elements quickly
- chewza
- Regular Participant
- Posts: 156
- Joined: Tue Aug 17, 2010 11:51 am
- OLAP Product: TM1
- Version: 9.5
- Excel Version: 7
Re: Memory Usage Issues - StatsByCube
DebugUtility?
- Wim Gielis
- MVP
- Posts: 3222
- Joined: Mon Dec 29, 2008 6:26 pm
- OLAP Product: TM1, Jedox
- Version: PAL 2.1.5
- Excel Version: Microsoft 365
- Location: Brussels, Belgium
- Contact:
Re: Memory Usage Issues - StatsByCube
Did you read ykud's post above? Did you experiment with DebugUtility in a TI process? Use numbers 134 and 123 for useful information.
Turn on the Performance Monitor and also make sure the TrackMemoryPools entry in the tm1s.cfg file is present. Then restart TM1.
Code: Select all
# Templates:
# DebugUtility( 134, -1, 0, 'Debug TM1_134.txt', '', '' );
# DebugUtility( 123, A, B, 'Debug TM1_123.txt', '', '' );
#
# A =  0 if control objects are needed
# A = -1 if control objects are not needed
# B = -1 if you want: number of overfed cells / fed cells that evaluate to zero / STET cells
# B =  0 if you do not need those counts
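For instance, plugging in one combination from the comments above (control objects included, and the overfed / zero-fed / STET counts requested), the calls become:
Code: Select all
DebugUtility( 134, -1,  0, 'Debug TM1_134.txt', '', '' );
DebugUtility( 123,  0, -1, 'Debug TM1_123.txt', '', '' );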
Best regards,
Wim Gielis
IBM Champion 2024-2025
Excel Most Valuable Professional, 2011-2014
https://www.wimgielis.com ==> 121 TM1 articles and a lot of custom code
Newest blog article: Deleting elements quickly
- Wim Gielis
- MVP
- Posts: 3222
- Joined: Mon Dec 29, 2008 6:26 pm
- OLAP Product: TM1, Jedox
- Version: PAL 2.1.5
- Excel Version: Microsoft 365
- Location: Brussels, Belgium
- Contact:
Re: Memory Usage Issues - StatsByCube
A simple loop to generate the output of DebugUtility parameter 134:
Note the requirements above for this to give output.
Code: Select all
# overall stats
cFile = 'TM1 stats_-1.txt';
DebugUtility( 134, -1, 0, cFile, '', '' );
# detailed stats
i = 0;
While( i <= 1100 );
    cFile = 'TM1 stats_' | NumberToString( i ) | '.txt';
    DebugUtility( 134, i, 0, cFile, '', '' );
    i = i + 100;
End;
Best regards,
Wim Gielis
IBM Champion 2024-2025
Excel Most Valuable Professional, 2011-2014
https://www.wimgielis.com ==> 121 TM1 articles and a lot of custom code
Newest blog article: Deleting elements quickly
- scrumthing
- Posts: 81
- Joined: Tue Jan 26, 2016 4:18 pm
- OLAP Product: TM1
- Version: 11.x
- Excel Version: MS365
Re: Memory Usage Issues - StatsByCube
Thanks for the clarification, lotsaram! If TM1 never reused garbage memory, most servers would explode in no time.
But as I was thinking about it, I came upon a question (not sure how useful my superficial knowledge is here). Aren't there some limitations on reusing garbage memory? As I understand it, the server only reuses garbage memory if there is enough of it; otherwise no garbage memory is used. For example, if a view needs 4 GB of memory and there is only 3 GB of garbage memory available, the server takes another 4 GB of unused memory and, after the view is no longer needed, adds it to the existing 3 GB of garbage, for a total of 7 GB. Right or wrong?
There is no OLAP database besides TM1!
- Wim Gielis
- MVP
- Posts: 3222
- Joined: Mon Dec 29, 2008 6:26 pm
- OLAP Product: TM1, Jedox
- Version: PAL 2.1.5
- Excel Version: Microsoft 365
- Location: Brussels, Belgium
- Contact:
Re: Memory Usage Issues - StatsByCube
Refer to this thread:
https://www.tm1forum.com/viewtopic.php?f=21&t=15310
if you want an out-of-the-box solution that creates a new }Stats cube and populates it with the pool statistics.
There is no need anymore to work with literally thousands of text files.
Best regards,
Wim Gielis
IBM Champion 2024-2025
Excel Most Valuable Professional, 2011-2014
https://www.wimgielis.com ==> 121 TM1 articles and a lot of custom code
Newest blog article: Deleting elements quickly
- ykud
Re: Memory Usage Issues - StatsByCube
scrumthing wrote: ↑Mon May 18, 2020 11:25 am
For example, if a view needs 4 GB of memory and there is only 3 GB of garbage memory available, the server takes another 4 GB of unused memory and, after the view is no longer needed, adds it to the existing 3 GB of garbage, for a total of 7 GB. Right or wrong?

I'd bet it'll be more a case of ~3 GB taken from garbage plus ~1 GB from the OS, with 4 GB left in garbage after the view is discarded (this doesn't happen straight away, though).
TM1 asks for memory in OS page-size chunks, so it'll be asking for 4 GB worth of 4096-byte (on Windows) pages and will re-use as many pages as it can from the garbage it has already allocated.
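To put rough numbers on that, assuming the standard 4 KB Windows page size referred to above:
Code: Select all
4 GB request    = 4 * 1024 * 1024 KB / 4 KB = 1,048,576 pages
~3 GB of these  =   786,432 pages recycled from the existing garbage
~1 GB shortfall =   262,144 pages newly committed from the OS
After the view is discarded, roughly the whole 4 GB ends up back in
garbage (not 3 GB + 4 GB = 7 GB).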
- ykud
Re: Memory Usage Issues - StatsByCube
Wim Gielis wrote: ↑Mon May 18, 2020 8:31 am
Maybe through a secret number with the DebugUtility command that only ykud seems to know

Don't know any such commands. I keep hearing that there's some work on making TM1 release unused memory back to the OS, but I don't think I've seen any official announcement. Having said that, I do see PA (2.0.6 onwards) server memory both increase and decrease over time on some servers, so I think something like this is already happening.
The only reason I know about this is that I was trying to get to the bottom of PA upgrade memory growth, as described here:
https://ykud.com/blog/cognos/tm1-cognos ... ll-subsets
Wim Gielis wrote: ↑Tue May 19, 2020 8:37 pm
No need anymore to work with literally thousands of text files.

I'd be careful with that: each TM1 object creates a number of memory pools (i.e. you'd see named pools for subsets, views, etc.). I think this is useful for an overall view of what's consuming memory (cubes vs dims vs subsets, plus maybe some analysis of the top contributors), but I wouldn't go through each of the pools or load them back into TM1; I'm pretty sure that running such an upload would generate more pools, which you'd then load back in, and so on.
And the usual disclaimer around DebugUtility: it has potential performance impacts, is dangerous and unsupported, so don't use it.