PreallocatedMemory.Size and Model Startup

chewza
Posts: 146
Joined: Tue Aug 17, 2010 11:51 am
OLAP Product: TM1
Version: 9.5
Excel Version: 7

PreallocatedMemory.Size and Model Startup

Post by chewza »

Hi there

Anyone know exactly what this does?
So if I set it to 75000, will this ensure that TM1 only uses 75GB of RAM?

I have set it to this, but as my model starts up, I see that RAM usage is hovering around 95%.

These are the config parameters I have used:

MTCubeLoad=T
MTQ=All
MTCubeLoad.UseBookmarkFiles=T
IndexStoreDirectory=D:\Models\IndexStore

# 75GB of RAM to preallocate
PreallocatedMemory.Size=75000
# Run preallocation in parallel to cube cell/feeder loading
PreallocatedMemory.BeforeLoad=F
# Windows 2012 patches as of Dec 2016 worked most efficiently with a single thread
PreallocatedMemory.ThreadNumber=1

# Disable TM1 performance counters to speed up MTQ.
PerfMonIsActive=F


Many thanks in advance!!

Regards
Chris
lotsaram
MVP
Posts: 3651
Joined: Fri Mar 13, 2009 11:14 am
OLAP Product: TableManager1
Version: PA 2.0.x
Excel Version: Office 365
Location: Switzerland

Re: PreallocatedMemory.Size and Model Startup

Post by lotsaram »

PreallocatedMemory.Size is an optional parameter for fine tuning MTCubeLoad. It is quite poorly (or really not at all) documented.

TM1 is an entirely in-memory database. The memory consumed is really related to one thing only: the size of the database. You can't constrain the memory by setting any parameter.

Presumably this parameter works by preallocating memory up front, which saves the TM1 server from requesting additional memory from the OS as needed while reading each cube, and this leads to some efficiencies. It certainly doesn't place any limit on the total memory used by the TM1 server. Usually these optional fine-tuning parameters are only used in specific circumstances under advice from IBM, so it's best to ask IBM support for more information about what it does and how it works. (Chances are, though, that someone around here might have already received such information and be prepared to share.)
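
If it helps to picture the idea, here is a minimal, purely illustrative C sketch of the general "preallocate once, carve chunks later" pattern. None of the names reflect TM1 internals (which aren't documented); it just shows why one large up-front request can beat many small requests to the allocator/OS during a load.

Code: Select all

/* Illustrative sketch only, NOT TM1's actual implementation: reserve one
 * large block before loading starts, then hand out chunks from it instead
 * of calling the system allocator for every cube/feeder structure. */
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    char  *base;   /* start of the preallocated block */
    size_t size;   /* total bytes reserved up front   */
    size_t used;   /* bytes handed out so far         */
} Pool;

/* Reserve the whole block once, before loading starts. */
static int pool_init(Pool *p, size_t bytes) {
    p->base = malloc(bytes);
    p->size = bytes;
    p->used = 0;
    return p->base != NULL;
}

/* Hand out chunks from the reserved block; no further OS requests
 * until the pool is exhausted. */
static void *pool_alloc(Pool *p, size_t bytes) {
    if (p->used + bytes > p->size)
        return NULL;              /* pool exhausted: caller would fall back to malloc */
    void *chunk = p->base + p->used;
    p->used += bytes;
    return chunk;
}

int main(void) {
    Pool pool;
    if (!pool_init(&pool, 64 * 1024 * 1024)) {   /* 64 MB up front for the demo */
        fprintf(stderr, "preallocation failed\n");
        return 1;
    }
    /* Simulate many small per-object allocations during a load. */
    for (int i = 0; i < 1000; i++) {
        if (!pool_alloc(&pool, 4096))
            break;
    }
    printf("used %zu of %zu bytes from the pool\n", pool.used, pool.size);
    free(pool.base);
    return 0;
}

The key point for the original question: a pool like this reserves memory early but doesn't cap what the process can grow to afterwards, which is consistent with the server still climbing to 95% RAM during load.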
Please place all requests for help in a public thread. I will not answer PMs requesting assistance.