TM1 Performance

manu0521
Posts: 124
Joined: Wed Nov 26, 2014 8:32 pm
OLAP Product: IBM TM1, Planning Analytics
Version: PA 2.0.5
Excel Version: 2016

TM1 Performance

Post by manu0521 »

Hi TM1 Users,

I have been using a sales cube with month-level detail for a few years, and performance has been okay.

Now I have added a few dimensions to the cube and made it day level. I don't have any complex rules, just loading data from a data warehouse.

Cube memory usage shows 40 GB being used. I am not sure if I should keep the day level, but I definitely wanted to.

I have 5 years of historical data, around 25 million rows of fact data.

I have around 13 to 14 dimensions, and it's very hard to remove them. Is 40 GB of memory acceptable for a decent-sized cube?

Any other comments on how much memory is acceptable? I could also split this into 2 cubes, one with month-level data and the other with day-level data for 2 years.

Any suggestions are appreciated.
gtonkin
MVP
Posts: 1259
Joined: Thu May 06, 2010 3:03 pm
OLAP Product: TM1
Version: Latest and greatest
Excel Version: Office 365 64-bit
Location: JHB, South Africa
Contact:

Re: TM1 Performance

Post by gtonkin »

I think you will get vague answers, as without being able to see the cube it is difficult to say.
25 million rows does not seem particularly big at all.

Have you run Performance Monitor to get an idea of where the memory is going? Input data, calculations etc.?
If you have string fields in your measures, these will obviously require more memory than numerics.
Have you looked at re-ordering dimensions if the memory for input data per cell is very high? Lots of good articles here and elsewhere on this topic.

What is the size on disk?
Any alternate hierarchies? These seem to add a lot of overhead from what I have seen in my models.

Have you got separate Year, Month and Day dimensions or a continuous time dimension?

If you were going to split into two cubes, I would probably do numerics vs. strings rather than month vs. day, but would need more information to guide you.
BR, George.

Learn something new: MDX Views
Steve Rowe
Site Admin
Posts: 2456
Joined: Wed May 14, 2008 4:25 pm
OLAP Product: TM1
Version: TM1 v6,v7,v8,v9,v10,v11+PAW
Excel Version: Nearly all of them

Re: TM1 Performance

Post by Steve Rowe »

Just to reiterate what gtonkin has said.

Once you have large datasets in TM1, the dimension order in the cube becomes critical, and you should reorder the dimensions from the right-click menu against the cube in Perspectives.

The rule of thumb for dimension order is: smallest sparse to largest sparse, then smallest dense to largest dense.

Dense means "if I keep all the other intersections the same, what is the likelihood of the next element being populated?" If this percentage is high, it is a dense dimension; if it is low, it is sparse. Time dimensions are usually the most dense, i.e. if I have a sale this month then I'll probably have one next month.
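That definition can be turned into a rough calculation. A minimal sketch (toy data and a hypothetical helper, not anything TM1 provides) that estimates each dimension's density as the average fraction of its elements populated while the other coordinates are held fixed:

```python
from collections import defaultdict

def dimension_density(cells, dim_sizes):
    """Estimate per-dimension density from a set of populated cells.

    cells: set of coordinate tuples, one per populated cell.
    dim_sizes: leaf element count of each dimension.
    For each dimension, hold the remaining coordinates fixed and ask what
    fraction of that dimension's elements is populated, then average.
    """
    densities = []
    for d, size in enumerate(dim_sizes):
        groups = defaultdict(int)  # other coordinates -> populated count
        for cell in cells:
            groups[cell[:d] + cell[d + 1:]] += 1
        densities.append(sum(groups.values()) / len(groups) / size)
    return densities

# Toy cube: 2 products x 3 months.  Product A sells only in Jan,
# product B in Jan and Feb; Mar has no sales at all.
cells = {("A", "Jan"), ("B", "Jan"), ("B", "Feb")}
print(dimension_density(cells, [2, 3]))  # -> [0.75, 0.5]
```

In this toy data the product dimension comes out denser (0.75) than the month dimension (0.5), because the unused month drags time's density down; by the rule of thumb, the sparser dimension would then go earlier in the order.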

It is highly likely that your longest time dimension (day?) should go in the last position (assuming no strings in the cube).

But be warned that it will take some time to re-order a cube of that size, and it could consume potentially another 40GB of RAM. If you are still in development, you might be better off deleting the cube and building it with the dimensions in your best-estimate order.
Technical Director
www.infocat.co.uk
gtonkin

Re: TM1 Performance

Post by gtonkin »

Steve Rowe wrote: Fri May 28, 2021 7:48 am ...
Once you have large datasets in TM1, the dimension order in the cube becomes critical, and you should reorder the dimensions from the right-click menu against the cube in Perspectives.
...
Please do any re-ordering in Perspectives - PAW lets you reorder but does not give you any indication of the expected reduction or growth.
You could very likely crash your server if you get it wrong in PAW.

p.s. Have added to my long list of May AMA questions but may need to convert it to an Idea.
manu0521

Re: TM1 Performance

Post by manu0521 »

Hi, just wanted to provide an update on this. I did a reorder in Perspectives, but used my own order based on your comments. I picked all the dense dimensions like time period, customers etc. and moved them towards the end of the order; the memory, which was around 38 GB, came down to around 18 GB when I deleted the cube and recreated it with the new order.
Is this real, or am I looking at something wrong?
So by putting dense items at the end, storage and retrieval happen faster? Any simple scenario as an example would be super helpful to understand this.
I read an example saying a time dimension is dense because, keeping the other dimension values constant, you can have a sale for different time periods. So the denser dimension is placed last in the order, but how does this help with memory consumption?
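One simplified way to picture the memory effect (a conceptual model only, not TM1's actual storage format): if cells sharing the same leading coordinates can be packed together, then the fewer distinct leading-coordinate combinations there are, the smaller the index overhead, and putting the dense dimension last minimises those combinations. A toy illustration with hypothetical figures:

```python
# Toy data: 3 customers, each selling exactly one of 5 products (sparse),
# with sales in every one of 4 periods (dense time).
populated = {(c, c % 5, t) for c in range(3) for t in range(4)}

def index_entries(cells, order):
    """Count distinct coordinate prefixes under a dimension order -- a rough
    proxy for index size when runs of the last dimension are packed together."""
    return sum(len({tuple(cell[i] for i in order[:k]) for cell in cells})
               for k in range(1, len(order)))

print(index_entries(populated, (0, 1, 2)))  # customer, product, time -> 6
print(index_entries(populated, (2, 0, 1)))  # time, customer, product -> 16
```

With the dense time dimension last, the same 12 populated cells need far fewer prefix entries (6 vs. 16), which mirrors the kind of reduction you saw after reordering.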

Thanks for all valuable suggestions.
gtonkin

Re: TM1 Performance

Post by gtonkin »

Going to point you to these articles for a better understanding:
What exactly is cube optimization
Optimising dimension order in cubes
manu0521

Re: TM1 Performance

Post by manu0521 »

I looked at the links.

So for my example, I have a few dimensions I wanted to ask your thoughts on.

First question: does the last dimension in the cube have to be the measure dimension, or can the measure dimension go anywhere? I am under the assumption the measure should be last.

"Rank/sort the dimensions by density, then by size"

So size is just whichever dimension has the largest number of n-level elements, and density is the percentage of populated cell values.

I have these few dimensions I wanted to choose from:

Customer: 25,000 n-level customers, and around 13,500 n-level cells have a value.

So density is 54% filled.

Items: There are around 225,000 n-level items, but only 20,000 items have a value.

So density is 8.9% filled. (This is what the system option in PAx is showing as the last dimension before the sales measure, and it has more memory consumption as per the system design.)

Time period: Day-level data, around 4,000 n-level dates, but only 2,000 dates have a value, because data is loaded only for 5 years and a few years are not used.

So density is 50% filled.
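As a cross-check, the three ratios above can be computed and ranked with the "density then size" rule (a sketch only; the sparse-first, dense-last ordering and the smallest-to-largest tie-break follow the rule of thumb quoted earlier in the thread):

```python
# (name, populated n-level cells, total n-level elements) from the post
dims = [("Customer", 13_500, 25_000),
        ("Items",    20_000, 225_000),
        ("Time",      2_000,   4_000)]

for name, filled, total in dims:
    print(f"{name}: {filled / total:.1%} dense")
# Customer: 54.0% dense / Items: 8.9% dense / Time: 50.0% dense

# sparsest (lowest density) first, densest last; smaller size first on ties
ordered = [name for name, filled, total in
           sorted(dims, key=lambda d: (d[1] / d[2], d[2]))]
print(ordered)  # -> ['Items', 'Time', 'Customer']
```

By this simple ranking, Items is by far the sparsest, and Customer edges out Time for the last position.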

Earlier I had Time Period last, and my memory went down from 38 GB to 18 GB.

Which one would you guys pick? Any idea why Items is last as per the system design when its density is 9%? I understand the WHERE filter in SQL will mostly be on items, or could be on time period or customers; is the system picking Items because of its size?

Thanks,
lotsaram
MVP
Posts: 3702
Joined: Fri Mar 13, 2009 11:14 am
OLAP Product: TableManager1
Version: PA 2.0.x
Excel Version: Office 365
Location: Switzerland

Re: TM1 Performance

Post by lotsaram »

The reason it is normally stated that the measure dimension should be last is that TM1 cubes can hold both numeric and string data, and the element type of the last dimension in the cube determines the data type. However, if (as is usually the case) your cube contains only numeric data, then there is no reason at all to stick with keeping the measure dimension last. This can give you much more flexibility and allow greater memory savings.

I suggest you also look at OptimusPy, which can basically take all the guesswork out and automatically determine the optimal order.
Please place all requests for help in a public thread. I will not answer PMs requesting assistance.
lotsaram

Re: TM1 Performance

Post by lotsaram »

Steve Rowe wrote: Fri May 28, 2021 7:48 am But be warned that it will take some time to re-order a cube of that size, and it could consume potentially another 40GB of RAM. If you are still in development, you might be better off deleting the cube and building it with the dimensions in your best-estimate order.
No. Just no! Don't do this. The whole reason the re-order (or "optimize") feature exists is so that, in the GUI as well as in all formula and code references, end users and developers can use a consistent dimension ordering across all cubes. This is a very important usability feature (and it also improves efficiency and reduces errors for developers). Sure, by all means re-order the cube in development with a reduced data set to save processing time and memory. But don't rebuild it from scratch with the optimal order as the initial (i.e. GUI display) order. And the optimal order can change over time as data gets added to the cube. You don't want to be rebuilding reports and interfaces because you changed the indexing of some cubes.
manu0521

Re: TM1 Performance

Post by manu0521 »

I will try OptimusPy, but I also wanted to try calculating it manually one time. I came down to 3 dimensions.

Customer has 30,000 customers. I put customers on rows at n level, with every other dimension at Total and the measure as Sales.

There were 19,500 customers with a value. So is my density 19,500 / 30,000, or does this involve calculating the other dimensions here?

Items: 225,000 items, and 20,000 items have sales, with everything else at Total.

Time dimension: has 6 years of data, and all days will have sales, and it will grow.

Now, can anyone say what is sparse and what is dense here with respect to these three dimensions?
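For what it's worth, applying the same simple per-dimension approximation (populated n-level elements over total n-level elements, everything else at Total) to the figures in this post gives the following; this is the single-slice view rather than the stricter hold-everything-else-constant definition, and the 6 x 365 day count is an assumption:

```python
# Per-dimension approximation: populated n-level elements / total n-level
# elements, with every other dimension at Total.
dims = {
    "Customer": (19_500, 30_000),     # customers with a value
    "Items":    (20_000, 225_000),    # items with sales
    "Time":     (6 * 365, 6 * 365),   # all days populated (assumed)
}
for name, (filled, total) in dims.items():
    print(f"{name}: {filled / total:.1%}")
# -> Customer: 65.0%, Items: 8.9%, Time: 100.0%
```

On that approximation, Items is clearly the sparse dimension, Time is clearly dense, and Customer sits in between.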


Thanks,