TM1 model size

pooja_11284
Posts: 14
Joined: Wed Jul 01, 2015 1:10 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: 2010

TM1 model size

Post by pooja_11284 »

How can we assess the size of a TM1 model?
Edward Stuart
Community Contributor
Posts: 248
Joined: Tue Nov 01, 2011 10:31 am
OLAP Product: TM1
Version: All
Excel Version: All
Location: Manchester

Re: TM1 model size

Post by Edward Stuart »

Yes
pooja_11284
Posts: 14
Joined: Wed Jul 01, 2015 1:10 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: 2010

Re: TM1 model size

Post by pooja_11284 »

How? Is it the same as the size of the data directory?
Edward Stuart
Community Contributor
Posts: 248
Joined: Tue Nov 01, 2011 10:31 am
OLAP Product: TM1
Version: All
Excel Version: All
Location: Manchester

Re: TM1 model size

Post by Edward Stuart »

It is always beneficial to check the Request for Assistance Guidelines

http://www.tm1forum.com/viewtopic.php?f=3&t=1037
3) Try to make the question as specific as possible. A general question like "How do I write a rule?" is difficult to answer without copying out the whole of the user manual. A question like "I have a payroll cube which contains the following dimensions {details} and a general ledger cube which contains the following dimensions {details}, and although I've read through the Rules Guide I'm still not sure how to write a rule to give me average sales per employee" is far more likely to get the answer you need.
Models can be assessed fairly easily and quickly by their space on disk and their RAM consumption, but on its own that won't really tell you anything.
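If Performance Monitor is running (right-click the server in Architect, or set PerformanceMonitorOn=T in tm1s.cfg), the }StatsByCube control cube holds per-cube memory figures. A minimal TI sketch, assuming the default control-cube layout; the exact measure names vary by version, so check the }StatsStatsByCube dimension on your own server:

```
# TI sketch: read the total memory used by one cube from }StatsByCube.
# 'Sales' is a hypothetical cube name; 'LATEST' is the most recent
# sampling interval; the figure is reported in bytes.
nBytes = CellGetN('}StatsByCube', 'Sales', 'LATEST', 'Total Memory Used');
```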

What do you want to know? Or what issue are you hitting?
pooja_11284
Posts: 14
Joined: Wed Jul 01, 2015 1:10 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: 2010

Re: TM1 model size

Post by pooja_11284 »

I am trying to create a staging cube which will send data into 19 cubes for a particular Actuals update process. I am not sure what parameters I need to assess the feasibility of the staging cube: whether it will work properly, or how badly performance will get bogged down. So I just want to gather server statistics. Below are the model statistics of my server. In case the system crashes after creating and loading such a big cube, I need to be able to justify it.

Statistics
Cubes 51 Control Cubes 137
Dimensions 93 Control Dimensions 0
Views 393 Subsets 785
Steve Rowe
Site Admin
Posts: 2417
Joined: Wed May 14, 2008 4:25 pm
OLAP Product: TM1
Version: TM1 v6,v7,v8,v9,v10,v11+PAW
Excel Version: Nearly all of them

Re: TM1 model size

Post by Steve Rowe »

The information you post doesn't really help much, since you don't talk about the expected number of data points per period, the sparsity of the data, the dimensionality of the data, the number of elements in the dimensions, the dimensionality of the cubes, the frequency of data updates, the number of different versions of data, any calculations you need to do on the data, etc.

IMO you have some design issues
1. If the data is similar enough that it fits into a single cube (the staging cube) then why do you need the 19 different other cubes?
or
1. If the data is different enough that you need 19 different cubes to hold it in then a staging cube doesn't make much sense.

If you are storing all your data both in the staging cube and in the 19 other cubes, then the system is going to be roughly twice as big as it needs to be. There should be a real-world business justification for why you want to do this. If the staging cube is just for doing work as the data transitions into the system, then you should clear it after the data processing finishes, and then its size shouldn't be an issue.
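If the staging cube is only a transit area, emptying it is a one-liner in the Epilog of the load process; a sketch with a hypothetical cube name:

```
# Epilog sketch: wipe the staging cube once the 19 target cubes are
# populated, so its memory footprint does not persist.
# 'StagingActuals' is a made-up name for illustration.
CubeClearData('StagingActuals');
```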

Not sure I'm helping but you are not giving us much to go on, plus my lunch break is over....

Cheers,
Technical Director
www.infocat.co.uk
pooja_11284
Posts: 14
Joined: Wed Jul 01, 2015 1:10 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: 2010

Re: TM1 model size

Post by pooja_11284 »

The client has a project-based business with revenue, costs and gross profit being tracked at Program / Project level. A project rolls into a unique program. Programs are categorized at a high level as Building / Repairing projects.
It tracks costs and revenue for the entire life span of the project by Period (28 days) / Year.

Model has 12 modules
Bill Labour Rate
Estimate At Completion
Overhead
Capital
Back stop
IRB
Loan Forgiveness
Income Statement
Balance Sheet
CashFlow
Consolidation
General

Each module has a couple of cubes.

Not sure if this gives any useful information. Every 28 days, actuals get loaded into the cubes (19 cubes distributed among the 12 modules). I am also not able to assess the feasibility of the staging cube, but the plan as given to me is to dump all the dimensions from the 19 cubes into one cube, load it from the DW, and then use rules to get the data into the 19 cubes, because loading the cubes directly from the DW is taking a long time. We would need to add one new element, say "DW load", to all the dimensions, to be used only while loading data (it is not used anywhere else). Not a recommended approach, but I want to try it out to see if it reduces the data load time.
Steve Rowe
Site Admin
Posts: 2417
Joined: Wed May 14, 2008 4:25 pm
OLAP Product: TM1
Version: TM1 v6,v7,v8,v9,v10,v11+PAW
Excel Version: Nearly all of them

Re: TM1 model size

Post by Steve Rowe »

So, I think what you are saying is that

You have to load a bunch of data from a data warehouse; this is too slow, and the data ends up in 19 cubes.
You think a staging cube might help with this.

I think that a staging cube might help if it reduces the amount of data you are reading from the DW because much of the data across the 19 cubes is duplicated.

but consider also

1. You need to be sure that it is the DW read that is causing the performance hit since it could also be long complex downstream feeder chains.
2. Instead of having a TI for each of the 19 cubes, have 1 (or more) TIs and then have 19 CellPutNs, this gets you to the same place as the staging cube, i.e. minimise the amount of reads from the DW.
3. Consider writing the load TIs so that they can be multi-threaded and non-locking. It has been my experience that when users complain about how long the load takes, what they are really complaining about is how long TM1 has been locked for.
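Point 2 above can be sketched roughly as follows; every cube, dimension and variable name here is made up, and the real CellPutN argument order must match each target cube's dimension order:

```
# Data tab sketch: one pass over the DW record set feeds several
# cubes, so the slow DW read happens once instead of 19 times.
CellPutN(nValue, 'Payroll', sVersion, sPeriod, sProject, sMeasure);
CellPutN(nValue, 'GeneralLedger', sVersion, sPeriod, sAccount, sMeasure);
# ...one CellPutN per target cube that takes this record
```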

Suggest you take a step back and restate what your actual business issue is, and then consider what the possible resolutions for it are, rather than diving straight into technical issues with the first design you thought of.

Cheers,
Technical Director
www.infocat.co.uk
tomok
MVP
Posts: 2832
Joined: Tue Feb 16, 2010 2:39 pm
OLAP Product: TM1, Palo
Version: Beginning of time thru 10.2
Excel Version: 2003-2007-2010-2013
Location: Atlanta, GA

Re: TM1 model size

Post by tomok »

pooja_11284 wrote:Then through rules get the data loaded into 19 cubes
An exceptionally bad idea. This will likely be a performance drag that will dwarf whatever you are experiencing with the DW loads. Unless you are building a real-time planning system that requires dynamic calculations, then for heaven's sake don't go down that path.
Tom O'Kelley - Manager Finance Systems
American Tower
http://www.onlinecourtreservations.com/
dharav
Regular Participant
Posts: 193
Joined: Wed Apr 02, 2014 6:43 pm
OLAP Product: TM1
Version: 10.2
Excel Version: 2010

Re: TM1 model size

Post by dharav »

pooja_11284 wrote:I am trying to create a staging cube which will send data into 19 cubes for a particular Actuals update process. I am not sure what parameters I need to assess the feasibility of the staging cube: whether it will work properly, or how badly performance will get bogged down. So I just want to gather server statistics. Below are the model statistics of my server. In case the system crashes after creating and loading such a big cube, I need to be able to justify it.

Statistics
Cubes 51 Control Cubes 137
Dimensions 93 Control Dimensions 0
Views 393 Subsets 785
Hi, Pooja

I personally don't recommend going with rules for this; I wouldn't even recommend thinking about it when you are sending data into 19 cubes.

It would be better if you can ask the DBA team to send a text file containing the data to the server through FTP. If you have a large data set, divide it into 2-3 files based on cube combination. Create a dynamic TI process for each file to upload the data, assign them in a batch with the TM1RunTI utility, and run the TIs.
** While creating the TI, disable cube logging in the Prolog and re-enable it in the Epilog at the end of the code.
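The logging toggle above is a pair of CubeSetLogChanges calls; 'Payroll' is a placeholder cube name:

```
# Prolog: switch off transaction logging for the target cube so the
# bulk load does not write every cell change to the transaction log.
CubeSetLogChanges('Payroll', 0);

# Epilog: switch logging back on once the load has finished.
CubeSetLogChanges('Payroll', 1);
```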

I hope this approach helps.

Regards,

DHARAV