Improve TI process time for cubes with heavy rules

Post Reply
kenship
Regular Participant
Posts: 194
Joined: Thu May 30, 2013 1:41 pm
OLAP Product: Cognos
Version: Planning Analytics 2.0.7
Excel Version: 2010

Improve TI process time for cubes with heavy rules

Post by kenship »

Hi, I'm developing a salary and benefits model. For the purpose of posting, I've simplified the model to the following:

Cubes:
- Salary Rate
- Benefits Rate
- Input
- Calculation
- Report
- Mapping

Rules:
- Input sends FTE and headcount data into Calculation
- Calculation calculates salary and each benefit based on Input, Salary Rate and Benefits Rate
- Input receives salary$ and benefits $ from Calculation

*One of the reasons to use rules is that an FTE change can take place in any month; depending on whether the change is an internal transfer or a new hire, the benefits can be very different. There are a few more reasons why we can't use a standardized, master salary and benefits table and have to rely on a rule-calculated cube instead.
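As an illustration of the rule pattern described above (the cube, dimension and measure names here are hypothetical, not the actual model):

```
# Rule in the Calculation cube: salary $ = FTE from Input x rate from Salary Rate.
['Salary $'] = N: DB('Input', !Employee, !Month, 'FTE')
                * DB('Salary Rate', !Employee, !Month, 'Rate');

# Feeder (placed in the Input cube's rule file): feed Calculation only from
# the driver that actually changes (FTE), not from the rate cubes.
FEEDERS;
['FTE'] => DB('Calculation', !Employee, !Month, 'Salary $');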

Processes:
- Input to Mapping (every 10 min, beginning :00)
- Input to Report (every 10 min, beginning :05)

-----

Problem:
The 2 processes take a long time (3 min.) whenever there is a change in Input. If Input doesn't change, the processes take literally a few seconds. I suspect view caching is helping to speed it up. However, during planning season I expect Input to change constantly during work hours for a few months. The time needed for the processes is too long for what we need. I'm looking for a way to reduce the complexity of the model and improve the time needed to run the processes.

Thanks for looking into this.

Kenneth
tomok
MVP
Posts: 2831
Joined: Tue Feb 16, 2010 2:39 pm
OLAP Product: TM1, Palo
Version: Beginning of time thru 10.2
Excel Version: 2003-2007-2010-2013
Location: Atlanta, GA
Contact:

Re: Improve TI process time for cubes with heavy rules

Post by tomok »

If you really need to run this every 10 minutes have you considered just moving the data for changed employees (assuming you have an employee dimension)?
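A hedged sketch of how that could look in TI (all object names are assumptions; the idea is to point the process's data source at a view restricted to only the employees flagged as changed since the last run):

```
# Prolog: build a source view over only the changed employees.
vCube = 'Input';
vView = 'zChangedOnly';
vSub  = 'zChangedEmployees';
ViewDestroy(vCube, vView);
SubsetDestroy('Employee', vSub);
ViewCreate(vCube, vView);
# Assumes a 'Changed Flag' measure maintained in the Input cube.
SubsetCreateByMDX(vSub,
  '{FILTER([Employee].Members, [Input].([Measure].[Changed Flag]) = 1)}');
ViewSubsetAssign(vCube, vView, 'Employee', vSub);
DataSourceCubeView = vView;
```

The flag would be set on data entry and cleared in the epilog after a successful run, so each 10-minute cycle only moves the delta.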
Tom O'Kelley - Manager Finance Systems
American Tower
http://www.onlinecourtreservations.com/
kenship
Regular Participant
Posts: 194
Joined: Thu May 30, 2013 1:41 pm
OLAP Product: Cognos
Version: Planning Analytics 2.0.7
Excel Version: 2010

Re: Improve TI process time for cubes with heavy rules

Post by kenship »

I can probably try this, thanks! But not much data would be excluded, because the model calculates the year-to-year changes, so basically every employee will see their salary and benefits amounts change every year.

Unless there's a way the cube is smart enough to identify only the changed cells.
tomok
MVP
Posts: 2831
Joined: Tue Feb 16, 2010 2:39 pm
OLAP Product: TM1, Palo
Version: Beginning of time thru 10.2
Excel Version: 2003-2007-2010-2013
Location: Atlanta, GA
Contact:

Re: Improve TI process time for cubes with heavy rules

Post by tomok »

Sorry, you presented your problem in very simplified terms so I can only give you very simplified ideas.
Tom O'Kelley - Manager Finance Systems
American Tower
http://www.onlinecourtreservations.com/
User avatar
gtonkin
MVP
Posts: 1192
Joined: Thu May 06, 2010 3:03 pm
OLAP Product: TM1
Version: Latest and greatest
Excel Version: Office 365 64-bit
Location: JHB, South Africa
Contact:

Re: Improve TI process time for cubes with heavy rules

Post by gtonkin »

Similar models I have running at clients are instantaneous and also handle joiners, leavers, transfers, annual increases, performance increases (based on tenure), benefits based on role, grading, etc. Everything is rule-based, and a change (e.g. a termination date, a salary override, a change to a benefit driver) is lightning fast.
Admittedly we are only dealing with 3,000 employees, but further detail on your environment may get you more helpful answers.
User avatar
Steve Rowe
Site Admin
Posts: 2410
Joined: Wed May 14, 2008 4:25 pm
OLAP Product: TM1
Version: TM1 v6,v7,v8,v9,v10,v11+PAW
Excel Version: Nearly all of them

Re: Improve TI process time for cubes with heavy rules

Post by Steve Rowe »

With a ten-minute refresh, by design you're very close to wanting the system "always live" anyway, so why use a TI instead of rules?

Is the issue with the performance of the TI just the elapsed time, or the fact that it is locking? If it always takes 3 minutes to run and is non-locking, your data is still updated every 10 minutes, so no problem. If it is locking, this should be a solvable problem.
kenship wrote: The 2 processes take a long time (3 min.) whenever there is a change in Input. If Input doesn't change, the processes take literally a few seconds. I suspect view caching is helping to speed it up. However, during planning season I expect Input to change constantly during work hours for a few months. The time needed for the processes is too long for what we need. I'm looking for a way to reduce the complexity of the model and improve the time needed to run the processes.
You've probably worked this out, but since you've not explicitly stated it: the performance issue is with your rules, not the TI process. The input is invalidating the pre-calculated cache of ruled values that the TI depends on, so the rule engine is recalculating before the TI executes.
Check you have MTQ set to something (PA ships with it set to 1, for example).
Check you are not over-feeding the right-hand side of the rule.
Check you are performing the calculation in as few steps as possible (i.e. if a = b * c * d * e, where b to e are all calculations of some form, don't keep b to e as separate measures).
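A sketch of that last point in rule terms (cube and measure names are hypothetical): rather than defining b to e as separate ruled measures, each needing its own feeder and cache entry, inline their formulas into the one rule that is actually reported on:

```
# Avoid: ['b'], ['c'], ['d'], ['e'] each defined as separate ruled measures
# that exist only to produce ['a'].

# Prefer: one rule that does the whole chain in a single pass, with b to e's
# formulas (here shown as simple DB lookups) folded in directly.
['a'] = N: DB('Drivers', !Employee, !Month, 'b')
         * DB('Drivers', !Employee, !Month, 'c')
         * DB('Drivers', !Employee, !Month, 'd')
         * DB('Drivers', !Employee, !Month, 'e');
```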
Technical Director
www.infocat.co.uk
kenship
Regular Participant
Posts: 194
Joined: Thu May 30, 2013 1:41 pm
OLAP Product: Cognos
Version: Planning Analytics 2.0.7
Excel Version: 2010

Re: Improve TI process time for cubes with heavy rules

Post by kenship »

tomok wrote: Thu Oct 05, 2017 9:04 pm Sorry, you presented your problem in very simplified terms so I can only give you very simplified ideas.
Fully understood. No need to say sorry at all.
kenship
Regular Participant
Posts: 194
Joined: Thu May 30, 2013 1:41 pm
OLAP Product: Cognos
Version: Planning Analytics 2.0.7
Excel Version: 2010

Re: Improve TI process time for cubes with heavy rules

Post by kenship »

gtonkin wrote: Fri Oct 06, 2017 7:29 am Similar models I have running at clients are instantaneous and also handle joiners, leavers, transfers, annual increases, performance increases (based on tenure), benefits based on role, grading, etc. Everything is rule-based, and a change (e.g. a termination date, a salary override, a change to a benefit driver) is lightning fast.
Admittedly we are only dealing with 3,000 employees, but further detail on your environment may get you more helpful answers.
Hi, thanks for reply.

I think we are quite close. In our case we have 6,000+ employees, but after grouping we have only just over 2,000 distinct line items.
Our complexity is that we need to calculate every change, and to further complicate the situation:
1. We throw in another dimension so that we can keep 3 sets of scenarios, and therefore 3 sets of calculations;
2. We run a multi-year budget, and once there's a budget restatement we recalculate the restatement for all years in the multi-year budget cycle.

This is why things are complicated.
kenship
Regular Participant
Posts: 194
Joined: Thu May 30, 2013 1:41 pm
OLAP Product: Cognos
Version: Planning Analytics 2.0.7
Excel Version: 2010

Re: Improve TI process time for cubes with heavy rules

Post by kenship »

Steve Rowe wrote: Fri Oct 06, 2017 8:15 am With a ten-minute refresh, by design you're very close to wanting the system "always live" anyway, so why use a TI instead of rules?

Is the issue with the performance of the TI just the elapsed time, or the fact that it is locking? If it always takes 3 minutes to run and is non-locking, your data is still updated every 10 minutes, so no problem. If it is locking, this should be a solvable problem.
kenship wrote: The 2 processes take a long time (3 min.) whenever there is a change in Input. If Input doesn't change, the processes take literally a few seconds. I suspect view caching is helping to speed it up. However, during planning season I expect Input to change constantly during work hours for a few months. The time needed for the processes is too long for what we need. I'm looking for a way to reduce the complexity of the model and improve the time needed to run the processes.
You've probably worked this out, but since you've not explicitly stated it: the performance issue is with your rules, not the TI process. The input is invalidating the pre-calculated cache of ruled values that the TI depends on, so the rule engine is recalculating before the TI executes.
Check you have MTQ set to something (PA ships with it set to 1, for example).
Check you are not over-feeding the right-hand side of the rule.
Check you are performing the calculation in as few steps as possible (i.e. if a = b * c * d * e, where b to e are all calculations of some form, don't keep b to e as separate measures).
Thanks Steve.

To answer:
1. I don't think locking is the issue at this time;
2. MTQ: unfortunately I don't have access to the server configuration and settings, but I'll look it up;
3. Over-feeding is certainly something I will look at;
4. Simplifying the calculation is also something I will look at.

Kenneth
User avatar
macsir
MVP
Posts: 782
Joined: Wed May 30, 2012 6:50 am
OLAP Product: TM1
Version: PAL 2.0.9
Excel Version: Office 365
Contact:

Re: Improve TI process time for cubes with heavy rules

Post by macsir »

kenship wrote: Fri Oct 06, 2017 12:36 pm
Hi, thanks for reply.

I think we are quite close. In our case we have 6,000+ employees, but after grouping we have only just over 2,000 distinct line items.
Our complexity is that we need to calculate every change, and to further complicate the situation:
1. We throw in another dimension so that we can keep 3 sets of scenarios, and therefore 3 sets of calculations;
2. We run a multi-year budget, and once there's a budget restatement we recalculate the restatement for all years in the multi-year budget cycle.

This is why things are complicated.
There is always room for performance improvement. I think there are several things you need to consider, and most of them have been said by our gurus already.
1. TI process
Try to run the TI processes in parallel if there is no locking, or even break the processes into sub-processes using the TM1RunTI command.
2. Rule
Over-feeding is definitely your main problem here. Break the entire calculation logic into small pieces and use static values where possible to feed the next step of the calculation. Always try to use TM1's natural consolidations rather than putting a sum or average in a rule.
3. Server
Use a more powerful CPU with more cores and a higher frequency.
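For the first point, a hedged example of kicking off the two processes in parallel with the TM1RunTI utility (server name, credentials and process names are placeholders, not from the original post):

```
tm1runti -server PlanningSample -user admin -pwd apple -process Input.To.Mapping
tm1runti -server PlanningSample -user admin -pwd apple -process Input.To.Report
```

Each invocation runs in its own connection, so the two loads can proceed concurrently; the same calls could be fired asynchronously from a chore via ExecuteCommand with the wait flag set to 0, provided the two processes don't contend for locks on the same objects.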
In TM1, the answer is always yes, though sometimes with a but....
http://tm1sir.blogspot.com.au/
Drg
Regular Participant
Posts: 159
Joined: Fri Aug 12, 2016 10:02 am
OLAP Product: tm1
Version: 10.2.0 - 10.3.0
Excel Version: 2010

Re: Improve TI process time for cubes with heavy rules

Post by Drg »

If your Output generates a lot of data and the receiving cube is not the final one in the calculation chain, you need to find the bottleneck in your process: is it the long calculation or the long insert?
Based on the result of that analysis, I would assume you have further logic for calculating the data, and that simply launching the TIs side by side will not give a result.

After analyzing the process, I would move on to analyzing the result of the calculation: where the heaviest data is distributed, and what opportunities are available to optimize the further calculations (it may be possible to add more offline processes to aggregate the data).

Also look at the server configuration (virtual machine).
kenship
Regular Participant
Posts: 194
Joined: Thu May 30, 2013 1:41 pm
OLAP Product: Cognos
Version: Planning Analytics 2.0.7
Excel Version: 2010

Re: Improve TI process time for cubes with heavy rules

Post by kenship »

Hi all,

Thanks for all the replies.

I gave some thought to the model and found a way to separate the calculation of salary and benefits into 2 streams:

The first stream will have salary and benefits pre-calculated for the full year, on a full-time basis; the values will be hardcoded.
The second stream will deal with the rest.

Since the vast majority of headcount falls into the first stream, I believe a lot of resources will be freed up to handle the second stream, which should significantly improve processing time.
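A sketch of what the first stream could look like as a TI data tab (cube, variable and element names are hypothetical): the pre-calculated annual amounts are written as static values, so no rule needs to fire for these employees:

```
# Data tab: hardcode the full-year, full-time salary for stream-1 employees.
nSalary = nFTE * nAnnualRate;
CellPutN(nSalary, 'Calculation', vEmployee, vScenario, vYear, 'Salary $');
```

Only the exceptions (mid-year transfers, new hires with different benefits) would then remain rule-calculated in the second stream.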

Kenneth
Post Reply