Potential PI issue with Dimension Updates

PlanningDev
Community Contributor
Posts: 349
Joined: Tue Aug 17, 2010 6:31 am
OLAP Product: Planning Analytics
Version: 2.0.5
Excel Version: 2016

Potential PI issue with Dimension Updates

Post by PlanningDev »

We have run into what we believe may be an issue with PI and updating dimensions. Just looking to get some thoughts or confirmation on this while we run some of our own tests.

1. In the prolog of the dimension update we execute a separate process to delete all consolidated items within the dimension.
2. The metadata tab inserts and updates new items.
3. The data tab does a standard attribute update.
4. The epilog calls another TI to assign all N-level elements with no parent to an orphan consolidation (roughly the shape sketched below).
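For context, the orphan assignment in step 4 looks something like this (a minimal sketch only; 'MyDim' and 'Orphans' are placeholder names, not our real objects):

# Epilog of the called TI (sketch): attach parentless N elements to an orphan consol
IF( DIMIX( 'MyDim', 'Orphans' ) = 0 );
  DimensionElementInsert( 'MyDim', '', 'Orphans', 'C' );
ENDIF;
i = 1;
WHILE( i <= DIMSIZ( 'MyDim' ) );
  el = DIMNM( 'MyDim', i );
  IF( DTYPE( 'MyDim', el ) @= 'N' & ELPARN( 'MyDim', el ) = 0 );
    DimensionElementComponentAdd( 'MyDim', 'Orphans', el, 1 );
  ENDIF;
  i = i + 1;
ENDWHILE;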

Here's the problem: step 1 never appears to get committed. In other words, when the TI is finished we still have N-level elements assigned to their old parents. Here's why we think there is a problem with PI: when we run each process individually, there is no issue.

Is it possible that PI is supposed to run the first process, but since the data is not committed until the very end, the step 1 changes are getting wiped out by the metadata tab changes?
lotsaram
MVP
Posts: 3648
Joined: Fri Mar 13, 2009 11:14 am
OLAP Product: TableManager1
Version: PA 2.0.x
Excel Version: Office 365
Location: Switzerland

Re: Potential PI issue with Dimension Updates

Post by lotsaram »

I can't see that parallel interaction would have anything to do with this. The method used by the TI called on the prolog, maybe, or something to do with the combination and order in which the processes run, but I doubt very much that the behaviour would be any different with or without parallel interaction. Or did you mean something else by PI?

What is the method used by the called process to delete all consols? Is it simply running a while loop over the dimension and deleting all elements with DTYPE C, or is it assigning subset All of the dimension as a data source and doing the removal on the metadata tab? If it's the latter then I can see how you could have the issue you describe; if it's the former then it should work, and if it doesn't I'd call it a bug. Any metadata changes done on the prolog and epilog should be committed instantly (which is why it's usually better to do it on metadata, since it saves much unnecessary writing to disk).
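To be clear, by the while loop method I mean something like this (a sketch only; 'MyDim' is a placeholder). Looping backwards matters, since deleting an element shifts the indices of everything after it:

# Sub-process (sketch): delete every consolidated element in the dimension
i = DIMSIZ( 'MyDim' );
WHILE( i >= 1 );
  el = DIMNM( 'MyDim', i );
  IF( DTYPE( 'MyDim', el ) @= 'C' );
    DimensionElementDelete( 'MyDim', el );
  ENDIF;
  i = i - 1;
ENDWHILE;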

As you are using 9.5.2, the new DimensionElementComponentDeleteDirect function could also solve your issue.
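For example, to detach the children of a consolidation with each change committed immediately rather than via a deferred dimension copy, something along these lines (a sketch; the dimension and element names are placeholders):

# Sketch: unwind one consolidation using the Direct variant
cons = 'Total MyDim';
n = ELCOMPN( 'MyDim', cons );
WHILE( n >= 1 );
  child = ELCOMP( 'MyDim', cons, n );
  DimensionElementComponentDeleteDirect( 'MyDim', cons, child );
  n = n - 1;
ENDWHILE;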
Duncan P
MVP
Posts: 600
Joined: Wed Aug 17, 2011 1:19 pm
OLAP Product: TM1
Version: 9.5.2 10.1 10.2
Excel Version: 2003 2007
Location: York, UK

Re: Potential PI issue with Dimension Updates

Post by Duncan P »

Before running the sub-process to delete all consolidations, do you do anything in the calling process to modify the dimension in any way?

If you do, then from that point onwards the calling process will be seeing a private copy of the dimension. The sub-process will see and modify the public copy and then save it. The changes to the public copy are, at that point, not reflected in the private copy being used by the calling process. After the metadata changes have been made, that private copy is saved, overwriting the changed public copy.

A way to avoid this is to have a master process that wraps both the consolidation deletion process and the dimension update process. Then each of the sub-processes will see and modify sequential public versions of the dimension.
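In other words, something like this in the master process's prolog (a sketch; the process and parameter names are placeholders):

# Master process prolog (sketch): each sub-process opens, modifies and saves
# the public copy of the dimension in turn, so neither overwrites the other
rc = ExecuteProcess( 'Dim.DeleteConsols', 'pDim', 'MyDim' );
IF( rc <> ProcessExitNormal() );
  ProcessQuit;
ENDIF;
ExecuteProcess( 'Dim.Update', 'pDim', 'MyDim' );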
PlanningDev
Community Contributor
Posts: 349
Joined: Tue Aug 17, 2010 6:31 am
OLAP Product: Planning Analytics
Version: 2.0.5
Excel Version: 2016

Re: Potential PI issue with Dimension Updates

Post by PlanningDev »

Just found the problem.

The ExecuteProcess to remove the C-level elements was the first thing AFTER the generated statements in the prolog, and unfortunately the generated statements had a dimension sort in them. It does appear that dimension updates work on copies of dimensions, and the copy seems to begin with the first dimension function. Subsequent calls to processes that modify the dimension in question don't seem to be committed back to that working copy.

Moving the process call up so it is the VERY FIRST item worked.
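For anyone who hits the same thing, the prolog now has roughly this shape (a sketch; the process and dimension names are placeholders):

# Prolog (sketch): call the sub-process before ANY statement that touches the
# dimension (the generated dimension sort included), so it runs against the
# public copy before a private working copy is spun off
ExecuteProcess( 'Dim.DeleteConsols', 'pDim', 'MyDim' );
# ... generated statements / dimension sort follow here ...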
tomok
MVP
Posts: 2831
Joined: Tue Feb 16, 2010 2:39 pm
OLAP Product: TM1, Palo
Version: Beginning of time thru 10.2
Excel Version: 2003-2007-2010-2013
Location: Atlanta, GA

Re: Potential PI issue with Dimension Updates

Post by tomok »

However tempting it may be, you shouldn't mix wizard-generated code with your own snippets of code. I'm not saying this is the reason you had a problem, but I would avoid it like the plague. Just another thing in a long list of things you should avoid like the plague in TM1. Someone should write a book, "Things Not to do in TM1".
Tom O'Kelley - Manager Finance Systems
American Tower
http://www.onlinecourtreservations.com/
jpm_de
Posts: 22
Joined: Thu Jun 10, 2010 5:19 pm
OLAP Product: TM1
Version: 10.2.2 FP3
Excel Version: 2010

Re: Potential PI issue with Dimension Updates

Post by jpm_de »

Yeah, I would love to see some best-practice classes and documents. Well, people tend to have to learn a lot of things the hard way ;-D

I came across these topics when I was designing a master process management concept to get more flexible chore management functionality.
By the way, I am wondering what the recent innovations can do. Up to now, I have never had time to test the design implications of the "Direct" functions, e.g. in combination with multi-commit chores.
Why TM1? Because ...with great dimensionality there must also come -- great responsibility!
(http://www.quotecounterquote.com/2012/0 ... great.html)
paulsimon
MVP
Posts: 808
Joined: Sat Sep 03, 2011 11:10 pm
OLAP Product: TM1
Version: PA 2.0.5
Excel Version: 2016

Re: Potential PI issue with Dimension Updates

Post by paulsimon »

Hi

I have developed a master process cube as well. One of the main advantages is that parameters are a lot more visible than they are in a Chore, and you get automatic stats such as process run times. It can be easy to get things done just by pasting in calls to standard processes and amending parameters.

As far as the Direct functions go, I have used them in two ways:

1) When doing a data load where there is a risk of encountering new elements. Using the Direct functions on the data tab means you don't need to do two passes over the data to ensure that all required elements exist before loading (see the sketch after this list).

2) When updating some very large dimensions, I created a scheme whereby a temp copy of the dimension was built using standard dimension functions, then compared to the real dimension, and the changes were applied using the Direct functions. The aim was to cut down the period of locking on the real dimension and its related cubes.
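For (1), the data tab pattern is roughly this (a sketch only; the cube, dimension and variable names are placeholders):

# Data tab (sketch): add unknown elements on the fly; Direct inserts are
# committed immediately, so no separate metadata pass over the source is needed
IF( DIMIX( 'Product', vProduct ) = 0 );
  DimensionElementInsertDirect( 'Product', '', vProduct, 'N' );
ENDIF;
CellPutN( vValue, 'Sales', vProduct, vMeasure );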

As a general rule of thumb, the Direct functions should not be used if you are going to update more than 20% of the elements in a dimension.

I don't think you will get any issues using the Direct functions in a multi-commit chore, if that is what you mean, as by nature the Direct functions commit their changes immediately.

If you use the Direct functions, there is a need to periodically run DimensionUpdateDirect to compress and optimise the dimension.
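That is just a one-liner run as a periodic housekeeping step (sketch; 'Product' is a placeholder):

# Housekeeping (sketch): rebuild/compact a dimension maintained via Direct calls
DimensionUpdateDirect( 'Product' );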

Regards

Paul Simon