Methods for Copying Consolidated Data via TI

fleaster
Regular Participant
Posts: 167
Joined: Wed Mar 30, 2011 11:57 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: XL2010

Methods for Copying Consolidated Data via TI

Post by fleaster »

Ok, I was planning on creating some processes to recreate the relative proportional spread function in TI - this would consist of:
A. ViewZeroOut > OK
B. Copying consolidated data > ?
C. +CellPutProportionalSpread > OK

...for step B, copying consolidated data via TI, I'm thinking the only way is to:

1. create an MDX dynamic subset that grabs all zero level elements of the consolidation
2. add the subset(s) to a view
3. use the view as a data source to copy data at leaf level

...I think this will work, but it feels very convoluted - anyone have any ideas for a more efficient method...?
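For reference, here is a minimal sketch of how steps A and C might look in TI. The cube, view and element names are made up purely for illustration, and step B - the leaf-level copy of the reference data - is the part in question:

Code: Select all

# sketch only - 'SalesCube', 'zTargetSlice' and the element names are placeholders
# Step A: clear the target slice so the copied leaf values will define the proportions
ViewZeroOut('SalesCube', 'zTargetSlice');

# Step B: copy the reference data into the target slice at leaf level (the open question)

# Step C: spread the required total over the copied leaves;
# CellPutProportionalSpread rescales the existing leaf values under the consolidation
nTargetTotal = 100000;   # in practice this would come from the input file
CellPutProportionalSpread(nTargetTotal, 'SalesCube', 'Budget', '2013', 'Total Products');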
lotsaram
MVP
Posts: 3704
Joined: Fri Mar 13, 2009 11:14 am
OLAP Product: TableManager1
Version: PA 2.0.x
Excel Version: Office 365
Location: Switzerland

Re: Methods for Copying Consolidated Data via TI

Post by lotsaram »

The method sounds about right, except that I would use static rather than dynamic subsets. If you are familiar with Bedrock this should not be too convoluted or complicated, and should need only a couple of lines of code.
fleaster
Regular Participant
Posts: 167
Joined: Wed Mar 30, 2011 11:57 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: XL2010

Re: Methods for Copying Consolidated Data via TI

Post by fleaster »

Thanks for the confirmation... OK, so re: the method of getting all leaf-level elements under a consolidation, the ways I can think of are:

1. create an MDX query that does the equivalent of an expand and filter by 0 level
2. trawl through the entire dimension, skipping elements that are not 0 level, and only including 0 level elements that have the consolidation as an ancestor
3. if the consolidation is level 1, then you can just grab the immediate children of it by looping through the index position of ELCOMP

...is that the complete list? My issues are...

(a) what if this process needs to be done across multiple dimensions, and
(b) this TI process was built to allow the user to perform relative proportional spread in bulk, so it might need to be repeated up to 100 times, once for each line of input...?
tomok
MVP
Posts: 2836
Joined: Tue Feb 16, 2010 2:39 pm
OLAP Product: TM1, Palo
Version: Beginning of time thru 10.2
Excel Version: 2003-2007-2010-2013
Location: Atlanta, GA

Re: Methods for Copying Consolidated Data via TI

Post by tomok »

Code: Select all

# Build a subset containing the immediate children of consolidation El
# (Dim, Sub and El are assumed to have been set earlier in the process)
i = 1;
SubsetCreate(Dim, Sub);
NumEls = ELCOMPN(Dim, El);
WHILE (i <= NumEls);
  Child = ELCOMP(Dim, El, i);
  SubsetElementInsert(Dim, Sub, Child, i);
  i = i + 1;
END;
Tom O'Kelley - Manager Finance Systems
American Tower
http://www.onlinecourtreservations.com/
fleaster
Regular Participant
Posts: 167
Joined: Wed Mar 30, 2011 11:57 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: XL2010

Re: Methods for Copying Consolidated Data via TI

Post by fleaster »

Hi tomok,
I thought ELCOMP/ELCOMPN only work for the immediate children of the rollup - hence my issue: if there are several different levels within the rollup, is there a more efficient way of doing it than creating an MDX (expand/filter 0 level) subset...?


Matt
rmackenzie
MVP
Posts: 733
Joined: Wed May 14, 2008 11:06 pm

Re: Methods for Copying Consolidated Data via TI

Post by rmackenzie »

Is this what you need:

Code: Select all

# Build a subset containing every leaf (N) element that rolls up to sConsolidation
sDimName = 'THINGUMMYJIG';
sSubName = 'zTemp';
sConsolidation = 'ALL THINGUMMYJIGS';
nCounter = 1;
nMaxCount = DIMSIZ ( sDimName );
nSubsetIndex = 1;
SubsetCreate ( sDimName, sSubName );
WHILE ( nCounter <= nMaxCount );
  sElemName = DIMNM ( sDimName, nCounter );
  sType = DTYPE ( sDimName, sElemName );
  nCheck = ELISANC ( sDimName, sConsolidation, sElemName );
  # keep only numeric (leaf) elements that have the consolidation as an ancestor
  IF ( nCheck = 1 & sType @= 'N' );
    SubsetElementInsert ( sDimName, sSubName, sElemName, nSubsetIndex );
    nSubsetIndex = nSubsetIndex + 1;
  ENDIF;
  nCounter = nCounter + 1;
END;
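If it helps, once a leaf subset like this exists it can be assigned to a view and that view used as the data source for the copy step. A rough sketch, with made-up cube and view names, and assuming subsets for the remaining dimensions are assigned the same way:

Code: Select all

# sketch only - cube and view names are placeholders
sCubeName = 'SalesCube';
sViewName = 'zTempCopyView';
ViewCreate ( sCubeName, sViewName );
ViewSubsetAssign ( sCubeName, sViewName, sDimName, sSubName );
# ...repeat ViewSubsetAssign for the other dimensions of the cube...
ViewExtractSkipZeroesSet ( sCubeName, sViewName, 1 );
ViewExtractSkipCalcsSet ( sCubeName, sViewName, 1 );

# point the process at the view so the Data tab receives the leaf-level cells
DatasourceType = 'VIEW';
DatasourceNameForServer = sCubeName;
DatasourceCubeview = sViewName;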
Robin Mackenzie
fleaster
Regular Participant
Posts: 167
Joined: Wed Mar 30, 2011 11:57 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: XL2010

Re: Methods for Copying Consolidated Data via TI

Post by fleaster »

Thanks - I had thought of that method, but am assuming it would involve looping through the entire dimension, checking every single element...? Given it may have to be done 100 times from an input file (+ across multiple dimensions), would this not be a big performance hit...? :)

M.
rmackenzie
MVP
Posts: 733
Joined: Wed May 14, 2008 11:06 pm

Re: Methods for Copying Consolidated Data via TI

Post by rmackenzie »

Yes, it could be a big performance hit, depending on the dimension size, structural complexity, number of alternate hierarchies, etc.

It's just another option for you - not necessarily the best.
Robin Mackenzie
fleaster
Regular Participant
Posts: 167
Joined: Wed Mar 30, 2011 11:57 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: XL2010

Re: Methods for Copying Consolidated Data via TI

Post by fleaster »

OK, thanks for the consideration :) hmm, I guess it may have to be an MDX query then... :p
rmackenzie
MVP
Posts: 733
Joined: Wed May 14, 2008 11:06 pm

Re: Methods for Copying Consolidated Data via TI

Post by rmackenzie »

Regarding performance, from one point of view the MDX route is preferable. The work the MDX has to do (to return n-level elements of a given consolidation) is limited to walking down the different drill-downs of the hierarchy:

Code: Select all

{TM1FILTERBYLEVEL( {TM1DRILLDOWNMEMBER( {[Thingummyjig].[Complex Thingummyjig Consolidation]}, ALL, RECURSIVE )}, 0)}
Whereas the TI code using ELISANC may have to do a lot of redundant processing, checking the ancestry of elements that don't roll up to your consolidation.

However, from another performance point of view, using dynamic subsets in your system can, in some circumstances, have a negative impact because they need to be recalculated whenever metadata changes. From what you've said, it sounds like you only need the subset for the duration of the TI that loads the consolidated data, so it may very well be that going down the MDX route is the best option for you.
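For completeness, a minimal sketch of that MDX route inside the TI itself - the dimension, consolidation and subset names are placeholders, and whether the resulting subset retains the MDX varies by version, so destroying it in the Epilog keeps it temporary either way:

Code: Select all

# Prolog - sketch only; names below are placeholders
sDimName = 'Thingummyjig';
sSubName = 'zTempLeaves';
sMDX = '{TM1FILTERBYLEVEL( {TM1DRILLDOWNMEMBER( {[Thingummyjig].[Complex Thingummyjig Consolidation]}, ALL, RECURSIVE )}, 0)}';

# drop any leftover subset from a previous run, then rebuild it from the MDX
IF ( SubsetExists ( sDimName, sSubName ) = 1 );
  SubsetDestroy ( sDimName, sSubName );
ENDIF;
SubsetCreatebyMDX ( sSubName, sMDX );

# ...assign the subset to the copy view and run the load...

# Epilog - clean up so the subset never persists in the model
SubsetDestroy ( sDimName, sSubName );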
Robin Mackenzie