Copy Large Cube to Cube advice

LutherPaul
Posts: 80
Joined: Tue Jun 04, 2013 3:35 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: 2010

Copy Large Cube to Cube advice

Post by LutherPaul »

Hi All,
I am trying to write a TI process to copy data from a planning cube to a reporting cube. Both are large cubes, and I am looking for the best approach in terms of performance.
We are using 10.2.2.

Thanks,
Paul.
gtonkin
MVP
Posts: 1198
Joined: Thu May 06, 2010 3:03 pm
OLAP Product: TM1
Version: Latest and greatest
Excel Version: Office 365 64-bit
Location: JHB, South Africa

Re: Copy Large Cube to Cube advice

Post by gtonkin »

Hi Paul,
You probably need to give some more information, e.g.:
-Have you looked at Batch Update vs Bulk Load? Would users need access whilst the cube is refreshing? (A minimal batch-update sketch follows this list.)
-Frequency of copying
-Once-off full load then incremental, or everything every time?
-Some idea of the structure, i.e. are you taking all N levels across all dimensions, are you reading from C levels, etc.
-Skip options considered/applicable, e.g. consolidations, calcs, zeroes
-String data and suppression gotchas
-Would rules work instead?
-What thought have you given to metadata - are the dimensions cloned, or different for the reporting requirements?

I am sure there are many more considerations but please let us have your thoughts so that everyone can respond accordingly.
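On the Batch Update point, for reference: it is only a pair of TI calls wrapped around the load, and it needs BatchUpdateEnabled=T in tm1s.cfg. A minimal sketch only - as far as I recall the flag on BatchUpdateFinish is 0 to commit the batch to memory only, 1 to also save data to disk, but check the docs for your fix pack:

# Prolog of the copy process
BatchUpdateStart;

# Data tab does the usual CellPutN work against the reporting cube

# Epilog of the copy process
BatchUpdateFinish( 0 );   # 0 = commit batch to memory only; 1 also saves data to disk
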
LutherPaul
Posts: 80
Joined: Tue Jun 04, 2013 3:35 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: 2010

Re: Copy Large Cube to Cube advice

Post by LutherPaul »

Thanks gtonkin for the reply. Sorry it took me a while to reply on this.
I have another question. The view built to copy the data uses MTQ to retrieve, but the copy itself is single-threaded (I think it writes one row at a time). Is there a multi-threaded copy?

We are on TM1 10.2.2 FP4.

Answers are below.
-Have you looked at Batch Update vs Bulk Load? Would users need access whilst the cube is refreshing? --> BulkLoadMode was taking longer than a regular TI load, and we have a ticket open with IBM. Never tried Batch Update.
-Frequency of copying --> Daily
-Once-off full load then incremental, or everything every time? --> Plan data can change at any time, or all the time.
-Some idea of the structure, i.e. are you taking all N levels across all dimensions, are you reading from C levels, etc.? --> Reading all N levels across all dimensions - leaf-level elements.
-Skip options considered/applicable, e.g. consolidations, calcs, zeroes --> Skipping zeroes and reading all rule-calculated cells (see the view sketch after these answers).
-String data and suppression gotchas --> None
-Would rules work instead? --> No
-What thought have you given to metadata - are the dimensions cloned, or different for the reporting requirements? --> Dimensions for the reporting cube are different from planning.
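For anyone following along, the source view described above (leaf cells only, zeroes skipped, rule-calculated values kept) would typically be built in the Prolog along these lines. Cube and view names here are made up; the extract settings are the standard TI view-extract functions:

# Prolog of the copy process (illustrative names)
vSrcCube = 'Planning';
vView = 'zCopy.Planning.To.Reporting';

# Recreate the extract view on each run
IF( ViewExists( vSrcCube, vView ) = 1 );
  ViewDestroy( vSrcCube, vView );
ENDIF;
ViewCreate( vSrcCube, vView );

# Leaf data only, no zeroes, keep rule-derived cells
ViewExtractSkipCalcsSet( vSrcCube, vView, 1 );       # skip consolidated values
ViewExtractSkipRuleValuesSet( vSrcCube, vView, 0 );  # keep rule-calculated values
ViewExtractSkipZeroesSet( vSrcCube, vView, 1 );      # skip zero/blank values

# Use the view as the data source for the Data tab
DataSourceType = 'VIEW';
DataSourceNameForServer = vSrcCube;
DataSourceCubeview = vView;

Turning off transaction logging on the target cube for the duration of the load (CubeSetLogChanges) is also worth considering if you can live without the log.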

Thanks,
Paul.
Steve Rowe
Site Admin
Posts: 2415
Joined: Wed May 14, 2008 4:25 pm
OLAP Product: TM1
Version: TM1 v6,v7,v8,v9,v10,v11+PAW
Excel Version: Nearly all of them

Re: Copy Large Cube to Cube advice

Post by Steve Rowe »

You need to think about which aspect of performance you want to maximise. Usually people think they want to make things fast, but what they actually mean is "the copy process locks the environment, so I want it to run fast".

So really you probably want to minimise the length of the lock.

The easiest way of doing this is to export to a flat file and then read the file back in, with no metadata maintenance.
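A rough sketch of that two-step pattern, ignoring the mapping between the planning and reporting dimensions (which would sit in the second process). File name, cube name and variable names are made up; the second process keeps its Metadata tab empty so no metadata locks are taken:

# Process 1 - Data tab (data source = the extract view on the planning cube)
# vProduct, vRegion, vMonth, vMeasure, vValue are the view's variables
ASCIIOutput( 'plan_to_report.csv', vProduct, vRegion, vMonth, vMeasure, NumberToString( vValue ) );

# Process 2 - Data tab (data source = plan_to_report.csv, comma delimited; Metadata tab left empty)
CellPutN( StringToNumber( vValue ), 'Reporting', vProduct, vRegion, vMonth, vMeasure );
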
Technical Director
www.infocat.co.uk
lotsaram
MVP
Posts: 3652
Joined: Fri Mar 13, 2009 11:14 am
OLAP Product: TableManager1
Version: PA 2.0.x
Excel Version: Office 365
Location: Switzerland

Re: Copy Large Cube to Cube advice

Post by lotsaram »

LutherPaul wrote:
Hi All,
I am trying to write a TI process to copy data from a planning cube to a reporting cube. Both are large cubes, and I am looking for the best approach in terms of performance.
We are using 10.2.2.

Thanks,
Paul.
I find it a bit strange that no one has mentioned parallelising the load by breaking the view into multiple slices along one dimension. In 10.2.2 with parallel interaction this is an easy and foolproof way to reduce end-to-end processing time. I wouldn't ever consider BulkLoadMode; the possible speedup is very limited due to its single-threaded nature, and it comes with disadvantages in terms of locking out logins that very much outweigh any benefits.
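In 10.2.2 the usual way to fire the slices in parallel is the tm1runti.exe command-line utility launched from a wrapper process, one invocation per slice, with the copy process taking the slice element as a parameter and restricting its source view to it. A minimal sketch - the path, server name, credentials, dimension, process name and pSlice parameter are all placeholders, and you should check the exact tm1runti switches against your install:

# Wrapper process - Prolog (all names, paths and credentials are placeholders)
vRunTI = '"C:\Program Files\ibm\cognos\tm1_64\bin64\tm1runti.exe"';
vDim = 'Region';
n = 1;
WHILE( n <= DIMSIZ( vDim ) );
  vEl = DIMNM( vDim, n );
  IF( ELLEV( vDim, vEl ) = 0 );
    vCmd = vRunTI | ' -server PlanSrv -adminhost localhost -user admin -pwd secret -process Copy.Plan.To.Report pSlice=' | vEl;
    ExecuteCommand( vCmd, 0 );   # 0 = do not wait, so the slices run concurrently
  ENDIF;
  n = n + 1;
END;

In practice you would slice over a handful of consolidations (or cap the number of concurrent runs) rather than every leaf element, and use a password file rather than a clear-text password.
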
Please place all requests for help in a public thread. I will not answer PMs requesting assistance.
mattgoff
MVP
Posts: 516
Joined: Fri May 16, 2008 1:37 pm
OLAP Product: TM1
Version: 10.2.2.6
Excel Version: O365
Location: Florida, USA

Re: Copy Large Cube to Cube advice

Post by mattgoff »

lotsaram wrote:
I find it a bit strange that no one has mentioned parallelising the load by breaking the view into multiple slices along one dimension. In 10.2.2 with parallel interaction this is an easy and foolproof way to reduce end-to-end processing time.
Is this any faster than performance under 10.2.2 FP6, which enabled MTQ view generation via TI?
lotsaram wrote:
I wouldn't ever consider BulkLoadMode; the possible speedup is very limited due to its single-threaded nature, and it comes with disadvantages in terms of locking out logins that very much outweigh any benefits.
I have tried to get BulkLoadMode to work many times with zero success. Could be I'm doing something wrong, but I had nothing but problems with locking out users and little to no performance improvement (and a lot of crashes, granted this was years ago).
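For anyone who still wants to experiment with it, the calls involved are just the pair below - shown for reference only, not a recommendation given the locking behaviour described above. If memory serves they take no arguments and the mode suspends all other threads until it is switched off:

# Prolog of the load process
EnableBulkLoadMode();    # puts the server into single-user bulk load mode; other threads are suspended

# Data tab does the CellPutN work

# Epilog of the load process
DisableBulkLoadMode();   # returns the server to normal operation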

Matt
Please read and follow the Request for Assistance Guidelines. It helps us answer your question and saves everyone a lot of time.