Load Problem

AnonimusMax
Posts: 60
Joined: Thu Nov 17, 2016 2:13 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: 2013

Load Problem

Post by AnonimusMax »

Hi,

When I am loading data into TM1, I use a master process for the load.
This process updates the dimensions first.
When I run the dimension load, I first have an SQL query that gets my data from Oracle, and then I use DimensionElementInsertDirect and DimensionElementComponentAddDirect (to add every element under the 'All' element).
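Simplified, the Data tab of my dimension process looks something like this (the dimension and variable names here are just placeholders):

# Data tab - vElement comes from the Oracle SQL data source
sDim = 'Account';
IF ( DIMIX ( sDim, vElement ) = 0 );
   DimensionElementInsertDirect ( sDim, '', vElement, 'N' );
ENDIF;
DimensionElementComponentAddDirect ( sDim, 'All', vElement, 1 );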
If I have a big volume of data, can this cause a crash of the TM1 server?
Is there a better way to do this update?

Thank you
Wim Gielis
MVP
Posts: 3240
Joined: Mon Dec 29, 2008 6:26 pm
OLAP Product: TM1, Jedox
Version: PAL 2.1.5
Excel Version: Microsoft 365
Location: Brussels, Belgium

Re: Load Problem

Post by Wim Gielis »

Hello

I understand that you would like to use the Direct variants of the dimension maintenance functions.
Did you test with the regular functions as well?
Best regards,

Wim Gielis

IBM Champion 2024-2025
Excel Most Valuable Professional, 2011-2014
https://www.wimgielis.com ==> 121 TM1 articles and a lot of custom code
Newest blog article: Deleting elements quickly
AnonimusMax
Posts: 60
Joined: Thu Nov 17, 2016 2:13 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: 2013

Re: Load Problem

Post by AnonimusMax »

No, I didn't test with the regular functions... Can that cause problems?
Wim Gielis
MVP
Posts: 3240
Joined: Mon Dec 29, 2008 6:26 pm
OLAP Product: TM1, Jedox
Version: PAL 2.1.5
Excel Version: Microsoft 365
Location: Brussels, Belgium

Re: Load Problem

Post by Wim Gielis »

First, use the regular functions to get it all working.
If there is a pressing need (like not having to go through SQL tables with hundreds of thousands or millions of rows), then you can consider the Direct variants.
Note: I don't know if the Direct variants are the problem here, but just to rule that out, consider the usual approach with DimensionElementInsert.
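A minimal sketch of that regular approach, assuming an 'Account' dimension and a vElement variable coming from your SQL source (adjust the names to your process), would be code like this in the Metadata tab:

# Metadata tab - regular (non-Direct) dimension functions
sDim = 'Account';
IF ( DIMIX ( sDim, vElement ) = 0 );
   DimensionElementInsert ( sDim, '', vElement, 'N' );
ENDIF;
DimensionElementComponentAdd ( sDim, 'All', vElement, 1 );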
Best regards,

Wim Gielis

IBM Champion 2024-2025
Excel Most Valuable Professional, 2011-2014
https://www.wimgielis.com ==> 121 TM1 articles and a lot of custom code
Newest blog article: Deleting elements quickly
AnonimusMax
Posts: 60
Joined: Thu Nov 17, 2016 2:13 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: 2013

Re: Load Problem

Post by AnonimusMax »

I have hundreds of thousands of rows, so I don't think that can be an option. The execution time will increase a lot, right?
Is there another way to import new data into dimensions?
Wim Gielis
MVP
Posts: 3240
Joined: Mon Dec 29, 2008 6:26 pm
OLAP Product: TM1, Jedox
Version: PAL 2.1.5
Excel Version: Microsoft 365
Location: Brussels, Belgium

Re: Load Problem

Post by Wim Gielis »

Just to be clear, updating dimensions shouldn't cause a crash.
You will have to debug the process.
For example, use the same data source but stop about halfway through the number of records (use a simple counter and ItemSkip once a certain number of records is reached).
Or, get rid of the Direct variants and use code in the Metadata tab.
Turn off logging during the process.
Is there any chance that feeders are fired while loading?
And so on.
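For example, the counter and ItemSkip could look like this (nMaxRecords and the cube name are only placeholders):

# Prolog
nCounter = 0;
nMaxRecords = 50000;
CubeSetLogChanges ( 'YourCube', 0 );   # turn off transaction logging during the load

# Data tab, at the top
nCounter = nCounter + 1;
IF ( nCounter > nMaxRecords );
   ItemSkip;
ENDIF;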
Best regards,

Wim Gielis

IBM Champion 2024-2025
Excel Most Valuable Professional, 2011-2014
https://www.wimgielis.com ==> 121 TM1 articles and a lot of custom code
Newest blog article: Deleting elements quickly
Steve Rowe
Site Admin
Posts: 2464
Joined: Wed May 14, 2008 4:25 pm
OLAP Product: TM1
Version: TM1 v6,v7,v8,v9,v10,v11+PAW
Excel Version: Nearly all of them

Re: Load Problem

Post by Steve Rowe »

The other option in this situation is to have multiple SQL queries, so you don't have to process the full data set twice just to get to the structures.

You could just say "give me the list of all the accounts in the data table" and use this to update the account dimension. Even better, if you are reading from a ledger, there is probably a table somewhere that has the structures you need; you might even be able to query for "new stuff since I last asked".
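As a rough sketch (the table, column and dimension names are made up), the dimension process could use a query like this as its data source and do the inserts in the Metadata tab:

# Prolog - only ask the database for the distinct accounts
DatasourceQuery = 'SELECT DISTINCT account_code FROM gl_transactions';

# Metadata tab - vAccountCode is the single variable coming back from the query
IF ( DIMIX ( 'Account', vAccountCode ) = 0 );
   DimensionElementInsert ( 'Account', '', vAccountCode, 'N' );
   DimensionElementComponentAdd ( 'Account', 'All', vAccountCode, 1 );
ENDIF;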

With large queries of the order you speak of, it does not make sense to do the metadata update as part of the data load:
1. It is very inefficient to test every data row to see if it contains new metadata.
2. Dimension updates are still locking; if you combine this with the data load, you'll lock the application for a long time.

Basically, doing the metadata from the data is OK for small feeds, but it doesn't scale and you should change the methodology.
Technical Director
www.infocat.co.uk