Hi,
When I am loading data into TM1, I am using a master load process.
This process updates the dimensions first.
When I run the dimension load, first I have a SQL query where I get my data from Oracle, and then I use DimensionElementInsertDirect and DimensionElementComponentAddDirect (for all elements).
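Roughly like this, simplified (the dimension, consolidation and variable names here are stand-ins):

# Data tab, run once per record returned by the Oracle query
DimensionElementInsertDirect('Account', '', vAccount, 'N');
DimensionElementComponentAddDirect('Account', 'All Accounts', vAccount, 1);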
If I have a big volume of data, can this crash the TM1 server?
Is there a better way to do this update?
Thank you
Load Problem
-
- Posts: 60
- Joined: Thu Nov 17, 2016 2:13 pm
- OLAP Product: TM1
- Version: 10.2.2
- Excel Version: 2013
-
- MVP
- Posts: 3240
- Joined: Mon Dec 29, 2008 6:26 pm
- OLAP Product: TM1, Jedox
- Version: PAL 2.1.5
- Excel Version: Microsoft 365
- Location: Brussels, Belgium
Re: Load Problem
Hello
I understand that you would like to use the Direct variants of the dimension maintenance functions.
Did you test with the regular functions as well?
Best regards,
Wim Gielis
IBM Champion 2024-2025
Excel Most Valuable Professional, 2011-2014
https://www.wimgielis.com ==> 121 TM1 articles and a lot of custom code
Newest blog article: Deleting elements quickly
-
- Posts: 60
- Joined: Thu Nov 17, 2016 2:13 pm
- OLAP Product: TM1
- Version: 10.2.2
- Excel Version: 2013
Re: Load Problem
No, I didn't test with the regular functions... Can that cause problems?
-
- MVP
- Posts: 3240
- Joined: Mon Dec 29, 2008 6:26 pm
- OLAP Product: TM1, Jedox
- Version: PAL 2.1.5
- Excel Version: Microsoft 365
- Location: Brussels, Belgium
Re: Load Problem
First use the regular functions to get it all working.
If there is a pressing need (like having to go through SQL tables with hundreds of thousands or millions of rows), then you can consider the Direct variants.
Note, I don't know if the Direct variants are the problem here but just to rule that one out, consider the usual approach of DimensionElementInsert.
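For reference, the regular variants would go in the Metadata tab, roughly like this (a minimal sketch; the dimension, consolidation and variable names are assumptions):

# Metadata tab, run once per record
DimensionElementInsert('Account', '', vAccount, 'N');
DimensionElementComponentAdd('Account', 'All Accounts', vAccount, 1);

With the regular functions, the updated dimension is registered once the Metadata tab has finished, rather than on every row.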
Best regards,
Wim Gielis
IBM Champion 2024-2025
Excel Most Valuable Professional, 2011-2014
https://www.wimgielis.com ==> 121 TM1 articles and a lot of custom code
Newest blog article: Deleting elements quickly
-
- Posts: 60
- Joined: Thu Nov 17, 2016 2:13 pm
- OLAP Product: TM1
- Version: 10.2.2
- Excel Version: 2013
Re: Load Problem
I have hundreds of thousands of rows, so I don't think that is an option. The execution time will increase a lot, right?
Is there another way to import new data into dimensions?
-
- MVP
- Posts: 3240
- Joined: Mon Dec 29, 2008 6:26 pm
- OLAP Product: TM1, Jedox
- Version: PAL 2.1.5
- Excel Version: Microsoft 365
- Location: Brussels, Belgium
Re: Load Problem
Just to be clear, updating dimensions shouldn't cause a crash.
You will have to debug the process.
For example, use the same data source but stop about halfway through the records (use a simple counter and ItemSkip when a certain number of records is reached; see the sketch below).
Or, get rid of the Direct variants and use code in the Metadata tab.
Turn off logging during the process.
Is there any chance that feeders are fired while loading?
And so on.
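A sketch of the counter and logging ideas (the cube name and the cap are placeholders):

# Prolog: set a cap on records and switch off transaction logging for the target cube
nMax = 100000;
nCounter = 0;
CubeSetLogChanges('Sales', 0);

# Metadata (or Data) tab: skip everything past the cap
nCounter = nCounter + 1;
If(nCounter > nMax);
   ItemSkip;
EndIf;

# Epilog: switch logging back on
CubeSetLogChanges('Sales', 1);

If it survives half the records, raise the cap and rerun to narrow down where it fails.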
Best regards,
Wim Gielis
IBM Champion 2024-2025
Excel Most Valuable Professional, 2011-2014
https://www.wimgielis.com ==> 121 TM1 articles and a lot of custom code
Newest blog article: Deleting elements quickly
- Steve Rowe
- Site Admin
- Posts: 2464
- Joined: Wed May 14, 2008 4:25 pm
- OLAP Product: TM1
- Version: TM1 v6,v7,v8,v9,v10,v11+PAW
- Excel Version: Nearly all of them
Re: Load Problem
The other option in this situation is to have multiple SQL queries. You don't have to process the data twice to get to the structures.
You could just say "give me the list of all the accounts in the data table" and use this to update the account dimension. Even better, if you are reading from a ledger, there is probably a table somewhere that has the structures you need; you might even be able to query for "new stuff since I last asked".
With large queries of the order you speak of, it does not make sense to do the metadata update as part of the data load.
1. It is very inefficient to test every data row to see if it contains new Metadata.
2. Dimension updates are still locking, if you combine this with the data load then you'll lock the application for a long time.
Basically doing the metadata from the data is OK for small feeds but it doesn't scale and you should change the methodology.
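A sketch of that approach in a dedicated dimension update process (the table, column and dimension names are invented for illustration):

# Data source (SQL): SELECT DISTINCT account_code, parent_code FROM gl_account_master
# Metadata tab: one row per distinct account instead of one per transaction
DimensionElementInsert('Account', '', vParentCode, 'C');
DimensionElementInsert('Account', '', vAccountCode, 'N');
DimensionElementComponentAdd('Account', vParentCode, vAccountCode, 1);

Run this before the data load in the master process, so the data load itself only writes cell values in the Data tab.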
Technical Director
www.infocat.co.uk