What would be the most efficient way to feed Oracle with TM1 data?

BobMilli
Posts: 8
Joined: Thu Feb 28, 2013 9:45 pm
OLAP Product: TM1
Version: 10.2.2
Excel Version: 2010

What would be the most efficient way to feed Oracle with TM1 data?

Post by BobMilli »

Hi,
We've got a bunch of TM1 databases, and for some of them we need to push TM1 data to Oracle, whether it's TM1-calculated data or user data entry.
What would be the best way to do so?
Usually, we use ODBC in the following manner:
  • open the ODBC connection in the Prolog
  • on the Data tab, generate INSERT SQL statements and run them through an ODBCOutput TI command
  • still on the Data tab, run a SQL COMMIT every 1000 records, otherwise the Oracle tablespaces grow too large
  • run a COMMIT and an ODBCClose in the Epilog
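To make the flow concrete, here is a minimal sketch of this row-by-row pattern in Python (table and column names are hypothetical; in TI the actual calls are ODBCOpen/ODBCOutput/ODBCClose, but the statement-per-cell structure is the same):

```python
# Sketch of the row-by-row pattern described above (hypothetical
# table/column names TM1_EXPORT, ELEM, VAL). Each data record gets
# its own INSERT, with a COMMIT every `commit_every` rows and a
# final COMMIT at the end (the Epilog).
def row_by_row_statements(rows, commit_every=1000):
    """Return one INSERT per row, plus a COMMIT every `commit_every` rows."""
    stmts = []
    for i, (elem, value) in enumerate(rows, start=1):
        stmts.append(
            f"INSERT INTO TM1_EXPORT (ELEM, VAL) VALUES ('{elem}', {value})"
        )
        if i % commit_every == 0:
            stmts.append("COMMIT")
    stmts.append("COMMIT")  # final commit in the Epilog
    return stmts

stmts = row_by_row_statements([("A", 1.0), ("B", 2.0), ("C", 3.0)], commit_every=2)
```

Every statement here is a separate ODBC round trip, which is exactly why the process is so slow for large exports.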
The only issue I have with this is the extreme slowness of the process.
Yesterday we ran a test: the process took 7 minutes to export the data to a text file and more than 6 hours to do the same into an Oracle table.

We'll clearly need to change our method.
I have two solutions in mind at this time: feed the exported text file to an external ETL tool that loads Oracle more efficiently, or generate a SQL*Plus script that could be fired in the Epilog.

Has anybody in the forum faced the same kind of need? What would be the most efficient way?
Thanks for the help.
Regards,
Bob
Wim Gielis
MVP
Posts: 3105
Joined: Mon Dec 29, 2008 6:26 pm
OLAP Product: TM1, Jedox
Version: PAL 2.0.9.18
Excel Version: Microsoft 365
Location: Brussels, Belgium

Re: What would be the most efficient way to feed Oracle with TM1 data?

Post by Wim Gielis »

Bonjour Bob,

What I did in the past is output to a text file and then, in the Epilog, run a Bulk Insert statement. This was with SQL Server; I'm not sure what the equivalent would be in Oracle. The time spent is only a fraction of that of individual Insert statements (or variants that send more than one record at a time).
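For Oracle, the closest equivalent to SQL Server's BULK INSERT would be SQL*Loader (or an external table). A minimal sketch of a SQL*Loader control file, assuming a hypothetical export file tm1_data.csv and a two-column target table TM1_EXPORT:

```
-- tm1_data.ctl (all names hypothetical); invoked from the command line,
-- e.g. in the Epilog via ExecuteCommand:
--   sqlldr userid=user/pass@db control=tm1_data.ctl
LOAD DATA
INFILE 'tm1_data.csv'
APPEND
INTO TABLE TM1_EXPORT
FIELDS TERMINATED BY ','
(ELEM, VAL)
```

SQL*Loader also supports direct-path loads (DIRECT=TRUE on the command line), which bypass much of the normal SQL engine overhead and are typically the fastest way to get a flat file into Oracle.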

Hope this helps,

Wim
Best regards,

Wim Gielis

IBM Champion 2024
Excel Most Valuable Professional, 2011-2014
https://www.wimgielis.com ==> 121 TM1 articles and a lot of custom code
Newest blog article: Deleting elements quickly
lotsaram
MVP
Posts: 3651
Joined: Fri Mar 13, 2009 11:14 am
OLAP Product: TableManager1
Version: PA 2.0.x
Excel Version: Office 365
Location: Switzerland

Re: What would be the most efficient way to feed Oracle with TM1 data?

Post by lotsaram »

Depending on the length of your SQL statement for the ODBC insert, you can probably achieve somewhere between a 10x and 100x throughput improvement by not only running the SQL Commit every 1000 records but also batching the ODBCOutput so that you don't execute it for each TM1 cell that is processed.

The concept is simple: TM1 TI variables (and hopefully the Oracle ODBC driver) can handle very long strings, at least 64000 unicode characters. For every record of the data source, check the length of the SQL statement; when you get to 50K characters or thereabouts, do the ODBCOutput, otherwise just keep concatenating the Insert, Update, or whatever your SQL statement happens to be. This makes the ODBC channel much less "chatty" and can easily give a one or two order of magnitude speed improvement.

Note that with this method there will always be a leftover unsent string, which needs to be handled with an ODBCOutput in the Epilog as well as on the Data tab.
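The batching logic can be sketched like this in Python (table/column names are hypothetical, and the flush callback stands in for ODBCOutput; note that depending on the driver, Oracle may require the batch to be wrapped in a BEGIN ... END; block or an INSERT ALL to accept multiple statements in one call):

```python
# Sketch of "batching" the inserts: concatenate single-row INSERTs
# into one long string and flush it only when it approaches the size
# limit. In TI you would keep a string variable on the Data tab and
# call ODBCOutput on flush; here flush() just collects the batches.
MAX_CHARS = 50000  # stay safely under the ~64K statement limit

def batch_inserts(rows, flush, max_chars=MAX_CHARS):
    """Concatenate single-row INSERTs; flush when the batch gets long."""
    batch = ""
    for elem, val in rows:
        stmt = f"INSERT INTO TM1_EXPORT (ELEM, VAL) VALUES ('{elem}', {val});"
        if batch and len(batch) + len(stmt) > max_chars:
            flush(batch)   # one ODBCOutput for the whole batch
            batch = ""
        batch += stmt
    if batch:              # leftover batch: the Epilog ODBCOutput
        flush(batch)

sent = []
batch_inserts([("A", 1), ("B", 2), ("C", 3)], sent.append, max_chars=120)
# each element of `sent` is one multi-statement string, i.e. one ODBC call
```

With a realistic 50K limit, hundreds of rows travel per ODBC call instead of one, which is where the order-of-magnitude speedup comes from.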

For a really big insert, the best approach is, as already suggested, to export a flat file and then run a stored procedure that grabs the file and performs a bulk load. But in many cases, batching the ODBCOutput statements can be good enough, or even as good or better.
Please place all requests for help in a public thread. I will not answer PMs requesting assistance.