Hi There,
I am trying to speed up copying a large set of identical data from one version to another, or from one month to another. Currently I am using a TI process that copies all the N-level cells using CellPutN, but it is very slow, taking over an hour to finish. Is there a faster way to do this?
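Roughly what the Data tab of my process looks like now (cube, dimension and variable names below are just placeholders for illustration):

```ti
# Data tab: the data source is a view of the source version,
# and each record gets written straight into the target version.
# 'Sales', the version name and the vXxx variables are placeholder names.
CellPutN(nValue, 'Sales', 'Budget V2', vDept, vAccount, vMonth);
```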
Regards
PlayKid
Advice on Copying Large Identical Data?
- jim wood
- Site Admin
- Posts: 3958
- Joined: Wed May 14, 2008 1:51 pm
- OLAP Product: TM1
- Version: PA 2.0.7
- Excel Version: Office 365
- Location: 37 East 18th Street New York
- Contact:
Re: Advice on Copying Large Identical Data?
This is always a pain. The method I have always stuck to (as it is easier to roll back) is exporting the data to a flat file first and then importing it via another process. I know this is effectively the same as what you are already doing, but it is much quicker to change back. Also, I have found that using a view as a data source can be very slow, especially if you are calculating any measures as you load.
Struggling through the quagmire of life to reach the other side of who knows where.
Shop at Amazon
Jimbo PC Builds on YouTube
OS: Mac OS 11 PA Version: 2.0.7
- Alan Kirk
- Site Admin
- Posts: 6644
- Joined: Sun May 11, 2008 2:30 am
- OLAP Product: TM1
- Version: PA2.0.9.18 Classic NO PAW!
- Excel Version: 2013 and Office 365
- Location: Sydney, Australia
- Contact:
Re: Advice on Copying Large Identical Data?
PlayKid wrote: I am trying to enhance the speed to copy a large set of data from one version to another version or from one month to another month, they are identical data, currently, I am using TI to copy all the n level by using CellPutN, but it is very slow, take like over an hour to finish to process, is there a way to do that in a faster way?
Sorry if this is an obvious one, but I take it that the view you're using to copy from does suppress zeroes? If not, it's far faster to completely blow away the data in the target version and load only the combinations which have values. Given that you say you're doing it at N level, I'm assuming you're already skipping consolidations. (Presumably rules, if any, as well, otherwise you'd have rejects.)
However TI's about as fast as it gets when it comes to data movement.
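As a rough sketch of that setup (cube, view and subset names are made up, and the subset housekeeping is glossed over):

```ti
# Prolog: build a zero-suppressed, N-level-only source view
# and clear the target slice first. All names are placeholders.
sCube = 'Sales';
sView = 'zzCopySource';
IF(ViewExists(sCube, sView) = 1);
  ViewDestroy(sCube, sView);
ENDIF;
ViewCreate(sCube, sView);
ViewExtractSkipZeroesSet(sCube, sView, 1);     # only non-zero cells
ViewExtractSkipCalcsSet(sCube, sView, 1);      # skip consolidated cells
ViewExtractSkipRuleValuesSet(sCube, sView, 1); # skip rule-derived cells
# restrict the view to the source version, e.g. via a one-element subset
# (SubsetCreate / SubsetElementInsert / ViewSubsetAssign),
# then blow away the target slice so stale values can't survive the copy
ViewZeroOut(sCube, 'zzCopyTarget');
```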
"To them, equipment failure is terrifying. To me, it’s 'Tuesday.' "
-----------
Before posting, please check the documentation, the FAQ, the Search function and FOR THE LOVE OF GLUB the Request Guidelines.
- Steve Rowe
- Site Admin
- Posts: 2455
- Joined: Wed May 14, 2008 4:25 pm
- OLAP Product: TM1
- Version: TM1 v6,v7,v8,v9,v10,v11+PAW
- Excel Version: Nearly all of them
Re: Advice on Copying Large Identical Data?
When you have a view as a data source, TM1 has to construct the whole view in memory before it starts reading it into the new area. If the source view is large, a lot of the time can be taken up while TM1 constructs this view in memory.
What I've done before is to break the view up into smaller bits, as it seems to be faster to process many small chunks rather than 1 big one.
For example, if you are copying a whole year at once then set the TI up to do a month at a time and then call the copying TI 12/13 times from another TI whilst passing the period you want to copy as a parameter.
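For example, a wrapper along these lines (the process name 'Copy Month Data', its pMonth parameter and the 'Month' dimension are placeholder names):

```ti
# Prolog of a wrapper process: run the month-level copy once per leaf month.
nMax = DIMSIZ('Month');
nI = 1;
WHILE(nI <= nMax);
  sMonth = DIMNM('Month', nI);
  IF(DTYPE('Month', sMonth) @= 'N');   # leaf elements only
    ExecuteProcess('Copy Month Data', 'pMonth', sMonth);
  ENDIF;
  nI = nI + 1;
END;
```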
Other things you can do: as Alan says, make sure zeros, consolidations and rules are suppressed. Also do your best to trap errors so that you don't get any log messages, as these can drag the speed down, though this is not as bad as it used to be since logging stops after so many records.
If the area you are copying into is the trigger for feeders, this will slow you down too. You can use the RuleLoadFromFile function to set the rules to blank and then reload them at the end of the process, though I've never found the need to do this.
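If you do need it, the shape of it is something like this (the cube name and .rux file names are placeholders):

```ti
# Prolog: swap in an empty rules file so feeders don't fire on every CellPutN
RuleLoadFromFile('Sales', 'blank.rux');

# ... the copy runs in the Data tab ...

# Epilog: reattach the real rules (feeders will re-evaluate)
RuleLoadFromFile('Sales', 'sales.rux');
```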
HTH
Technical Director
www.infocat.co.uk