Hi All,
I am working on automating the daily backup process for our production data folder.
I know there are many ready-made scripts available for this, and I have gone through the IBM backup docs and the TM1 Tutorial blog.
Our company is very process-oriented and wants everyone to follow the same process, even when it makes the work more complex.
To achieve this backup I got a shell script from another team; it backs up the current data folder daily and deletes any folder older than 30 days from the backup directory.
The problem with this shell script is that if any TM1 object is created or deleted while it runs, it fails. (It usually takes 30 to 60 seconds to complete.)
Our TM1 application is global, so we cannot restrict users worldwide from accessing the server during the backup, and because the process depends on other jobs we don't have an exact time frame for it.
I tried the same thing with a Windows batch file and didn't face this issue, but it has a limitation zipping folders beyond 2 GB (and, as I said above, our company doesn't want to use it).
So, to make the shell script work in every situation without failing and to make the process more dynamic, I am planning to restrict users from modifying the TM1 server during the 30 to 60 seconds until the shell script completes.
One procedure I have in mind is to run the entire thing in bulk load mode, something like the sketch below:
Process 1: enable bulk load mode
Process 2: SaveDataAll
Execute the shell script from TI
Process 3: disable bulk load mode
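In TI terms the chain would look something like this — a rough sketch only, and the script path is just a placeholder (EnableBulkLoadMode, SaveDataAll, ExecuteCommand and DisableBulkLoadMode are the standard TI functions):
Code:
# Prolog of a single TI process - rough sketch, script path is a placeholder
EnableBulkLoadMode();
# Flush all in-memory changes to disk before the copy starts
SaveDataAll;
# Wait flag = 1 blocks TI until the script returns (30-60 seconds here);
# the script itself does the copy and prunes backups older than 30 days
ExecuteCommand('cmd /c "D:\Scripts\tm1_backup.bat"', 1);
DisableBulkLoadMode();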
The reason I want to run this in bulk load mode is to make sure the process runs single-threaded while communicating with the TM1 server.
Is it advisable to run this backup process in bulk load mode, or are there other workarounds?
How will it affect performance, and are there any cons to this approach? Please advise.
I am using TM1 10.1 on Windows Server 2008 R2.
Thanks in advance for your valuable suggestions.
- lotsaram
- MVP
- Posts: 3706
- Joined: Fri Mar 13, 2009 11:14 am
- OLAP Product: TableManager1
- Version: PA 2.0.x
- Excel Version: Office 365
- Location: Switzerland
Re: Data folder back up
BulkLoadMode will stop any new users from logging in and will stop any other TI processes from running, but it won't stop currently logged-on users from doing whatever they are doing. I think your best bet is to take a zip of the directory (using something like 7zip) and then point the shell script at that.
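For example, called from TI, something along these lines — the 7z.exe location and the paths are assumptions, not a tested command:
Code:
# Hypothetical 7-Zip call; adjust paths to your environment
ExecuteCommand('"C:\Program Files\7-Zip\7z.exe" a -tzip "E:\Backup\tm1_data.zip" "D:\TM1\Data\*"', 1);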
- Jodha
- Posts: 62
- Joined: Tue Mar 13, 2012 4:34 am
- OLAP Product: TM1
- Version: 9.5.2 10.1 10.2
- Excel Version: 2007 2010 SP1
Re: Data folder back up
Thanks Lotsaram for your suggestion.
It is very difficult to raise a request to install new software like 7-Zip in our environment.
From your post it looks like I cannot use the bulk load mode approach either.
- Martin Ryan
- Site Admin
- Posts: 2003
- Joined: Sat May 10, 2008 9:08 am
- OLAP Product: TM1
- Version: 10.1
- Excel Version: 2010
- Location: Wellington, New Zealand
Re: Data folder back up
Why does the shell script fail? Is it because an object gets locked? I believe the "/C" switch will solve this by continuing even when an error occurs, e.g.
Code:
:: /e copies all subfolders (including empty ones); /c continues even if an error, e.g. a locked file, occurs
xcopy "%srcFolder%" "%targetDir%" /e /c
Please do not send technical questions via private message or email. Post them in the forum where you'll probably get a faster reply, and everyone can benefit from the answers.
Jodi Ryan Family Lawyer
- Posts: 11
- Joined: Mon Feb 04, 2013 11:30 pm
- OLAP Product: Product
- Version: Version
- Excel Version: Version
- Location: Location
Re: Data folder back up
Hi,
If users don't connect during a particular time of the day (or maybe at night), you could just stop the server, perform the backup and then restart the server. This is surely the safest way.
You could also consider kicking all users off and then entering bulk load mode.
Omar
- cgaunt
- Posts: 33
- Joined: Tue Jan 29, 2013 2:52 pm
- OLAP Product: TM1
- Version: 9.0 SP3 9.5.1 10.1.1
- Excel Version: excel 2010 2007 2003
Re: Data folder back up
Hello Jodha,
Xcopy will fail if it encounters a file that is currently active. Whilst kicking out all users and stopping the service would certainly help, it goes a little against the grain of the production availability that most installations are looking for.
You could consider copying only those files that you need and know to be static (.cub, .dim, .rux, .pro, .cho, ...). I worked with a TM1 setup like this for many years; we then switched to backing up all files and encountered similar problems to the ones you are describing. Doing it this way you could, for example, keep out the large .feeders files if you are using persistent feeders.
If you have a separate logging folder for the TM1 error messages, metadata logs, TM1 transaction logs etc. and keep the data directory clean, then you will have more success. I think in the end we used a ROBOCOPY command as well (see the sketch below), as it gives a little more control than XCOPY. This will also save you a lot of time on the backup process, as you likely do not want to keep multiple versions of the large transaction logs.
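Driven from TI, the call could look something like this — the paths and retry flags are only illustrative:
Code:
# Hypothetical paths; copy only the static TM1 object types, skip the live log
ExecuteCommand('robocopy "D:\TM1\Data" "E:\Backup\Daily" *.cub *.dim *.rux *.pro *.cho /XF tm1s.log /R:2 /W:5', 1);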
Hope this helps.
-Chris
- tomok
- MVP
- Posts: 2836
- Joined: Tue Feb 16, 2010 2:39 pm
- OLAP Product: TM1, Palo
- Version: Beginning of time thru 10.2
- Excel Version: 2003-2007-2010-2013
- Location: Atlanta, GA
Re: Data folder back up
cgaunt wrote: Xcopy will fail if it encounters a file that is currently active.
The only open file that a backup copy will choke on is tm1s.log. As long as you separate your logging directory from your data directory, you will have no problem with a backup routine that just says "copy everything". If by chance a view or subset gets created while the routine is running and gets skipped, it will always get backed up the next time the job runs. I have been doing backups this way forever and have never had any issues.
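For instance, with LoggingDirectory pointed somewhere else in tm1s.cfg, a blanket copy along these lines has nothing open to trip over (paths are placeholders):
Code:
# Hypothetical paths; /e = all subfolders, /c = continue on error, /i /y = no prompts
ExecuteCommand('xcopy "D:\TM1\Data" "E:\Backup\Daily" /e /c /i /y', 1);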
- cgaunt
- Posts: 33
- Joined: Tue Jan 29, 2013 2:52 pm
- OLAP Product: TM1
- Version: 9.0 SP3 9.5.1 10.1.1
- Excel Version: excel 2010 2007 2003
Re: Data folder back up
Hi Tomok,
Agreed. The other thing to watch out for is metadata logging: if you have not specifically moved the directory for that log in the config file, it too can choke the XCOPY.
In a previous installation we had issues with XCOPY and certain views; there was a condition where, to release the choke, we had to open the view and resave it. It may well have been environment- and/or version-specific and was only occasional. I never investigated it further as it was not a big issue at the time.
-Chris
- Jodha
- Posts: 62
- Joined: Tue Mar 13, 2012 4:34 am
- OLAP Product: TM1
- Version: 9.5.2 10.1 10.2
- Excel Version: 2007 2010 SP1
Re: Data folder back up
Thanks for your suggestions, and sorry for the late reply; I was stuck in the middle of some other issues.
I will try all your suggestions and update you shortly.