using parameters for data source
Posted: Tue Jan 25, 2011 6:23 am
by winsonlee
I have created a process. I would like to ask if it is possible to specify the data source through a parameter?
e.g. when I run the process, it prompts for the file path and I specify the filename and file path before the process executes?
public void TestForecast(string filename)
{
    // Connect to the admin host and log in to the "cxmd" server
    TM1AdminServer admin = new TM1AdminServer("192.1.1.6", "tm1adminserver");
    TM1Server myServer = admin.Servers["cxmd"].Login("yyy", "xxx");

    // Build the parameter array ("c:\\" rather than "c:\" so the backslash is escaped)
    TM1ProcessParameter[] objaParams = new TM1ProcessParameter[1];
    objaParams[0] = new TM1ProcessParameter("Filename", "c:\\" + filename);

    // Run the "Upload Information" process with the parameter
    TM1Process objTm1Process = myServer.Processes["Upload Information"];
    objTm1Process.Execute(objaParams);
}
Re: using parameters for data source
Posted: Tue Jan 25, 2011 11:33 am
by David Usherwood
Interesting code.
What's it written in? Clearly not the TI scripting language.
Within TI, set the variable DataSourceNameForServer to the file you want to process - and you can pass this in via a parameter.
But I would like to know what your code is running under.
Re: using parameters for data source
Posted: Thu Jan 27, 2011 11:31 am
by lotsaram
Regardless of whether you are calling the process via an API from VBA, C, Java, .Net or whatever, you still need to write actual TI code in the process itself.
Specifically you need to:
1. Create a parameter of data type string on the Parameters tab; let's call it pDataSource
2. Write some code on the Prolog tab to assign the pDataSource parameter's string value as the data source for the process (see the sketch at the end of this post):
DataSourceNameForServer = pDataSource;
Note that if the process has a text file data source then the pDataSource string needs to be the full folder path and file name of the file on the TM1 server.
You can then externally call the process via the API and pass the data source parameter as part of the parameter array.
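As a rough illustration of step 2, here is a minimal Prolog sketch. It assumes the parameter is named pDataSource and already holds the full server-side path; the extra DataSourceNameForClient line is optional and just keeps the client-side name in step.
# Prolog tab (sketch): pDataSource is the string parameter created on the Parameters tab
# and must contain the full folder path and file name as seen from the TM1 server machine.
DataSourceNameForServer = pDataSource;
DataSourceNameForClient = pDataSource;
On the C# side this simply means passing "pDataSource" (rather than "Filename") as the parameter name in the TM1ProcessParameter array from the first post.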
Re: using parameters for data source
Posted: Mon Jan 31, 2011 5:40 am
by winsonlee
I have set the data source type to "None", which means I can't define the variable.
In the Prolog, is there a way I can have a loop that goes through every line of the file, extracts the data out and places it in a cube?
Re: using parameters for data source
Posted: Mon Jan 31, 2011 5:45 am
by Alan Kirk
winsonlee wrote:I have set the data source type to "None", which means I can't define the variable.
In the Prolog, is there a way I can have a loop that goes through every line of the file, extracts the data out and places it in a cube?
Short answer: "I don't believe so", since the TI language doesn't provide a function for sequentially reading a data source (indeed, there's no reason why it would, given that that's the job of the Data tab). Longer answer: you should not use a data source of None in these circumstances (as Lotsaram pointed out to you, and which you seem to have ignored). Longer answer still: it would be useful if you would be polite enough to respond to David's question, given that you might get better answers if people actually knew what you were doing and which language / API you were doing it in.
Re: using parameters for data source
Posted: Tue Feb 01, 2011 10:13 pm
by winsonlee
Thanks for the reply.
I am using C# with the TM1 API. There is a folder which contains a list of txt files that I need to import into a TM1 cube. It is time consuming when, every now and then, a new file appears in the folder and I have to manually go into TM1 Server Explorer to select the data source and upload the data to the cube.
So what I am trying to achieve is to use the API to go through the directory and check whether any new file exists in it; if there is one, pass the file name to the TM1 process and import the data in the file into the cube.
Do you think such a problem is achievable using the TM1 API plus a process to upload the data?
Re: using parameters for data source
Posted: Tue Feb 01, 2011 10:27 pm
by Alan Kirk
winsonlee wrote:Thanks for the reply.
I am using C# with the TM1 API. There is a folder which contains a list of txt files that I need to import into a TM1 cube. It is time consuming when, every now and then, a new file appears in the folder and I have to manually go into TM1 Server Explorer to select the data source and upload the data to the cube.
So what I am trying to achieve is to use the API to go through the directory and check whether any new file exists in it; if there is one, pass the file name to the TM1 process and import the data in the file into the cube.
Do you think such a problem is achievable using the TM1 API plus a process to upload the data?
Lotsaram already gave you the answer to that.
Lotsaram wrote:Specifically you need to:
1. Create a parameter of data type string on the Parameters tab; let's call it pDataSource
2. Write some code on the Prolog tab to assign the pDataSource parameter's string value as the data source for the process:
DataSourceNameForServer = pDataSource;
Using that method, all you need to do is pass the filename to the process via a parameter when you call it from the API. The DataSourceNameForServer assignment in the Prolog then takes care of dynamically reassigning the data source to the new filename.
Re: using parameters for data source
Posted: Tue Feb 01, 2011 10:33 pm
by tomok
Since I don't know what it is you are trying to accomplish, business-wise, it's hard for me to offer any suggestions (especially since I don't know what you mean by a "new" file). However, what makes you think you need to use the API to cycle through a list of files and load the data to TM1? Are you aware that 1) you can shell out to an external program (like a batch file or simple VB program) from within a TM1 process, and 2) you can call other TI processes from a TI process? I could accomplish what I think it is you are trying to do without the API. I would create a simple batch file that does a DIR and then writes that list of files to a simple text file. That text file could then be the data source for the next TI process, which would read the list of files. On the Data tab I could call the applicable TI that loads that particular file, or a generic TI if I also used parameters as part of the call to change the data source.
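A rough sketch of the Data tab of the driver process tomok describes, assuming the batch file's DIR output is set as this process's text data source, that each record comes through in a variable named vFileName, and that the loading process is the "Upload Information" process from the first post with the pDataSource parameter discussed above (the C:\Import folder and the log file name are made-up placeholders):
# Data tab (sketch): runs once per record of the file list produced by the batch file.
nResult = ExecuteProcess('Upload Information', 'pDataSource', 'C:\Import\' | vFileName);
IF(nResult <> ProcessExitNormal());
  # the called process did not finish cleanly, so log which file failed
  ASCIIOutput('C:\Import\import_errors.log', 'Load failed for: ' | vFileName);
ENDIF;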
Re: using parameters for data source
Posted: Tue Feb 01, 2011 10:39 pm
by Alan Kirk
tomok wrote:Since I don't know what it is you are trying to accomplish, business-wise, it's hard for me to offer any suggestions (especially since I don't know what you mean by a "new" file). However, what makes you think you need to use the API to cycle through a list of files and load the data to TM1? Are you aware that 1) you can shell out to an external program (like a batch file or simple VB program) from within a TM1 process, and 2) you can call other TI processes from a TI process? I could accomplish what I think it is you are trying to do without the API. I would create a simple batch file that does a DIR and then writes that list of files to a simple text file. That text file could then be the data source for the next TI process, which would read the list of files. On the Data tab I could call the applicable TI that loads that particular file, or a generic TI if I also used parameters as part of the call to change the data source.
That is a seriously good idea. It would save a lot of API heartburn and, with the exception of the batch file, would keep everything "in house" within the TM1 server, with no need to have it triggered by an external application running the API code.
Re: using parameters for data source
Posted: Tue Feb 01, 2011 10:46 pm
by Martin Ryan
While we're chucking ideas around, you might find the WildcardFileSearch and FileExists TI functions useful. Combined with a batch file to move files once they're imported (which you can call via ExecuteCommand), you can automate everything fairly easily.
Martin
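For what it's worth, here is a minimal sketch of how WildcardFileSearch and ExecuteCommand could be strung together in the Prolog of a wrapper process with no data source. The folder paths are assumptions, the process and parameter names are carried over from earlier in the thread, and moving each file after it is loaded is what lets the search restart from scratch each time:
# Prolog tab (sketch): loop over every *.txt file in an assumed import folder
# and hand each one to the loading process via its pDataSource parameter.
sFolder = 'C:\Import\';
sFile = WildcardFileSearch(sFolder | '*.txt', '');
WHILE(sFile @<> '');
  ExecuteProcess('Upload Information', 'pDataSource', sFolder | sFile);
  # move the processed file to a Done folder (assumed to exist) so it is not picked up again
  ExecuteCommand('cmd /c move "' | sFolder | sFile | '" "C:\Import\Done"', 1);
  # the file has been moved, so restart the search from the beginning
  sFile = WildcardFileSearch(sFolder | '*.txt', '');
END;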
Re: using parameters for data source
Posted: Wed Feb 23, 2011 11:01 pm
by winsonlee
Hey,
thanks for the guidance you guys have given; it has been very helpful.
It does help me to look at an issue in different ways rather than always using the API to achieve the same result.
I used tomok's idea and it works brilliantly. Now I am able to import these files very efficiently.