I think this is a small bug, seen in CX 10.2 or thereabouts.
If I create a consolidation in the Prolog and put it in a view used as the datasource of that same TI, the process aborts, because the consolidation is not in the dimension by the time the Metadata / Data tabs access the view.
If I ProcessBreak at the end of the TI, I can see that the view gets created without issue.
Isn't this wrong? It means the TI is accessing the view before the Prolog completes / commits, so there seems to be some kind of ordering problem here.
Interestingly, using the Direct function doesn't resolve the issue; I had to step out into a sub-TI to build the consolidation.
I expect TI has always behaved like this and I've just never noticed or thought about what was going on.
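For reference, this is roughly the Prolog pattern I mean (a minimal sketch only; the cube, dimension and element names are invented and the subset/view housekeeping is trimmed):
Code: Select all
# Prolog - build the consolidation with the standard (non-Direct) functions
DimensionElementInsert('Product', '', 'New Consol', 'C');
DimensionElementComponentAdd('Product', 'New Consol', 'Widget A', 1);

# ...then build a view over it and assign it as this process's own datasource
SubsetCreate('Product', 'zTemp');
SubsetElementInsert('Product', 'zTemp', 'New Consol', 1);
ViewCreate('Sales', 'zTemp');
ViewSubsetAssign('Sales', 'zTemp', 'Product', 'zTemp');

DatasourceType = 'VIEW';
DatasourceNameForServer = 'Sales';
DatasourceCubeview = 'zTemp';
The view itself gets created fine, but the process aborts once the Metadata / Data tabs try to read it, because the consolidation hasn't been committed to the base dimension yet.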
Bug in TI Process order?
- Steve Rowe
- Site Admin
- Posts: 2464
- Joined: Wed May 14, 2008 4:25 pm
- OLAP Product: TM1
- Version: TM1 v6,v7,v8,v9,v10,v11+PAW
- Excel Version: Nearly all of them
Technical Director
www.infocat.co.uk
- qml
- MVP
- Posts: 1097
- Joined: Mon Feb 01, 2010 1:01 pm
- OLAP Product: TM1 / Planning Analytics
- Version: 2.0.9 and all previous
- Excel Version: 2007 - 2016
- Location: London, UK, Europe
Re: Bug in TI Process order?
TI commits metadata changes only once, between the Metadata and Data tabs. It does not do it at the end of the Prolog, and it's been that way for as long as I can remember. It's easy to test: put one DimensionElementInsert call in the Prolog and one in the Metadata tab. If you then add a ProcessQuit call anywhere before the Data tab, neither change will be committed. If you add a ProcessQuit call anywhere after the Metadata tab, both changes will be committed.
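In TI terms the test looks something like this (a sketch only; the dimension and element names are made up):
Code: Select all
#== Prolog ==
DimensionElementInsert('TestDim', '', 'From Prolog', 'N');
# ProcessQuit;   # quit anywhere before the Data tab and neither element is committed

#== Metadata ==
DimensionElementInsert('TestDim', '', 'From Metadata', 'N');

#== Data (or Epilog) ==
# ProcessQuit;   # quit here and both elements are committed, because the single
#                # metadata commit happens between the Metadata and Data tabs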
Kamil Arendt
- Steve Rowe
- Site Admin
- Posts: 2464
- Joined: Wed May 14, 2008 4:25 pm
- OLAP Product: TM1
- Version: TM1 v6,v7,v8,v9,v10,v11+PAW
- Excel Version: Nearly all of them
Re: Bug in TI Process order?
I'm not sure that it is 100% the same thing, Kamil.
I thought that changes made in the "local to the TI" dimension were available to that TI for the life of the TI, irrespective of when the changes are committed back to the system as a whole.
For example, if I populate a text file with 10 rows of the same value and use this as a datasource with this Metadata tab:
Code: Select all
If( Dimix('A', v1) = 0 );
  # First record: the element is not yet in the TI's local copy, so insert it
  DimensionElementInsert('A', '', v1, 'N');
Else;
  # Later records: the element is already in the local copy
  AsciiOutput('b.cma', 'a');
EndIf;
then I get 9 rows in my b.cma file: the first record inserts the element into the local copy of the dimension, so the remaining nine records fall through to the Else branch.
So there is a local copy of the dimension that the TI is using before it is fully committed.
I guess it has to be fully committed before being used against data, but I can't quite convince myself that it has always been like that... Probably just encountering a nuance that I'd just worked around previously...
Hope you had a good break and the New Year treats you well.
Cheers,
Technical Director
www.infocat.co.uk
- lotsaram
- MVP
- Posts: 3704
- Joined: Fri Mar 13, 2009 11:14 am
- OLAP Product: TableManager1
- Version: PA 2.0.x
- Excel Version: Office 365
- Location: Switzerland
Re: Bug in TI Process order?
Steve Rowe wrote: I guess it has to be fully committed before being used against data but I can't quite convince myself that it has always been like that... Probably just encountering a nuance that I'd just worked around previously...
Convince yourself. That's how it is.
What Kamil said is correct: there isn't a separate Prolog and Metadata dimension commit, only the one at the end of Metadata. The data source of a TI process can only come from the base model, so if you have edited a dimension in the Prolog, those changes can't be part of the assigned data source; they exist only in the shadow copy, not in the base model. So a support-style answer would be "this is expected behaviour".
If you made the Prolog changes to insert the consolidation using the Direct functions (DimensionElementInsertDirect and friends), then I would guess it should work, as these take the shadow dimension copy out of the equation.
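Something along these lines in the Prolog, that is (a sketch only; dimension and element names are made up, and note Steve reports above that the Direct route didn't fix it in his case):
Code: Select all
# Prolog - the Direct functions write straight to the base dimension,
# bypassing the process's shadow copy
DimensionElementInsertDirect('Product', '', 'New Consol', 'C');
DimensionElementComponentAddDirect('Product', 'New Consol', 'Widget A', 1);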
Please place all requests for help in a public thread. I will not answer PMs requesting assistance.
-
- Site Admin
- Posts: 6667
- Joined: Sun May 11, 2008 2:30 am
- OLAP Product: TM1
- Version: PA2.0.9.18 Classic NO PAW!
- Excel Version: 2013 and Office 365
- Location: Sydney, Australia
Re: Bug in TI Process order?
Steve Rowe wrote: I'm not sure that it is 100% the same thing Kamil. I thought that changes made in the "local to the TI" dimension were available to that TI for the life of the TI irrespective of when the changes are committed back to the system as a whole.
You're right... but so is Kamil. Lotsa beat me to this but I'll just add one other thing: "metadata ain't always metadata". There has always been a difference between subsets and dimension updates. Specifically, subsets do not need to go through the Metadata commit barrier to be updated. If they did it would make it damn near impossible to dynamically create a data source without having calls to other processes (which didn't exist back in version 8).
Proof?
I created a dimension with two elements, Element 1 and Element 2.
Then I created two processes. All of the code is in the Prolog of both processes. Here is the code of process 1:
Code: Select all
SC_DIM = 'zTestDim';
SC_SUBSET = 'Sample Subset';
# Create a subset and add an existing element to it
SubsetCreate( SC_DIM, SC_SUBSET );
SubsetElementInsert( SC_DIM, SC_SUBSET, 'Element 1', 1 );
# Insert a new consolidated element with the ordinary (non-Direct) function
DimensionElementInsert( SC_DIM, '', 'Total Elements', 'C' );
# Call the second process while still inside this Prolog
ExecuteProcess( 'zTest2' );
And here is the code of process 2, 'zTest2':
Code: Select all
SC_LOG = '\\tm1\tm1\Temp\SubsetVsElementTest.txt';
SC_DIM = 'zTestDim';
SC_SUBSET = 'Sample Subset';
# How many elements does the dimension have, as seen from this second process?
AsciiOutput( SC_LOG, 'The dimension ' | SC_DIM | ' contains ' | NumberToString( DimSiz( SC_DIM ) ) | ' ELEMENTS.' );
# Can this process see the subset created by the calling process?
If( SubsetExists( SC_DIM, SC_SUBSET ) = 1 );
  sMsg = 'The subset exists and has ' | NumberToString( SubsetGetSize( SC_DIM, SC_SUBSET ) ) | ' elements.';
Else;
  sMsg = 'The subset does not exist.';
EndIf;
AsciiOutput( SC_LOG, sMsg );
And this is what the output file contained:
Code: Select all
"The dimension zTestDim contains 2 ELEMENTS."
"The subset exists and has 1 elements."
The subset didn't exist until I did the SubsetCreate command in the first process.
Yet the second process not only knew that it was there but correctly identified how many elements it had.
Conclusion? Subsets get updated immediately; they don't need to wait for the Metadata commit. Elements, on the other hand, are in a different boat, at least if the non-Direct functions are used. Obviously by the time the Metadata commit happens, it is too late to include the new element in your data source because the data source will have already iterated once, on the Metadata tab.
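To put the distinction in one place, a rough sketch (cube, dimension and element names are invented):
Code: Select all
# Prolog of a single process
# Subsets (and views built on them) don't wait for the Metadata commit, so a view
# assembled here from EXISTING elements is a usable datasource for this process...
SubsetCreate('Product', 'zSource');
SubsetElementInsert('Product', 'zSource', 'Existing Element', 1);
ViewCreate('Sales', 'zSource');
ViewSubsetAssign('Sales', 'zSource', 'Product', 'zSource');
DatasourceType = 'VIEW';
DatasourceNameForServer = 'Sales';
DatasourceCubeview = 'zSource';

# ...but an element inserted with the non-Direct function below is NOT visible to
# that datasource, because it is only committed after the Metadata tab has run
DimensionElementInsert('Product', '', 'Brand New Element', 'N');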
"To them, equipment failure is terrifying. To me, it’s 'Tuesday.' "
-----------
Before posting, please check the documentation, the FAQ, the Search function and FOR THE LOVE OF GLUB the Request Guidelines.