Hi,
I am transferring data within the cube from one year to another through the TI process. For example - transferring Net Sales of 2013 into 2014.
This activity is taking a lot of time because of the feeders written in the cube.
I disabled all the feeders and the process completed within no time.
Is there any code, using TI functions, by which we can disable the feeders in the PROLOG
and enable them again in the EPILOG?
I used this TI function: RuleLoadFromFile(),
but it also takes a significant amount of time in the epilog (while re-attaching the rule file).
PROLOG Code:
RuleLoadFromFile('Test','D:\Roy\Before.txt');
EPILOG Code:
RuleLoadFromFile('Test','D:\Roy\After.txt');
Any help would be much appreciated.
Regards,
Roy.
Disabling FEEDERS in TI
- qml
- MVP
- Posts: 1098
- Joined: Mon Feb 01, 2010 1:01 pm
- OLAP Product: TM1 / Planning Analytics
- Version: 2.0.9 and all previous
- Excel Version: 2007 - 2016
- Location: London, UK, Europe
Re: Disabling FEEDERS in TI
roy2087 wrote:Is there any code, using TI functions, by which we can disable the feeders in the PROLOG and enable them again in the EPILOG?

Yes, the code you provided will do just that.

roy2087 wrote:I used this TI function: RuleLoadFromFile(), but it also takes a significant amount of time in the epilog (while re-attaching the rule file).

What else would you expect? The feeding has to take time, either during the data load or after it. Whether one approach or the other is more optimal will differ in each case, so you have to test and decide what works for you. You will, however, not be able to have feeding happen automagically in zero time, which you seem to be expecting.
Kamil Arendt
-
- Posts: 42
- Joined: Fri Apr 19, 2013 7:07 pm
- OLAP Product: TM1
- Version: 10.1 RP1 FP1
- Excel Version: 2003 SP3
Re: Disabling FEEDERS in TI
I think the "rock and a hard place" Roy is stuck between is that with active feeders, TM1 will fire feeders for each individual cell update as the transfer proceeds, while if he deactivates feeders by detaching and later reattaching the feeder statements, TM1 will re-fire feeders for *all* data in the cube. While the latter won't be as expensive as the initial processing of feeders at server load, it probably isn't ideal. Perhaps BatchUpdate could be used to group the changes and perform a single feeder processing run on only the new data, in batch, at the completion of the transfer. This may be faster, but you'd have to test. If the transfer can be performed during downtime (no general user activity), perhaps BulkLoadMode can alternatively be used to further optimize the feeding?
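A minimal TI sketch of the BatchUpdate idea suggested above (the cube name, dimension order and variable names are hypothetical, and the conflict-handling flag passed to BatchUpdateFinish should be checked against the documentation for your version):

```
# PROLOG: defer feeder processing until the batch is finished.
BatchUpdateStart;

# DATA tab (runs once per record of the source view):
# hypothetical 'Sales' cube and dimension order.
CellPutN(vNetSales, 'Sales', vRegion, '2014', 'Net Sales');

# EPILOG: merge the batched changes back into the model and
# process feeders for the new data in a single pass. The flag
# governs what happens if other users changed the same cells.
BatchUpdateFinish(0);
```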
-
- MVP
- Posts: 3241
- Joined: Mon Dec 29, 2008 6:26 pm
- OLAP Product: TM1, Jedox
- Version: PAL 2.1.5
- Excel Version: Microsoft 365
- Location: Brussels, Belgium
- Contact:
Re: Disabling FEEDERS in TI
Hello
My view on this: yes, please review all the suggestions made.
But first of all: have your feeders been written in a good, optimised way?
Could it be that a cube or set of cubes is far too heavy (and TI should be used instead of rules and feeders)?
Is the long load time for feeders to be expected?
Best regards,
Wim Gielis
IBM Champion 2024-2025
Excel Most Valuable Professional, 2011-2014
https://www.wimgielis.com ==> 121 TM1 articles and a lot of custom code
Newest blog article: Deleting elements quickly
-
- Regular Participant
- Posts: 424
- Joined: Sat Mar 10, 2012 1:03 pm
- OLAP Product: IBM TM1, Planning Analytics, P
- Version: PAW 2.0.8
- Excel Version: 2019
Re: Disabling FEEDERS in TI
TableManagerOne wrote:Perhaps BatchUpdate could be used to group the changes and perform a single feeder processing run on only the new data, in batch, at the completion of the transfer. [...] If the transfer can be performed during downtime (no general user activity), perhaps BulkLoadMode can alternatively be used to further optimize the feeding?

Apologies if I didn't get this correctly. Wouldn't batch update put a lock on cubes and other objects? I agree you suggested running it outside business hours, but whenever an end user opens a view, doesn't TM1 fire all the feeders again? How would batch update help in that scenario? I'd appreciate it if the experts could clarify. Thanks.
"You Never Fail Until You Stop Trying......"
- qml
- MVP
- Posts: 1098
- Joined: Mon Feb 01, 2010 1:01 pm
- OLAP Product: TM1 / Planning Analytics
- Version: 2.0.9 and all previous
- Excel Version: 2007 - 2016
- Location: London, UK, Europe
Re: Disabling FEEDERS in TI
BariAbdul wrote:But whenever an end user opens the view doesn't TM1 fire all the feeders again?

Absolutely not. There are certain events that trigger feeder reprocessing, and opening a view is not one of them (unless the cube in question had been unloaded from memory and needs to be loaded back on demand).
Kamil Arendt
-
- Posts: 42
- Joined: Fri Apr 19, 2013 7:07 pm
- OLAP Product: TM1
- Version: 10.1 RP1 FP1
- Excel Version: 2003 SP3
Re: Disabling FEEDERS in TI
As qml said, opening a view generally doesn't process feeders. When a cell first becomes populated, feeder statements whose area definition includes that cell are evaluated. This typically happens at server load (when persistent feeders are not in use), when an unloaded cube is forced to load, or when input (via TI, spreading, etc.) occurs.
What I was suggesting was that Batch Update might be used to force the feeder evaluation of many cells to happen as a batch, rather than for every cell. Generally, batch operations are more optimal because the overhead of switching between tasks is avoided. Batch Update is an older technology with limitations. For example, while in Batch Update mode, aggregations and calculations will not work (as they wouldn't take the batched updates into account). However, Batch Update doesn't impose any extra locking, and can be used concurrently by any number of threads. When the batch update finishes, the changes are merged into the base model as a standard write-back (i.e. with Parallel Interaction, a non-blocking write).
Bulk Load Mode is a newer technology meant primarily to address the extra overhead that came with the locking model changes of version 9.1. Unlike Batch Update, it forces the server into a single-threaded session (disconnecting any user threads and temporarily deactivating any active chores). Under single-threaded operation, the server can avoid segregating that one thread's object/data maintenance into the shadow objects needed to support undoing the changes in the event of a lock conflict (rollback); instead, the thread writes directly to the objects. This can have a drastic effect on large data loads/transfers and their resulting feeder processing.
I agree with Wim that you should first attempt to optimize the feeders. Better to fix the problem at the source rather than downstream. Also, both approaches I suggest are untested and have their limitations, but I figured they were worth mentioning.
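For completeness, a rough sketch of the Bulk Load Mode pattern described above (remember that it disconnects user threads, so it is only suitable for a downtime window):

```
# PROLOG: switch the server into single-threaded bulk load mode;
# user threads are disconnected and active chores are suspended.
EnableBulkLoadMode;

# ... the year-to-year transfer runs in the Data tab ...

# EPILOG: return the server to normal multi-threaded operation.
DisableBulkLoadMode;
```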
Re: Disabling FEEDERS in TI
It might be a terrible idea, but perhaps use conditional feeders
in combination with the TI, or feed from a tiny cube into the CellPutN target area?
Code:
Feeders;
['Volume'] => DB(If(DB('Parameters', '2014', 'Feed Me')@='Y', 'GLCube', ''), 'Actual', '2014', 'Sales');
Yeon
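If that flag approach is used, the flag can be flipped from within the TI process itself. Note that changing the flag value does not by itself re-fire feeders that have already been evaluated, so a CubeProcessFeeders call on the cube holding the feeder statement would likely be needed. A sketch, with the cube name holding the feeder ('SourceCube') purely hypothetical and the 'Parameters' references matching the rule above:

```
# EPILOG: switch feeding on for 2014 once the transfer is done.
CellPutS('Y', 'Parameters', '2014', 'Feed Me');

# Re-evaluate the feeders of the (hypothetical) cube that
# contains the conditional feeder statement.
CubeProcessFeeders('SourceCube');
```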