
Memory or Caching Issue

Posted: Wed Nov 25, 2009 11:12 pm
by Ajay
Hi Everyone

I have a strange one which I'd like to get your thoughts on. I have a chore made up of four processes which, when run, summarises some data in one cube and then copies the summary elsewhere within the same cube.
I also have a large Excel workbook, a deck of reports, which pulls data from the version affected by the above chore.

Scenario:

1. Refreshing the Excel workbook takes about 20 seconds.

2. Run the chore, which takes about 5-10 seconds and doesn't error (logging turned off).

3. Repeat step 1, refresh the workbook, which now takes 8 minutes.

4. Re-refresh the workbook (again repeating step 1), which now takes about 20 seconds again.

It seems that immediately after the chore runs, the refresh takes ages.

I know this sounds strange, but I am a bit perplexed at what might be happening here. I have disabled the SaveDataAll in the final process of the chore. I have also created another chore containing a single process that runs ExecuteProcess on all four processes, to make sure it wasn't the chore itself mucking me around.
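For anyone wanting to replicate that diagnostic, the wrapper is just a process whose Prolog chains the four processes. A minimal TurboIntegrator sketch, with hypothetical process names standing in for the real ones:

```
# Prolog of a hypothetical wrapper process replacing the four-process chore.
# Process names are placeholders - substitute your own.
ExecuteProcess('Summarise.Step1');
ExecuteProcess('Summarise.Step2');
ExecuteProcess('Summarise.Step3');
ExecuteProcess('Summarise.CopySummary');
```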

My concern is that this is just one chore, and I have many running every hour. I have 200-plus users and, understandably, they are now seeing real performance issues.

Have any of you come across this, and if so, what could the solution be?

By the way, the processes construct views of data which are then summarised and posted. Would a better approach be to spit out an ASCII file and import it back in?
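If the export route is worth testing, the export side is usually just an ASCIIOutput call in the Data tab of a process driven by the source view. A rough sketch, with hypothetical variable and file names:

```
# Data tab of a hypothetical export process whose data source is a view
# over the summarised slice. One record per cell: element names plus value.
ASCIIOutput('summary_export.csv', vVersion, vAccount, vMonth, NumberToString(vValue));
```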

Your thoughts appreciated

Ajay

Re: Memory or Caching Issue

Posted: Wed Nov 25, 2009 11:26 pm
by Alan Kirk
Ajay wrote: 1. Refreshing the Excel workbook takes about 20 seconds.

2. Run the chore, which takes about 5-10 seconds and doesn't error (logging turned off).

3. Repeat step 1, refresh the workbook, which now takes 8 minutes.

4. Re-refresh the workbook (again repeating step 1), which now takes about 20 seconds again.

It seems that immediately after the chore runs, the refresh takes ages.
That's pretty much what I'd expect to happen. When you run the chore and change your values in the cube, it'll most likely dump all of the previously calculated values.

If you have performance monitor running (or if you don't, turn it on), keep an eye on the }StatsByCube cube. Specifically, watch the Number of Stored Calculated Cells. (I'm not sure whether this is consolidations only or rules calculations as well, but I'd imagine both. Edit: I just tested that by doing an N level browse of a cube which is entirely rules-based. It's inactive at this time of the week and had 0 calculated cells. The number went from 0 to 7 as expected, so since all of the values were N level this metric must include rules values, not just consolidations.)
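For reference, besides starting the performance monitor manually from Server Explorer, it can be set to start with the server via a parameter in the server's configuration file (assuming a standard tm1s.cfg setup):

```
# tm1s.cfg - start the performance monitor automatically at server startup,
# so the }Stats control cubes are populated from the outset.
PerformanceMonitorOn=T
```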

I'll lay you odds that as soon as you run your chore (assuming that you allow a minute before you recalculate your spreadsheet) you'll see the number go from some value to 0. Then when you recalculate your workbook and check after the 8 minutes or so, it'll be back up to somewhere near the original number. The second calculation of the workbook won't need to recalculate those values since they'll already be in memory, but as soon as you run the chore again... bang, all of the calculated values are gone.

You may want to look at seeing whether you can summarise the data into a different cube rather than the same one; a cube with as few calculations as possible.

Re: Memory or Caching Issue

Posted: Thu Nov 26, 2009 2:12 pm
by mattgoff
A band-aid while you investigate a real solution is ViewConstruct. Create a view similar to the results pulled by your Excel report and call ViewConstruct in the Epilog of your summarisation process. It won't shorten the time TM1 takes to re-calculate all of the cells, obviously, but it may improve apparent performance for the affected users.
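Concretely, the Epilog call is a one-liner (cube and view names here are hypothetical; the view needs to exist as a stored public view resembling the report's layout):

```
# Epilog of the summarisation process: pre-calculate and cache the view
# so the first user to refresh the report doesn't pay the full calculation cost.
ViewConstruct('Reporting', 'ReportCacheView');
```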

Matt