
Optimizing query performance with large dimension (1.5 mil)

Posted: Tue Feb 12, 2013 1:53 am
by fleaster
Hi all,
I'm currently building a cube used for querying Customer information, but I'm running into a few performance issues as the main dimension contains 1-1.5 million customers/elements (+attributes), all of which need to consolidate to top-level and mid-level rollups. The cube does respond to queries, but the response time is not as good as I would like it to be (40-60 seconds).
(Note that the response time when referencing the cube via TI or inter-cube rules is fine.)

Things I have done so far to address the query response time:

(a) trying with and without rules, and optimizing feeders
(b) trying an optimized cube dimension order
(c) consolidating structures and reducing the number of dimensions in the cube

...these seem to have minimal impact on the base response time, so it seems to come down to the large number of elements in the customer dimension... so if anyone has any ideas on how to improve this, it would be much appreciated :)

Note:
Things I haven't tried yet, as I wasn't sure whether they would have a significant impact on query response time:
(d) removing element attributes (and possibly referencing these elsewhere in a separate cube)
(e) creating a Stargate view...?

Re: Optimizing query performance with large dimension (1.5 mil)

Posted: Tue Feb 12, 2013 8:18 am
by declanr
If it is purely a reporting cube and the numbers will stay the same (other than after a data load), you should try playing with the VMM and VMT settings... I can never remember which is which off the top of my head.

Just remember to have enough memory.
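For reference, VMM and VMT are per-cube properties held in the }CubeProperties control cube: VMM is the amount of memory reserved for caching calculated (Stargate) views of the cube, and VMT is the number of seconds a view calculation must take before TM1 will cache it. A rough TurboIntegrator sketch for setting them is below - the cube name 'Customer Analysis' and both values are only placeholders, and on some TM1 versions the change only takes effect after a server restart.

Code:

# Prolog: set view caching properties for one reporting cube
# assumption: the cube is called 'Customer Analysis'; values are placeholders
# VMM - memory reserved for Stargate view caching (check the units for your version)
CellPutS( '100000', '}CubeProperties', 'Customer Analysis', 'VMM' );
# VMT - seconds a view must take to calculate before TM1 caches it
CellPutS( '2', '}CubeProperties', 'Customer Analysis', 'VMT' );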

Re: Optimizing query performance with large dimension (1.5 mil)

Posted: Tue Feb 12, 2013 9:17 am
by paulsimon
Hi

If you are opening the Subset Editor, the first thing to check is that the Properties window is turned off.

Are you using any MDX expressions in the subset? If so, try replacing them with static subsets.
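As a rough sketch of the static alternative, the TurboIntegrator process below rebuilds a plain subset containing only the leaf (N-level) customers, so nothing has to evaluate MDX against 1.5 million elements at query time. The dimension name 'Customer' and subset name 'Leaf Customers' are assumptions, not names from your model.

Code:

# Prolog: rebuild a static subset of leaf customers
sDim = 'Customer';
sSub = 'Leaf Customers';
IF( SubsetExists( sDim, sSub ) = 1 );
   SubsetDeleteAllElements( sDim, sSub );
ELSE;
   SubsetCreate( sDim, sSub );
ENDIF;
nMax = DIMSIZ( sDim );
nIdx = 1;
nPos = 1;
WHILE( nIdx <= nMax );
   sEl = DIMNM( sDim, nIdx );
   # keep only numeric (leaf) elements
   IF( DTYPE( sDim, sEl ) @= 'N' );
      SubsetElementInsert( sDim, sSub, sEl, nPos );
      nPos = nPos + 1;
   ENDIF;
   nIdx = nIdx + 1;
END;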

Do you have zero suppression turned on?

Are there string elements in the cube? If so, are these only in the last dimension, and do they appear after any numerics?

I am not sure whether you have all elements consolidating into a single top-level element. In the past, some have found it helps to add intermediate consolidations so that no consolidation has a fan-out of more than about 10 children.

What other dimensions do you have in the cube? How big are those?

What is the view that you are trying to retrieve that is taking a long time?

Have you checked that the cube is not running on virtual memory?

Regards

Paul Simon

Re: Optimizing query performance with large dimension (1.5 mil)

Posted: Tue Feb 12, 2013 11:13 am
by fleaster
Hi all,
thanks for the tips - in the end I found the following changes improved the query response time dramatically (from 40 seconds down to 5):

(a) adding intermediate consolidations (e.g. grouping by the first 4 digits of the customer number) worked really well - a rough sketch of this is included after this list

(b) reducing the number of dimensions from 10 to 7 also helped (I think the extra dimensions may have been too sparse)
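For anyone wanting to try (a), a rough TurboIntegrator sketch is below. It assumes a dimension called 'Customer' with an existing top consolidation 'Total Customers' and all leaves parented directly to it - those names are placeholders, not the actual model - and it re-parents each leaf under an intermediate consolidation based on the first 4 characters of the customer code.

Code:

# Prolog of a process with no data source: add intermediate consolidations
sDim = 'Customer';
sTop = 'Total Customers';
nMax = DIMSIZ( sDim );
nIdx = 1;
WHILE( nIdx <= nMax );
   sCust = DIMNM( sDim, nIdx );
   IF( DTYPE( sDim, sCust ) @= 'N' );
      # one intermediate consolidation per leading 4 characters of the code
      sGrp = 'Grp ' | SUBST( sCust, 1, 4 );
      IF( DIMIX( sDim, sGrp ) = 0 );
         DimensionElementInsert( sDim, '', sGrp, 'C' );
         DimensionElementComponentAdd( sDim, sTop, sGrp, 1 );
      ENDIF;
      DimensionElementComponentAdd( sDim, sGrp, sCust, 1 );
      # remove the old direct link so the leaf is not double-counted at the top
      DimensionElementComponentDelete( sDim, sTop, sCust );
   ENDIF;
   nIdx = nIdx + 1;
END;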

...so thanks all for the advice :)

Matt