lotsaram wrote:Well is it 200 million (2*10^8) or 2000 million (2*10^9) ??
Sorry, 200 million (2*10^8).
lotsaram wrote:In either case it would seem to be a fairly massive dimension for an OLAP cube
If I only had to store 20 million elements in the dimension, do you think the situation would change? Or is 20 million still too many?
lotsaram wrote:From the error screenshot you supplied it would seem that yes the 8GB of RAM on the server is inadequate, probably woefully. But dimensions, even huge dimensions, shouldn't be taking up too much of the total amount of memory, it is the cube data and retrieval performance that you should be really concerned about with having such large dimensions. But the cause of your immediate issue could be (or probably is) related to simply trying to browse one of these massive dimensions in the subset editor and that is what is causing the memory spike and termination of the application.
I get this error when I try to open the Cube Viewer.
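Just to get a sense of scale, here is a rough back-of-envelope calculation in Python (the per-element overhead is only my guess, not an official TM1 sizing figure), which suggests that even the dimension alone dwarfs the 8GB of RAM:

```python
# Back-of-envelope only: BYTES_PER_ELEMENT is an assumed overhead per
# dimension element, not an official TM1 figure.
BYTES_PER_ELEMENT = 500
elements = 200_000_000            # the dimension size under discussion

print(f"Dimension alone: ~{elements * BYTES_PER_ELEMENT / 1024**3:.0f} GB")   # ~93 GB

# Even with a 10x smaller overhead, the subset editor would still have to
# materialise the whole element list to browse it, hence the memory spike.
print(f"With 50 bytes/element: ~{elements * 50 / 1024**3:.1f} GB")            # ~9.3 GB
```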
lotsaram wrote:Given that your last question was on how to construct a hierarchy in a time dimension I suggest you go and get some training or at the very least invest in 1 or 2 days of consulting from someone who does know what they are doing with TM1 to look at your design and maybe redesign it, because dimensions of that size just doesn't sound right.
My huge dimensions are the following:
- hospital structure data (patient name, department name, type of diagnosis, etc.)
- the measure dimension
The other dimensions cover time, plus 3 smaller ones.
In my tests I'm checking how the TM1 server behaves under the data load that would build up over 10 years.
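To show where the 200 million elements come from, here is an illustrative count (the yearly figures are invented placeholders, not my real hospital data):

```python
# Purely illustrative numbers showing how patient-level detail drives the
# element count over a 10-year load; these are NOT real figures from my data.
patients_per_year = 20_000_000    # assumed admissions per year
years = 10
departments = 200                 # assumed
diagnoses = 15_000                # assumed (e.g. ICD-style codes)

total_elements = patients_per_year * years + departments + diagnoses
print(f"Dimension elements after {years} years: ~{total_elements:,}")
# ~200,015,200 -> patient names account for almost all of it
```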
Thanks again for your valuable advice.