Clarification about memory consumption

Rosanero4Ever
Posts: 54
Joined: Thu Oct 18, 2012 5:08 pm
OLAP Product: Cognos TM1
Version: 10.1.1
Excel Version: 2010
Location: Italy

Re: Clarification about memory consumption

Post by Rosanero4Ever »

qml wrote:Rosanero, why did it have to take almost 20 posts (and counting) to pull out information you should have provided in your original post? Why oh why do people do that to others? Not everyone has the time to spend all day trying to get bits of essential information out of you.
I thought the initial information was good... I'm sorry...
qml wrote:You still haven't answered the questions about your original requirements. Do you even need patient-level detail in there? What for? Typically, an OLAP database (yes, it's a DB too) doesn't store this level of detail as it's rarely needed for analysis. And I honestly can't believe you will have 200 million or even 20 million patients over 10 years. Unless you really meant 10 thousand years..
As I wrote, I need the patient level because the cube will also be used in BI. And, as I wrote, I'm testing my TM1 installation (maybe I exaggerated).
Now I'm trying with 5 million elements (a plausible number if I consider a multi-year period).
qml wrote:As my predecessors have already said, you are probably trying to load way too much data compared to the amount of RAM you have (the load times you quote support this theory). TM1 is pretty scalable, so I'm guessing your model could unfortunately actually work if you had sufficient RAM, but it still sounds like your design is all wrong. It boils down to your original requirements, which you aren't sharing.
qml, my initial question was intended to understand the cause of the problem. As you and your predecessors have said, the problem is probably my 8 GB of RAM (too little). That's enough for me to know. Knowing this, I'll modify my design or use another solution.

I apologize for the many posts; my intent is not to waste anyone's time.

I renew my thanks to all
Rosanero4Ever
Posts: 54
Joined: Thu Oct 18, 2012 5:08 pm
OLAP Product: Cognos TM1
Version: 10.1.1
Excel Version: 2010
Location: Italy

Re: Clarification about memory consumption

Post by Rosanero4Ever »

tomok wrote:Once again, did you bother to peruse the sizing document from IBM??????? All this bother could have been prevented had you spent an hour or so reviewing your design, the amount of data, and your server setup to see if it is going to fit. As I said before, no model with 200 million loaded intersections is going to fit into an 8GB server.
tomok, I just wrote in my previous post that I read the document you recommended. As I wrote, I'll read the document again more carefully.
I'm sorry I bothered you...
User avatar
qml
MVP
Posts: 1098
Joined: Mon Feb 01, 2010 1:01 pm
OLAP Product: TM1 / Planning Analytics
Version: 2.0.9 and all previous
Excel Version: 2007 - 2016
Location: London, UK, Europe

Re: Clarification about memory consumption

Post by qml »

Rosanero4Ever wrote:As I wrote, I need the patient level because the cube will be used also in BI.
Not sure what you mean, but it does not look like a good reason to me. It doesn't look like any sort of reason, actually. By BI do you mean Cognos BI? Do you really expect to use 5/20/200 million dimension elements in Analysis Studio or Report Studio?

Have you thought of having aggregated analysis-level data in a TM1 cube with a drill-down to your relational DB when more details, like patient, are requested?
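To illustrate the idea, here is a minimal sketch of pre-aggregating patient-level facts to analysis level before the load into TM1, so that only the summarized data lives in the cube and patient detail stays in the relational database for drill-through. The column names and figures are purely hypothetical:

```python
import pandas as pd

# Hypothetical patient-level fact rows; column names are illustrative only.
facts = pd.DataFrame({
    "region":     ["North", "North", "South", "South"],
    "month":      ["2012-10", "2012-10", "2012-10", "2012-11"],
    "patient":    ["P001", "P002", "P003", "P004"],
    "admissions": [1, 2, 1, 3],
})

# Aggregate away the patient dimension: only region/month totals would be
# loaded into the TM1 cube; the patient grain remains in the relational DB.
analysis_level = (
    facts.groupby(["region", "month"], as_index=False)["admissions"].sum()
)
print(analysis_level)
```

The same groupby-and-sum could of course be done in the source SQL instead; the point is simply that the cube holds the analysis grain while the relational database answers the patient-level drill-through requests.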
Kamil Arendt
Rosanero4Ever
Posts: 54
Joined: Thu Oct 18, 2012 5:08 pm
OLAP Product: Cognos TM1
Version: 10.1.1
Excel Version: 2010
Location: Italy

Re: Clarification about memory consumption

Post by Rosanero4Ever »

qml wrote:Not sure what you mean, but it does not look like a good reason to me. It doesn't look like any sort of reason, actually. By BI do you mean Cognos BI?
Yes, I do. This patient detail level is a client requirement
qml wrote: Do you really expect to use 5/20/200 million dimension elements in Analysis Studio or Report Studio?
The client would like to be able to see even an individual patient on a particular day. I share your doubt, but it's a strict requirement...
User avatar
qml
MVP
Posts: 1098
Joined: Mon Feb 01, 2010 1:01 pm
OLAP Product: TM1 / Planning Analytics
Version: 2.0.9 and all previous
Excel Version: 2007 - 2016
Location: London, UK, Europe

Re: Clarification about memory consumption

Post by qml »

When designing a solution, requirements are only a small part of all the things you need to consider. Software restrictions, best practices and hardware sizing are some of the other factors that should be weighed when deciding how to address the requirements. In this case, as in almost all cases, there is more than one design that meets the requirements you quote.

Edit: Rosanero, can you please write any further questions on the subject directly in this thread, not via PM? This way other people can benefit from the answers as well.

The question of how to create a drill-down in Cognos BI is not strictly a TM1 question (drill-down to relational databases can also be done in pure TM1 in a different way), but here are some options for you:

1) Dynamic drill through defined in your Framework Manager Package. Check in the Administration and Security Guide for more details on how to do it.

2) A separate Framework Manager Package on your relational database with a separate 'drill-down' report created on top of it, sharing a parameter with your original TM1-based report which would link to it.
Kamil Arendt
lotsaram
MVP
Posts: 3706
Joined: Fri Mar 13, 2009 11:14 am
OLAP Product: TableManager1
Version: PA 2.0.x
Excel Version: Office 365
Location: Switzerland

Re: Clarification about memory consumption

Post by lotsaram »

Rosanero4Ever wrote:... As you and your predecessors have said, the problem is probably my 8 GB of RAM (too little). That's enough for me to know. Knowing this, I'll modify my design or use another solution.
Sorry to be blunt but I fear that as opposed to the limit of 8GB of RAM the real problem is that you don't know what you are doing and are out of your depth.
Rosanero4Ever
Posts: 54
Joined: Thu Oct 18, 2012 5:08 pm
OLAP Product: Cognos TM1
Version: 10.1.1
Excel Version: 2010
Location: Italy

Re: Clarification about memory consumption

Post by Rosanero4Ever »

qml wrote: 1) Dynamic drill through defined in your Framework Manager Package. Check in the Administration and Security Guide for more details on how to do it.

2) A separate Framework Manager Package on your relational database with a separate 'drill-down' report created on top of it, sharing a parameter with your original TM1-based report which would link to it.
OK on the drill-through... I didn't understand that you were referring to drill-through because you wrote "drill-down".
Thanks for your advice. Now it's clearer.
User avatar
qml
MVP
Posts: 1098
Joined: Mon Feb 01, 2010 1:01 pm
OLAP Product: TM1 / Planning Analytics
Version: 2.0.9 and all previous
Excel Version: 2007 - 2016
Location: London, UK, Europe

Re: Clarification about memory consumption

Post by qml »

Rosanero4Ever wrote:OK on the drill-through... I didn't understand that you were referring to drill-through because you wrote "drill-down".
You are right. In this case the term "drill through" is more appropriate. In TM1 terminology it's often just a "drill".
Kamil Arendt
Rosanero4Ever
Posts: 54
Joined: Thu Oct 18, 2012 5:08 pm
OLAP Product: Cognos TM1
Version: 10.1.1
Excel Version: 2010
Location: Italy

Re: Clarification about memory consumption

Post by Rosanero4Ever »

lotsaram wrote:Sorry to be blunt but I fear that as opposed to the limit of 8GB of RAM the real problem is that you don't know what you are doing and are out of your depth.
No problem ;)
I'm still learning. I'll surely do better after this (long) discussion, which is proving very useful for clarifying many doubts in my mind.
I renew my thanks to all
tomok
MVP
Posts: 2836
Joined: Tue Feb 16, 2010 2:39 pm
OLAP Product: TM1, Palo
Version: Beginning of time thru 10.2
Excel Version: 2003-2007-2010-2013
Location: Atlanta, GA
Contact:

Re: Clarification about memory consumption

Post by tomok »

Rosanero4Ever wrote:tomok, I just wrote in my previous post that I read the document you recommended. As I wrote, I'll read the document again more carefully.
It's not that complicated. Take the number of rows of data in your relational source, assuming each metric is a single row, and multiply that by around 20 (the number of bytes each populated cell consumes in a 64-bit environment), and that is how many bytes of RAM you need just for the base data, ignoring any consolidations. Is that greater than 6,000,000,000?
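The back-of-the-envelope arithmetic above can be sketched as follows. The ~20 bytes per populated cell and the ~6 GB usable on an 8 GB server are the rules of thumb quoted in this thread, not official IBM figures, and the estimate covers base data only — consolidations and dimension overhead would add substantially on top:

```python
BYTES_PER_CELL = 20              # rough rule of thumb per populated numeric cell, 64-bit TM1
AVAILABLE_RAM = 6_000_000_000    # ~6 GB usable on an 8 GB server, per the thread

def base_data_bytes(rows):
    """Estimate RAM for base data only: one populated cell per source row.

    Ignores consolidations, attributes and dimension overhead, which can
    multiply the real footprint several times over.
    """
    return rows * BYTES_PER_CELL

# Compare the element counts discussed in the thread against available RAM.
for rows in (5_000_000, 20_000_000, 200_000_000):
    needed = base_data_bytes(rows)
    print(f"{rows:>12,} rows -> {needed / 1e9:5.1f} GB base data "
          f"(available: {AVAILABLE_RAM / 1e9:.0f} GB)")
```

Note that even when the base-data estimate alone squeezes under the limit, the consolidation and overhead multiplier is exactly why tomok's warning about 200 million intersections on an 8 GB server stands.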
Tom O'Kelley - Manager Finance Systems
American Tower
http://www.onlinecourtreservations.com/
Post Reply