TM1 server Message:system outofmemory"
-
- Regular Participant
- Posts: 155
- Joined: Fri May 20, 2011 8:17 am
- OLAP Product: Applix, Cognos TM1
- Version: Applix 9.0, Cognos TM1 9.5.1
- Excel Version: Excel 2010 2007
TM1 server Message:system outofmemory"
Hello,
I would appreciate some help with the error below.
When a user tries to log in to TM1, he cannot log in and receives the following error message:
"TM1 server Message:system outofmemory"
Any suggestions as to why he is getting this message?
RAM size is: 4 GB
Data model size is: 5.92 GB
OS is Windows 2003 x64 SP2.
Thanks in advance.
- Alan Kirk
- Site Admin
- Posts: 6610
- Joined: Sun May 11, 2008 2:30 am
- OLAP Product: TM1
- Version: PA2.0.9.18 Classic NO PAW!
- Excel Version: 2013 and Office 365
- Location: Sydney, Australia
- Contact:
Re: TM1 server Message:system outofmemory"
mincharug.shulft wrote:
Hello,
When a user tries to log in to TM1, he cannot log in and receives the following error message:
"TM1 server Message:system outofmemory"
Any suggestions as to why he is getting this message?
RAM size is: 4 GB
Data model size is: 5.92 GB
OS is Windows 2003 x64 SP2.

Are you sure that the numbers above aren't giving you an indication of why that might be, assuming that by "model size" you're citing memory usage rather than disk usage? (Since disk usage has nothing to do with it.)
The message typically comes up when the server uses up all of the available memory, crashes and, assuming that you have the service set to restart, restarts itself.
Leaving aside the fact that your cited model size is larger than your cited RAM size, I think that 4 GB is a little light for a 64-bit system as well, since it will obviously use more physical memory than a 32-bit one does. I'd look at throwing more memory at your server. A lot more memory.
"To them, equipment failure is terrifying. To me, it’s 'Tuesday.' "
-----------
Before posting, please check the documentation, the FAQ, the Search function and FOR THE LOVE OF GLUB the Request Guidelines.
-
- MVP
- Posts: 214
- Joined: Tue Nov 11, 2008 11:57 pm
- OLAP Product: TM1, CX
- Version: TM1 7x 8x 9x 10x CX 9.5 10.1
- Excel Version: XP 2003 2007 2010
- Location: Hungary
Re: TM1 server Message:system outofmemory"
mincharug.shulft wrote:
Hello,
When a user tries to log in to TM1, he cannot log in and receives the following error message:
"TM1 server Message:system outofmemory"
RAM size is: 4 GB
Data model size is: 5.92 GB
OS is Windows 2003 x64 SP2.

Buying/allocating more RAM is the fastest and probably the cheapest solution.
If you cannot do that, then you have to reduce the size of the model in memory:
1) optimize the feeders (TM1 expert required)
2) delete part of the historical data (consult the key business users in advance)
BTW: do you have a LoggingDirectory= line in the tm1s.cfg file?
The size of the model directory can be misleading if your model is logging into the model directory.
You can separate them with the LoggingDirectory parameter.
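For reference, what Peter describes amounts to a two-line tm1s.cfg change. A minimal sketch, assuming the server's data sits on one drive and you want the logs on another (the directory paths here are illustrative, not taken from the original post):

```
; tm1s.cfg excerpt - paths are examples only
DataBaseDirectory=D:\TM1\Data
LoggingDirectory=E:\TM1\Logs
```

With the logs pointed elsewhere, the size of the data directory on disk reflects only the model files themselves, so a right-click-Properties check is no longer inflated by transaction logs.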
Kind regards,
Peter
-
- Regular Participant
- Posts: 155
- Joined: Fri May 20, 2011 8:17 am
- OLAP Product: Applix, Cognos TM1
- Version: Applix 9.0, Cognos TM1 9.5.1
- Excel Version: Excel 2010 2007
Re: TM1 server Message:system outofmemory"
How does it really help us once we reduce the data model size? And even if we add the LoggingDirectory parameter, aren't the logs still in memory, just on some other drive? How does that really work?
Can you please explain in a little more detail?
Because in my experience, all the data that we are using was stored in 4 GB of memory, which could not handle the usage/transactions being done by the users.
What I mean is that, from looking at the logs, there were a lot of transactions done by the users, more than 4 GB of RAM was able to handle.
Do you agree, or please advise me if I am wrong?
The only thing to do here is increase the RAM, I believe. Please correct me if I am wrong.
kpk wrote:
Buying/allocating more RAM is the fastest and probably the cheapest solution.
If you cannot do that, then you have to reduce the size of the model in memory:
1) optimize the feeders (TM1 expert required)
2) delete part of the historical data (consult the key business users in advance)
BTW: do you have a LoggingDirectory= line in the tm1s.cfg file?
The size of the model directory can be misleading if your model is logging into the model directory.
You can separate them with the LoggingDirectory parameter.
Kind regards,
Peter
- Alan Kirk
- Site Admin
- Posts: 6610
- Joined: Sun May 11, 2008 2:30 am
- OLAP Product: TM1
- Version: PA2.0.9.18 Classic NO PAW!
- Excel Version: 2013 and Office 365
- Location: Sydney, Australia
- Contact:
Re: TM1 server Message:system outofmemory"
mincharug.shulft wrote:
How does it really help us once we reduce the data model size?

Your problem is that your server is crashing because it's out of memory.
And you seriously ask "How does it really help us once we reduce the data model size?"
Could you please take a moment to pause, reflect and think about that one?

mincharug.shulft wrote:
And even if we add the LoggingDirectory parameter, aren't the logs still in memory, just on some other drive? How does that really work?
Can you please explain in a little more detail?

I'll elaborate on the question that I asked before and which you didn't answer: where are you getting this "Data model size is: 5.92 GB" number from?
If it's the size on disk, as I stated before, that size is completely, utterly and comprehensively irrelevant.
I think that Peter's point was that if your log files are going into your data directory then the data size, as measured by size on disk, will be overstated.
And again I point out that size on disk is completely, utterly and comprehensively irrelevant. All that matters is the size that the model occupies in memory, which will have no direct correlation to the size on disk, especially if rules are involved.

mincharug.shulft wrote:
OS is Windows 2003 x64 SP2.

There's something that I seem to recall hearing previously, can't quite recall what it was... it'll come to me...

mincharug.shulft wrote:
The only thing to do here is increase the RAM, I believe. Please correct me if I am wrong.

Oh, right, that was it:

Alan Kirk wrote:
I think that 4 GB is a little light for a 64-bit system as well, since it will obviously use more physical memory than a 32-bit one does. I'd look at throwing more memory at your server. A lot more memory.

Even on Standard edition (R2 at least) you should be able to go to 32 GB on an x64 platform, server permitting.
"To them, equipment failure is terrifying. To me, it’s 'Tuesday.' "
-----------
Before posting, please check the documentation, the FAQ, the Search function and FOR THE LOVE OF GLUB the Request Guidelines.
-
- MVP
- Posts: 263
- Joined: Fri Jun 27, 2008 12:15 am
- OLAP Product: Cognos TM1, CX
- Version: 9.0 and up
- Excel Version: 2007 and up
Re: TM1 server Message:system outofmemory"
kpk wrote:
BTW: do you have a LoggingDirectory= line in the tm1s.cfg file?
The size of the model directory can be misleading if your model is logging into the model directory.
You can separate them with the LoggingDirectory parameter.
Peter

You are not serious, are you?
-
- MVP
- Posts: 214
- Joined: Tue Nov 11, 2008 11:57 pm
- OLAP Product: TM1, CX
- Version: TM1 7x 8x 9x 10x CX 9.5 10.1
- Excel Version: XP 2003 2007 2010
- Location: Hungary
Re: TM1 server Message:system outofmemory"
Gregor Koch wrote:
You are not serious, are you?

As Alan wrote: "I think that Peter's point was that if your log files are going into your data directory then the data size, as measured by size on disk, will be overstated."
Of course the size of the model on the disk is not relevant from a memory point of view.
For the memory issue I suggested adding more RAM or reducing the model.
Kind regards,
Peter
- Alan Kirk
- Site Admin
- Posts: 6610
- Joined: Sun May 11, 2008 2:30 am
- OLAP Product: TM1
- Version: PA2.0.9.18 Classic NO PAW!
- Excel Version: 2013 and Office 365
- Location: Sydney, Australia
- Contact:
Re: TM1 server Message:system outofmemory"
kpk wrote:
Of course the size of the model on the disk is not relevant from a memory point of view.

Yup, that's why I was trying to get him to state where he'd plucked this "5.92 GB" figure from. I'm betting, like you, that this may well be a disk number. Of course there really isn't a fixed "in memory" model size unless there are no rules and no C elements in any cube, but it was, I suppose, possible that he was quoting a startup memory usage number, which is why I tried to cover all the bases.
As Professor Walter Lewin says, "Any measure, without knowledge of its degree of uncertainty, is meaningless!" (You really need to hear him say it to get the full Lewin experience.)
This is doubly so when it's not even clear what object is being measured in the first place.
"To them, equipment failure is terrifying. To me, it’s 'Tuesday.' "
-----------
Before posting, please check the documentation, the FAQ, the Search function and FOR THE LOVE OF GLUB the Request Guidelines.
-
- MVP
- Posts: 263
- Joined: Fri Jun 27, 2008 12:15 am
- OLAP Product: Cognos TM1, CX
- Version: 9.0 and up
- Excel Version: 2007 and up
Re: TM1 server Message:system outofmemory"
Was just checking, sounded like a joke to me.
-
- Regular Participant
- Posts: 155
- Joined: Fri May 20, 2011 8:17 am
- OLAP Product: Applix, Cognos TM1
- Version: Applix 9.0, Cognos TM1 9.5.1
- Excel Version: Excel 2010 2007
Re: TM1 server Message:system outofmemory"
Hi,
The size that I specified was from the drive, by right-clicking and selecting Properties on the data model directory (size is: 5.92 GB), and
the memory that the data model was occupying was 3,929,744 KB (per Task Manager) out of 4 GB of RAM while users were using the application.
Thanks a lot to everyone here for steering me towards understanding this.
Alan Kirk wrote:
Your problem is that your server is crashing because it's out of memory.
I'll elaborate on the question that I asked before and which you didn't answer: where are you getting this "Data model size is: 5.92 GB" number from?
If it's the size on disk, as I stated before, that size is completely, utterly and comprehensively irrelevant. All that matters is the size that the model occupies in memory, which will have no direct correlation to the size on disk, especially if rules are involved.
-
- MVP
- Posts: 2832
- Joined: Tue Feb 16, 2010 2:39 pm
- OLAP Product: TM1, Palo
- Version: Beginning of time thru 10.2
- Excel Version: 2003-2007-2010-2013
- Location: Atlanta, GA
- Contact:
Re: TM1 server Message:system outofmemory"
The bottom line is you don't have enough RAM in your TM1 server. 4GB in a 64-bit TM1 server is barely enough to have the sample database that comes with the TM1 installation, let alone any working model. At a minimum you should double the RAM to 8GB and I would recommend going to 32GB or 64GB. RAM is cheap compared to human capital wasted working with systems that crash all the time.
Ever heard the saying "being penny-wise and pound-foolish"? Trying to save a few bucks by cheaping out and only buying the amount of RAM you think you need for TM1 never works out to the good. Right-size your model using the worksheet on the IBM web site and then double the estimate. I just did a policy/premium projection model for an insurance company and we sized the model out at around 2GB, given the current book of business. I had them put 64GB on the TM1 server. It was an extra few thousand dollars over the 16GB that was probably adequate. That's a drop in the bucket compared to the total cost of the project and they won't have to worry about running out of RAM for years. It will also open up the server for more uses of TM1 should they get creative.
-
- Posts: 8
- Joined: Thu Oct 27, 2011 4:31 pm
- OLAP Product: TM1
- Version: 9.5
- Excel Version: 2003
Re: TM1 server Message:system outofmemory"
mincharug.shulft wrote:
Because in my experience, all the data that we are using was stored in 4 GB of memory, which could not handle the usage/transactions being done by the users.
The only thing to do here is increase the RAM, I believe. Please correct me if I am wrong.

To put it simply:
- the main reason people use 64-bit is to utilise the extra RAM that a 32-bit architecture cannot use.
- 32-bit is limited to around 3.7GB of RAM for the whole system.
- by whole system that means the WHOLE system: operating system, other running programs like anti-virus, the whole deal.
- procuring and installing a 64-bit system with only 4GB of RAM is like buying a Ferrari then replacing the engine with one out of a Fiat 500. Pointless.
- couple that with the fact that 64-bit applications usually consume more RAM than their 32-bit versions, and it's no surprise your model is crashing all the time.
Simple
- Alan Kirk
- Site Admin
- Posts: 6610
- Joined: Sun May 11, 2008 2:30 am
- OLAP Product: TM1
- Version: PA2.0.9.18 Classic NO PAW!
- Excel Version: 2013 and Office 365
- Location: Sydney, Australia
- Contact:
Re: TM1 server Message:system outofmemory"
TheEnforcer wrote:
to procure & install a 64bit system with only 4GB of RAM is like buying a Ferrari then replacing the engine with one out of a Fiat 500. Pointless.

Bad analogy, Enforcer; that would be an improvement. Since it would push the Ferrari neither very far nor very fast, it would reduce the chances of it ripping apart like a soggy bog roll every time it hits anything with a level of structural rigidity >= a toothpick.
"To them, equipment failure is terrifying. To me, it’s 'Tuesday.' "
-----------
Before posting, please check the documentation, the FAQ, the Search function and FOR THE LOVE OF GLUB the Request Guidelines.
-
- Regular Participant
- Posts: 155
- Joined: Fri May 20, 2011 8:17 am
- OLAP Product: Applix, Cognos TM1
- Version: Applix 9.0, Cognos TM1 9.5.1
- Excel Version: Excel 2010 2007
Re: TM1 server Message:system outofmemory"
Gentlemen,
The replies I have received from you have really been invaluable. Again, I should say thanks a lot to the OLAP forum team.
Please help here:
If I turn Performance Monitor on, will it have any impact on my application, or will I get any performance issues?
Because I have written a TI script to get the memory allocation of each cube in RAM, which I can get from }StatsByCube.
Please help.
Thanks in advance.
- Alan Kirk
- Site Admin
- Posts: 6610
- Joined: Sun May 11, 2008 2:30 am
- OLAP Product: TM1
- Version: PA2.0.9.18 Classic NO PAW!
- Excel Version: 2013 and Office 365
- Location: Sydney, Australia
- Contact:
Re: TM1 server Message:system outofmemory"
mincharug.shulft wrote:
If I turn Performance Monitor on, will it have any impact on my application, or will I get any performance issues?
Because I have written a TI script to get the memory allocation of each cube in RAM, which I can get from }StatsByCube.

Anything that you run on the server will use some resources (memory, CPU time and so on), but I normally run with Performance Monitor on and I've never seen it take up significant resources. It's never had any impact that was noticeable.
The only thing that you need to make sure of is that you turn logging OFF for the cubes }StatsByClient, }StatsByCube, }StatsByCubeByClient and }StatsForServer. (Right click on "Cubes", select "Security Assignments...", and uncheck "Logging" for those 4 cubes. I note that in version 10 they're still logged by default; well done Iboglix.) If you don't do that you can end up with transaction logs which readily and rapidly blow out to multi-gigabyte sizes, and which become useless for finding the transactions that you do want to trace.
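For what it's worth, the same logging flags can also be switched off in code rather than through Security Assignments. A hypothetical TurboIntegrator sketch (the process itself and where you run it are up to you; CubeSetLogChanges is the standard TI function for toggling a cube's transaction logging):

```
# Prolog of a hypothetical TI process: disable transaction logging
# for the four }Stats control cubes so tm1s.log does not blow out.
CubeSetLogChanges('}StatsByClient', 0);
CubeSetLogChanges('}StatsByCube', 0);
CubeSetLogChanges('}StatsByCubeByClient', 0);
CubeSetLogChanges('}StatsForServer', 0);
```

Note this changes the setting on the running server; the manual unchecking Alan describes achieves the same end through the UI.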
"To them, equipment failure is terrifying. To me, it’s 'Tuesday.' "
-----------
Before posting, please check the documentation, the FAQ, the Search function and FOR THE LOVE OF GLUB the Request Guidelines.
-
- Regular Participant
- Posts: 155
- Joined: Fri May 20, 2011 8:17 am
- OLAP Product: Applix, Cognos TM1
- Version: Applix 9.0, Cognos TM1 9.5.1
- Excel Version: Excel 2010 2007
Re: TM1 server Message:system outofmemory"
Thanks for your valuable reply.
Actually, there are a couple of processes that run on demand, and I have written a TI script, which will run once a week, to get the memory allocation in RAM for all the cubes.
Regarding re-ordering the dimensions: I started doing the re-ordering, and before re-ordering the size was 32KB (which I can see in the "current order of dimensions, memory used" figure). When I re-order the dimensions I get percentage changes of -0.011, sometimes -1.0, and in some places -4.45. I stopped the re-order at -4.45 and clicked OK. When I logged in again, the "current order of dimensions, memory used" size is 63104KB. So has it increased the memory size rather than decreased it? Please help me to do this the correct way.
- qml
- MVP
- Posts: 1095
- Joined: Mon Feb 01, 2010 1:01 pm
- OLAP Product: TM1 / Planning Analytics
- Version: 2.0.9 and all previous
- Excel Version: 2007 - 2016
- Location: London, UK, Europe
Re: TM1 server Message:system outofmemory"
mincharug.shulft wrote:
Before re-ordering the size was 32KB, and after re-ordering the dimensions the "current order of dimensions, memory used" figure is 63104KB. So has it increased the memory size rather than decreased it?

Firstly, you are reordering a cube that has hardly any data in it. 32KB is nothing. If you want any real gain from reordering, do it with big cubes. Secondly, the memory usage will increase after reordering if you don't restart the server, as it will keep both the memory allocated with the old order and with the new order. This is why you can see memory usage grow by ~95% for that cube. Logging off is not enough to return the previously allocated memory to the OS; TM1 doesn't work that way.
Kamil Arendt