RAM usage in production job/mode


RAM usage in production job/mode

Michael Müller

Hello Everyone,

I have a problem with the RAM usage of SPSS and would like to know if anyone has a solution.


Problem:

RAM usage increases over time and is never freed again until SPSS crashes (more RAM helps, but in this case we cannot choose or change the hardware).

Details:

I use a production job / SPSS 19 or SPSS 20 / Windows 7 Ultimate 64 / 4 GB RAM.

RAM usage (stats.com grows until the crash):

Java.exe              304 K
Javaw.exe           3,264 K
Spssengine.exe    100,932 K
Stats.exe         143,904 K
Stats.com       3,212,340 K

 

DATASET CLOSE and OUTPUT CLOSE are used throughout, but they seem to have no effect on this problem. The memory is only freed when the job ends or SPSS crashes.

I would be thankful for any ideas. Just tell me if you need further information.

 

Regards

Michael


Re: RAM usage in production job/mode

Jon K Peck
You should not have stats.com and stats.exe both running, but the real issue here, given the small amount of memory used by the backend (spssengine) process, is likely the Viewer, which is part of the stats.exe process.  The Viewer contents live in memory, so eventually enough output will exhaust the available resources.  You can control and recover this with the OUTPUT commands, in particular OUTPUT SAVE and OUTPUT CLOSE, which let you move a large volume of output out of memory.
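A minimal sketch of that pattern in syntax (the file name and the position in the job are placeholders; insert after each large block of output):

```spss
* Flush the accumulated Viewer contents to disk, close that document
* to release its memory, and open a fresh Viewer for subsequent output.
OUTPUT SAVE OUTFILE='C:\temp\partial_results.spv'.
OUTPUT CLOSE *.
OUTPUT NEW.
```

Repeating this periodically keeps the in-memory Viewer small for the duration of the job.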

In addition, SPSS 20 has the light tables option (see Edit > Options > Pivot Tables), which simplifies the tables, resulting in lower memory usage.  Later versions of Statistics handle memory and speed issues better.

You can take this to an extreme by using Python external mode programmability, in which case there is no Viewer and output is captured using OMS instead.
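A sketch of that external-mode approach (this requires the SPSS-Python integration plug-in; the paths, dataset, and procedure are placeholders, not a definitive recipe):

```python
# External mode: a plain Python script drives the Statistics backend
# directly, so no Viewer (and no Viewer memory) is involved.
# Output is routed to a file via OMS instead.
import spss

spss.Submit("""
OMS /SELECT TABLES
    /DESTINATION FORMAT=OXML OUTFILE='C:/temp/results.xml'.
GET FILE='C:/data/mydata.sav'.
DESCRIPTIVES VARIABLES=ALL.
OMSEND.
""")
spss.StopSPSS()
```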



Jon Peck (no "h") aka Kim
Senior Software Engineer, IBM
[hidden email]
phone: 720-342-5621




From:        Michael Müller <[hidden email]>
To:        [hidden email],
Date:        12/05/2013 01:20 PM
Subject:        [SPSSX-L] RAM usage in production job/mode
Sent by:        "SPSSX(r) Discussion" <[hidden email]>





Re: RAM usage in production job/mode

Michael Müller

Could you explain the difference between "stats.exe" and "stats.com"? How can I choose which of the two processes runs? "stats.com" only appears in the Task Manager when I run production jobs (with SPSS still open).

 

I already use OUTPUT CLOSE, but I cannot see an effect. In normal syntax (without Production mode) the OUTPUT commands clear memory fine, but not in this case.

 

Which light tables option do you mean? Column Width, Table Rendering, and Default Editing Mode are the only settings I have for Pivot Tables.

 

Regards

Michael





Re: RAM usage in production job/mode

Jon K Peck
It doesn't matter which you use: if you start Statistics from the menu or icon, it starts stats.exe; if you run a production job from the client, it launches stats.com.  Both of these are small, at least initially.

Regarding OUTPUT CLOSE, what is "this case"?  OUTPUT CLOSE closes the Viewer window, and that should release the memory it is using.  The Task Manager, though, is not a reliable way to inspect what is happening in the frontend (stats) process, because that process is written in Java, and Java uses its own memory management, which the Task Manager cannot see.  While a little memory might not be reclaimed on a close, most of it becomes available for reuse by Java or is sometimes returned to the OS.

Statistics 20 has a setting on the Pivot Tables tab, "Render as legacy tables".  That is better left unchecked unless you need to pass output to older versions of Statistics.  Statistics 19 offers a choice between "Render tables with all features" and "Render tables faster"; "faster" uses less memory but does not support some editing features.  In V20, all tables are "faster" but support full editing.

You can adjust Java memory usage by editing the file jvmcfg.ini in your Statistics installation directory.  Toward the bottom you can see settings for heap sizes for the JVM.  You might find it helpful to adjust these, but save a copy of the original.
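For orientation, JVM heap limits are normally controlled with the standard -Xms (initial heap) and -Xmx (maximum heap) options; the exact lines and defaults in jvmcfg.ini vary by Statistics version, so the fragment below is only illustrative, not the literal file contents:

```
-Xms256m
-Xmx1024m
```

Lowering -Xmx caps how much the frontend's Java side can consume; raising it gives the Viewer more headroom before resources run out.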



From:        Michael Müller <[hidden email]>
To:        [hidden email],
Date:        12/06/2013 11:34 AM
Subject:        Re: [SPSSX-L] RAM usage in production job/mode
Sent by:        "SPSSX(r) Discussion" <[hidden email]>



