Running out of memory in R package


Running out of memory in R package

Peter Spangler
While running an association analysis with SPSSINC APRIORI on a large dataset, I receive an error message saying I have reached the 2GB memory maximum. I have amended the R shortcut by adding -max-mem-size=6GB after the closing quotation mark under R -> Properties -> Target. This does not seem to solve the memory problem. Any suggestions?
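For reference, the Windows build of R spells this flag with two leading dashes (--max-mem-size), and the same cap can also be inspected and raised from inside R. A minimal sketch (note that memory.limit() is Windows-only and became a non-functional stub as of R 4.2):

```r
# Report the current per-process memory cap in MB
# (Windows R before 4.2; elsewhere this warns and returns Inf)
memory.limit()

# Ask for a 6 GB cap (the size argument is in MB);
# returns the resulting limit if the OS allows it
memory.limit(size = 6144)
```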

Re: Running out of memory in R package

Jon K Peck
If you are running 64-bit Statistics and 64-bit R, you can address more memory than with the 32-bit versions, but R still cannot allocate a single vector larger than a limit governed by the size of a 32-bit integer. I don't know how big a problem it can solve, but you may have to sample your data in order to get results. I don't know whether raising the minimum criteria (e.g., minimum support) would help, but it's easy to try.
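The sampling suggestion can be sketched in base R as follows (a hedged illustration: mydata, its column names, and the 200-row sample size are placeholders, and the commented-out call assumes the arules package's standard apriori() interface, which the SPSSINC extension builds on):

```r
set.seed(42)  # reproducible draw

# Toy stand-in for the large dataset (hypothetical; in practice the data
# would come from the active SPSS dataset)
mydata <- data.frame(item1 = sample(letters[1:5], 1000, replace = TRUE),
                     item2 = sample(letters[1:5], 1000, replace = TRUE))

n_keep  <- 200                                    # placeholder sample size
sampled <- mydata[sample(nrow(mydata), n_keep), ]  # draw rows without replacement

# Then run the association analysis on the sample, possibly with stricter
# minimum support/confidence, e.g.:
# library(arules)
# rules <- apriori(as(sampled, "transactions"),
#                  parameter = list(support = 0.05, confidence = 0.6))
```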


Jon Peck (no "h") aka Kim
Senior Software Engineer, IBM
[hidden email]
phone: 720-342-5621




From:        Peter Spangler <[hidden email]>
To:        [hidden email],
Date:        08/07/2013 06:52 PM
Subject:        [SPSSX-L] Running out of memory in R package
Sent by:        "SPSSX(r) Discussion" <[hidden email]>