Dealing with a Vector Memory Exhausted Error in R

27 April 2020
By Tanvi Mehta

When working with relatively large data in RStudio, it’s not uncommon to run into a vector memory exhausted error. The threshold for this depends on a number of things, including the size of your data and the amount of RAM available on your computer. However you encounter this error, it can be a frustrating experience, so here are some solutions that might address the problem:

Error: vector memory exhausted (limit reached?)

1. Find a Computer with More RAM

At Carleton, lab computers have 16GB of RAM; if that is more than your personal computer has, switching to a lab machine can be an easy fix.

2. Use Parallel Computing

While most computers today have multiple cores, R defaults to using only a single core. It’s possible to parallelize work across multiple cores using an R function called mclapply. The mclapply function is very similar to the more commonly used lapply, but runs the calls in parallel. An example call could be:

mclapply(data_vector, my_function, mc.cores = detectCores())

This call would apply the specified function to each element of the provided vector, but utilize all available cores to do so. In order to use this function, you will need to load the parallel package (it ships with base R, so no separate installation is required):

library(parallel)

Jones, Matt, 2017. Quick Intro to Parallel Computing in R.
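Putting the pieces above together, here is a minimal runnable sketch; the data and the squaring function are made up purely for illustration, standing in for whatever expensive per-element computation you actually need:

```r
library(parallel)  # ships with base R; provides mclapply() and detectCores()

# Illustrative inputs: squaring numbers stands in for any expensive
# per-element computation you might apply to a large vector.
data_vector <- 1:8
square <- function(x) x^2

# Apply the function to each element, spread across all available cores.
# Caveat: on Windows, mclapply() only supports mc.cores = 1; the
# cross-platform alternative is parLapply() with a cluster object.
result <- mclapply(data_vector, square, mc.cores = detectCores())

unlist(result)  # 1 4 9 16 25 36 49 64
```

Note that mclapply returns a list (like lapply), so unlist is used here to flatten the result back into an ordinary vector.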

3. Change Environment Variable R_MAX_VSIZE

The environment variable R_MAX_VSIZE can be used to specify the maximal vector heap size. On macOS, this can be adjusted relatively easily. First, open terminal. Then, run the following commands:

cd ~

touch .Renviron

open .Renviron

Then add a line of the form R_MAX_VSIZE=__GB as the first line of .Renviron, filling in an appropriate number of gigabytes, and save the file.

The right value, and whether this solves the problem at all, depends on how much physical memory your machine has; but if R’s default limit is below what your hardware can actually provide, raising R_MAX_VSIZE lets R use the rest.
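As a concrete sketch, the .Renviron entry might look like the following; the 16GB figure is purely an illustrative value, not a recommendation, so pick one suited to your machine’s RAM:

```
# ~/.Renviron -- read by R at startup, so restart R/RStudio after editing
R_MAX_VSIZE=16Gb
```

Because .Renviron is only read at startup, the new limit takes effect after you restart R. You can confirm the variable was picked up by running Sys.getenv("R_MAX_VSIZE") in the restarted session.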

These fixes won’t resolve every instance of the often-complicated vector memory exhausted error, but they are simple first steps worth trying.