R started using 20 GB of memory spontaneously
I have been "setting up" a new project in R, i.e. copying bits of code together, I have run almost nothing and got an error when opening the project today:
WARNING: Failed to restore workspace from 'mywd/RData'
Reason: cannot allocate vector of size 172 Kb
Can anyone explain where this is coming from? I've tried:
rm(list=ls(all=T))
gc()
which reduced the memory usage to 17.88 GB, but that still seems excessive...
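To see what is actually holding the memory, it can help to list the objects in the global environment sorted by size before clearing anything. This is a minimal sketch; `big_demo` is an illustrative object standing in for whatever the restored workspace contained:

```r
# Illustrative object standing in for restored workspace data (~8 MB).
big_demo <- numeric(1e6)

# Size in bytes of every object in the global environment, largest first.
obj_sizes <- sapply(ls(envir = globalenv()),
                    function(x) object.size(get(x, envir = globalenv())))
sort(obj_sizes, decreasing = TRUE)

# Memory R itself reports as used, after a garbage collection.
gc()
```

If nothing large shows up here but the process is still huge, the memory is being held by R's allocator or by loaded packages rather than by your objects, which is why closing and reopening the session releases it.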
Update
I closed RStudio and reopened it, and my memory usage (without any loaded libraries or data) had dropped to 858 MB. I'm still confused about why this happened and what to do if it happens again.
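The startup warning suggests RStudio was trying to restore a saved workspace file, which can silently reload gigabytes of old objects into a "fresh" session. A hedged way to check for such a file and avoid the automatic restore (the path here is just your project directory):

```r
# Check whether a saved workspace file exists in the project directory
# and how large it is (path is illustrative).
f <- file.path(getwd(), ".RData")
if (file.exists(f)) {
  cat("Workspace file size (bytes):", file.size(f), "\n")
}

# To avoid writing a workspace file in the first place, quit without saving:
# quit(save = "no")
#
# In RStudio, the equivalent settings are under
# Tools > Global Options > General:
#   uncheck "Restore .RData into workspace at startup", and
#   set "Save workspace to .RData on exit" to "Never".
```

Deleting or renaming a stale workspace file also stops the restore on the next launch, at the cost of losing whatever objects it held.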
Update 2
Packages used... and yes, I need all of them.
tables
library(tidyverse)
library(janitor)
library(tidyr)
library(reshape2)
library(readr)
library(data.table)
library(plyr)
taxonomy
library(Taxonstand)
database
library(DBI)
library(odbc)
library(dbplyr)
library(knitr)
spatial
library(sp)
library(raster)
library(rgdal)
library(maptools)
library(rgeos)
library(rangeBuilder)
library(geosphere)
library(sf)
library(vegan)
graphics
library(ggplot2)
library(RColorBrewer)
library(broom)
library(gridExtra)
library(ggmcmc)
library(ggforce)
library(Hmisc)
library(cowplot)
library(ggpmisc)
library(gtable)
library(egg)
Bayes
library(R2WinBUGS)
library(runjags)
library(R2jags)
library(rjags)
library(dclone)
Modelling
library(psych)
library(betareg)
library(coda)
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow