felm regression with many dummies: speed and memory issues

I am running a fixed-effects regression with felm that includes interaction terms. I have fixed effects in the projected-out part of the formula (after the |), and categorical fixed effects inside the interaction term. Because the dataset is very large and the interacted region factor has so many levels (roughly 2000 distinct regions), the regression is extremely slow. Previously, I received the error: Error: vector memory exhausted (limit reached?)

I was able to fix that by increasing the memory limit, and I could then view my regression with all of the roughly 2000 dummies. But when I try to run it again, the code runs indefinitely and never finishes. My regression is the following:

fe_reg <- felm(
  outcome ~ (dummy_control1 + dummy_control2 + factor(categorical_10levels) +
               factor(region_2000levels) + x_dummytreatment) * interaction |
    country + year,
  data = df
)
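If the roughly 2000 region coefficients (and their interaction terms) are not themselves of interest, one way to make this tractable is to let felm project them out in the fixed-effects part of the formula rather than estimate them as explicit dummies. Below is a minimal sketch, not a confirmed fix: it assumes lfe's support for factor terms and factor:covariate terms in the projection part, that region_2000levels is stored as a factor, and it reuses the column names from the question:

library(lfe)

# Absorb the high-dimensional region effects instead of estimating ~2000 dummies.
# region_2000levels:interaction projects out a separate coefficient on the
# moderator for each region; drop that term if you need those coefficients
# reported in the usual summary output.
fe_reg_fast <- felm(
  outcome ~ (dummy_control1 + dummy_control2 + factor(categorical_10levels) +
               x_dummytreatment) * interaction |
    country + year + region_2000levels + region_2000levels:interaction,
  data = df
)

The projected-out effects can in principle be recovered afterwards with getfe(fe_reg_fast); the point is that the fit itself no longer builds a model matrix with thousands of dummy columns, which is what exhausts time and memory.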

How can I speed up the regression so that felm actually finishes? Also, to change the memory limit I had to use a temporary fix in the R environment that only worked once and required restarting R each time. How can I permanently increase the vector memory limit so I do not have to redo this on every restart? I am a Mac user, so the usual memory.limit() command does not work.
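On macOS the relevant cap is the R_MAX_VSIZE environment variable (memory.limit() was Windows-only), and the persistent fix is to set it in the user-level ~/.Renviron file, which R reads at every startup. A minimal sketch; the 100Gb value is illustrative and should be chosen to fit your machine's RAM plus swap:

# Append the setting to ~/.Renviron so every future session picks it up.
cat("R_MAX_VSIZE=100Gb\n", file = "~/.Renviron", append = TRUE)

# Restart R once, then confirm the new limit is in effect:
Sys.getenv("R_MAX_VSIZE")

After this one-time edit, no further restarts are needed; the limit is applied automatically whenever R starts.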

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
