I have a macro that manipulates potentially large data files and can occasionally exhaust Excel's memory (there is a fair amount of traffic between worksheets and Variant arrays, as well as many worksheet formulas).
I am trying to find ways to detect and handle such cases on the fly. Specifically, once I get an "out of memory" error, I can't do anything other than restart Excel: if I stop the macro and run it again without closing Excel, the "out of memory" error comes even sooner than on the previous attempt.
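For context, here is a stripped-down sketch of the kind of pattern the macro relies on (sheet names and sizes are made up for illustration; the real macro is much larger):

```vba
Sub SketchOfPattern()
    ' Illustration only. Reading whole used ranges into Variant
    ' arrays like this is what eventually exhausts memory on big files.
    Dim src As Variant, result As Variant
    Dim i As Long

    ' Pull an entire sheet's used range into a Variant array
    src = Worksheets("Data").UsedRange.Value

    ReDim result(1 To UBound(src, 1), 1 To 1)
    For i = 1 To UBound(src, 1)
        result(i, 1) = src(i, 1)   ' some per-row transformation
    Next i

    ' Write the intermediate result to another sheet,
    ' which itself carries many worksheet formulas
    Worksheets("Work").Range("A1").Resize(UBound(result, 1), 1).Value = result
End Sub
```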
I am looking for a solution along the following lines:
- see which variables are actually in use at a given moment (in break mode), so I can decide whether any of them need to be cleared (I have many variables of various scopes, so checking them manually one by one is impractical);
- clear the memory after an "out of memory" error is raised, so I can recover from it dynamically and partition the data until each chunk is small enough to handle.
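To make the second point concrete, this is the kind of recovery loop I have in mind, sketched with hypothetical names (`ProcessChunk` and `chunkRows` are placeholders, and I don't know whether clearing variables like this actually returns enough memory to Excel to continue):

```vba
Sub ProcessWithRecovery()
    Dim chunkRows As Long
    chunkRows = 100000              ' initial partition size (arbitrary)

    Do
        On Error Resume Next
        ProcessChunk chunkRows      ' hypothetical worker routine
        If Err.Number = 7 Then      ' run-time error 7 = "Out of memory"
            Err.Clear
            On Error GoTo 0
            ' Attempt to release whatever I can before retrying:
            ' Erase bigArray            ' module-level Variant arrays
            ' Set someObject = Nothing  ' object references
            chunkRows = chunkRows \ 2   ' halve the partition and retry
            If chunkRows < 1 Then Exit Do
        Else
            On Error GoTo 0
            Exit Do                 ' succeeded, or failed some other way
        End If
    Loop
End Sub
```

Is something along these lines workable, or does the error leave the VBA runtime in a state where no in-process cleanup helps?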