Hello all,
I am working on an academic research project that studies how quickly gas diffuses into soil. To give a very simplified picture, gas is injected into a sealed chamber that is partially submerged in the soil, and the concentration inside the chamber is then measured repeatedly over time. The faster the concentration drops, the higher the soil's diffusivity rate. Unfortunately, the relationship is not linear; it follows this rather difficult formula:
C/C0 = EXP(D*T/H^2)*ERFC(D*T/H^2)
Where:
C is Concentration at Time T
C0 is initial concentration (at T = 0)
D is the diffusivity rate
H is the height of the chamber
EXP() is the natural exponential function and
ERFC() is the complementary error function.
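To show how that sits on the worksheet, each row of the SF6 sheet is set up roughly along these lines (the cell addresses other than E5 and H6 are just for illustration, not the exact ones in the attachment; times are in column A, measured C/C0 ratios in column B, the chamber height H in E4, and the guess for D in E5):

    C2:  =EXP($E$5*A2/$E$4^2)*ERFC($E$5*A2/$E$4^2)     predicted C/C0 for the time in A2
    D2:  =(B2-C2)^2                                     squared error for that reading
    H6:  =SUM(D2:D20)                                   sum of squared errors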
Using Solver, it is not all that difficult to solve for the most likely value of the diffusivity rate for given values of C, C0, T, and H. This is what I have set up (with real data) in the "SF6" attachment: I told Solver to minimize the cell that holds the sum of squared errors (H6) by changing the value of the diffusivity rate (E5).
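For what it is worth, my understanding is that this manual Solver run boils down to something like the following VBA (just a sketch; it assumes the Solver add-in reference has been ticked under Tools > References in the VBA editor so the Solver functions can be called from code):

    Sub FitCurrentChamber()
        ' Reproduce the manual Solver run on the SF6 sheet:
        ' minimize the sum-of-squared-errors cell (H6) by changing the
        ' diffusivity rate cell (E5).
        Worksheets("SF6").Activate
        SolverReset
        SolverOk SetCell:="$H$6", MaxMinVal:=2, ByChange:="$E$5"
        SolverSolve UserFinish:=True    ' True suppresses the results dialog
    End Sub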
The problem is that I need to do this several hundred times, so I need a way of automating the process, and I am having trouble visualizing how I would do it. What the data looks like when I pull it out of the database is shown in the "RawData" attachment. I am trying to come up with a diffusivity rate for each of the chambers (only a small subset is shown). How would I take the data shown in the "RawData" attachment, rearrange it so it looks like the "SF6" attachment, use Solver to come up with an optimized solution, and then repeat that several hundred times?
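Just to make the question concrete, the kind of loop I am picturing is sketched below. The layout it assumes is purely hypothetical: a chamber ID in column A of RawData with time and C/C0 readings in columns B and C, the SF6 sheet taking its times and ratios in columns A and B, and a "Results" sheet to collect the fitted values. None of those details necessarily match my files; it is just meant to show the shape of the automation I am after.

    Sub FitAllChambers()
        ' Sketch only: loop over every chamber in RawData, copy its readings
        ' into the SF6 fitting layout, rerun Solver, and record the fitted D.
        Dim raw As Worksheet, fit As Worksheet, res As Worksheet
        Dim r As Long, outRow As Long, n As Long
        Dim chamber As String

        Set raw = Worksheets("RawData")
        Set fit = Worksheets("SF6")
        Set res = Worksheets("Results")

        r = 2          ' first data row on RawData (row 1 assumed to be headers)
        outRow = 2     ' first output row on Results

        Do While raw.Cells(r, 1).Value <> ""
            chamber = raw.Cells(r, 1).Value

            ' Copy this chamber's readings into the fitting layout
            fit.Range("A2:B1000").ClearContents
            n = 0
            Do While raw.Cells(r, 1).Value = chamber
                n = n + 1
                fit.Cells(n + 1, 1).Value = raw.Cells(r, 2).Value   ' time T
                fit.Cells(n + 1, 2).Value = raw.Cells(r, 3).Value   ' measured C/C0
                r = r + 1
            Loop

            ' Reset the starting guess for D (arbitrary value) and rerun Solver
            fit.Range("E5").Value = 0.001
            FitCurrentChamber

            ' Record the fitted diffusivity rate for this chamber
            res.Cells(outRow, 1).Value = chamber
            res.Cells(outRow, 2).Value = fit.Range("E5").Value
            outRow = outRow + 1
        Loop
    End Sub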
Any help is greatly appreciated.
-Evan