I'm thinking I may be asking too much from Excel. I'm really using it as a direct-access file, and while I've been avoiding databases, maybe I have no alternative but to embrace them.
To avoid using databases, maybe I could reduce the Excel file size somehow (any way to make it single precision?) or do everything in memory while keeping the data in a binary file (easy in FORTRAN, but obtuse in Delphi).
Here are some details of the .XLS file:
This .XLS file is used to collect output from a Delphi program that makes 3,000 write-then-save loops. It has no formatting, but it does have cell formulas.
The content of the XLS file has
500 sheets (identical but with different numbers)
Each sheet has
200 rows
14 columns
Of the 14 columns, 6 of them have formulas (Average, Min, Max, Stdev, etc.) which process the last 6 columns.
That is a grand total of 500 * 200 * 14 = 1,400,000 numbers
Estimating size based just on double precision at 8 bytes (64 bits) per number, we have 1,400,000 * 8 bytes = 11.2 MB
The Number of Cells with Formulas = 500 * 200 * 6 = 600,000
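If it helps anyone checking my numbers, the arithmetic above can be verified in a few lines (sketched in Python purely as a sanity check; the constants are the sheet/row/column counts from this post):

```python
# Sanity check of the cell-count and size estimates above.
sheets, rows, cols = 500, 200, 14
formula_cols = 6

total_cells = sheets * rows * cols        # all numeric cells
raw_bytes = total_cells * 8               # 8 bytes per IEEE-754 double
formula_cells = sheets * rows * formula_cols

print(total_cells)    # 1400000
print(raw_bytes)      # 11200000 bytes, i.e. 11.2 MB
print(formula_cells)  # 600000
```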
After going through these numbers, I'm not surprised it is 34 MB in size, and frankly there may be nothing that can be done about that, except perhaps the single-precision idea, which is probably impossible.
The issue really isn't the size, it's the time it takes to save, which must be done on each pass of the automating program. If I open this file myself in XL, it opens pretty quickly. Saving takes much longer. But saving from within the Delphi program seems to take many times longer still.
Maybe adding memory to the PC might help. I confess I'm running XP with 2 GB. (With 2 GB I should be able to hold all these numbers in memory, which suggests a stored binary file, loaded and saved on each pass, would be feasible.)
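To give a feel for the flat-binary-file idea, here is a sketch, in Python rather than Delphi purely for illustration (in Delphi the equivalent would be an untyped or typed file of Double written with BlockWrite and read with BlockRead). The file name, the single-block layout, and the dummy data are all assumptions for the example:

```python
# Sketch: keep all 1,400,000 numbers in one flat binary file
# instead of a .XLS workbook. Layout and file name are assumptions.
from array import array
import os
import tempfile

SHEETS, ROWS, COLS = 500, 200, 14
N = SHEETS * ROWS * COLS

# Dummy values standing in for the real program output.
data = array('d', (0.0 for _ in range(N)))   # 'd' = 8-byte double

path = os.path.join(tempfile.gettempdir(), 'results.dat')

# Save: one sequential write of the whole block.
with open(path, 'wb') as f:
    data.tofile(f)

# Load: one sequential read back into memory.
loaded = array('d')
with open(path, 'rb') as f:
    loaded.fromfile(f, N)

assert loaded == data
print(os.path.getsize(path))  # 11200000 bytes, matching the estimate

# Using 'f' (4-byte single precision) instead of 'd' would halve
# the file to 5.6 MB, at the cost of ~7 significant digits vs ~15.
```

A sequential read or write of an 11 MB block like this is typically a fraction of a second, which is the kind of speedup one would hope for over re-saving a 34 MB workbook on every pass; the 600,000 formula cells would then have to be recomputed in the Delphi program instead of by Excel.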
Suggestions?
ANY would be Appreciated SO MUCH!
Tom