I have written a macro module to help users import .csv files into empty tables ready for further analysis. These .csv files always contain three columns of data, but the number of rows varies greatly and can run into the thousands. Previously we copied and pasted values, but this is notoriously slow, so we have switched to assigning directly with "Range.Value =" (shown in green below).
This has sped things up massively, but I am struggling to size the receiving table to match the number of rows in the incoming data set. As a workaround I manually resize the table, using a Resize call, to a value I know is larger than the number of rows in the data set (shown in blue below, set in this example to 100 rows). Any unused rows then just show #N/A. Not pretty!
I could write some code to remove the #N/A rows afterwards, but I am sure there must be a neater way to do this from the off. Can anyone let me know how to code this so that each table matches the size of its imported data set? I have attached a stripped-down version of the macro file along with a test .csv file so you can try it out. Thanks in advance for any help.
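In case it helps others reading the thread, here is a minimal sketch of one way to do it: read the .csv into a Variant array first, take the row count from the array, and resize the ListObject to exactly that many rows before writing. The workbook path, sheet name, and table name below ("C:\Data\test.csv", "Sheet1", "Table1") are placeholders, and I am assuming the .csv has no header row, so adjust to suit your own file.

```vb
Sub ImportCsvToTable()
    Dim data As Variant
    Dim nRows As Long
    Dim tbl As ListObject

    ' Open the .csv as a temporary workbook and pull its used range
    ' into an array in one shot (fast, no copy/paste).
    With Workbooks.Open("C:\Data\test.csv")   ' placeholder path
        data = .Worksheets(1).UsedRange.Value
        .Close SaveChanges:=False
    End With

    nRows = UBound(data, 1)   ' row count taken from the data itself

    Set tbl = ThisWorkbook.Worksheets("Sheet1").ListObjects("Table1")

    ' Resize the table to exactly nRows data rows x 3 columns.
    ' The +1 accounts for the table's header row.
    tbl.Resize tbl.Range.Resize(nRows + 1, 3)

    ' Write the whole array into the table body in one assignment.
    tbl.DataBodyRange.Value = data
End Sub
```

Because the table is resized before the single "Range.Value =" assignment, there are never any surplus rows to show #N/A, so no clean-up pass is needed.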