Hey guys,

Sorry if this has gone in the wrong place, but I'm at a loss. No amount of reading up about VB is helping me work anything out, so I beg of you to help if possible.

What I want to do is pretty much take ALL the race data from greyhound-data.com and normalise everything to between 0 and 1, except maybe the date and stadium. I'd need the "comment", "wintm" and "etime" columns removed, each stadium given its own ID that stays consistent across all dogs' races for Excel, and each dog's races marked with its URL ID (the ##### in the link below) in the first column.
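
For the stadium IDs, I'm picturing something along these lines (just a sketch of the idea, not tested; the function name is made up and I've assumed IDs get handed out in order of first appearance):

    ' Rough idea: give each stadium a numeric ID the first time it is seen,
    ' and return the same ID every time after that.
    ' The dictionary would need to live for the whole import.
    Function StadiumID(dict As Object, stadiumName As String) As Long
        If Not dict.Exists(stadiumName) Then
            dict.Add stadiumName, dict.Count + 1
        End If
        StadiumID = dict(stadiumName)
    End Function

    ' Usage would be something like:
    ' Dim dict As Object
    ' Set dict = CreateObject("Scripting.Dictionary")
    ' someId = StadiumID(dict, "Romford")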

This sounds like a pretty big task for a script/macro, which is why I'm stumped.

The URLs themselves appear to be easily parseable: greyhound-data.com/d?l=##### where ##### is the dog's ID number. So really I'd want the macro to download each results table by incrementing that ID, and to follow the extra pages where a dog's results run to 2, 3, 4, 5 or 6 pages (the link changes for each page).
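
Something like this is what I have in my head for fetching one dog's page (just a rough sketch, not tested; I don't know what gets added to the link for pages 2 to 6, so that part is missing, and the exact address/protocol might need tweaking):

    ' Rough sketch: fetch the results page for one dog ID.
    ' Pulling the race table out of the returned HTML is the part
    ' I can't work out.
    Function FetchDogPage(dogId As Long) As String
        Dim http As Object
        Set http = CreateObject("MSXML2.XMLHTTP")
        http.Open "GET", "http://www.greyhound-data.com/d?l=" & dogId, False
        http.send
        If http.Status = 200 Then
            FetchDogPage = http.responseText
        End If
        ' For dogs with 2-6 pages of results, the extra page parameter
        ' would need appending to the URL here - I don't know what it is.
    End Function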

I would be happy if each dog's data went in a different sheet, but can Excel support 170,000+ sheets in a workbook?

And then there's the normalising of all the data, i.e. take the maximum of a column's data across ALL dogs/sheets, i.e. the best time *ever*, with something like =MAX(A1:whatever), and divide each cell by that maximum, so that across all dogs and all races every bit of data is between 0 and 1 (for input into MATLAB).
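
What I mean by the dividing is something like this (again only a sketch; I've assumed the times sit in column C from row 2 down on every sheet, which might not be how the import ends up):

    ' Rough idea: find the biggest value in column C across every sheet,
    ' then divide every value in that column by it, so it all ends up 0-1.
    Sub NormaliseColumnC()
        Dim ws As Worksheet
        Dim cell As Range
        Dim globalMax As Double
        Dim sheetMax As Double
        Dim lastRow As Long

        ' First pass: the best value ever, across all dogs/sheets
        For Each ws In ThisWorkbook.Worksheets
            sheetMax = Application.WorksheetFunction.Max(ws.Columns("C"))
            If sheetMax > globalMax Then globalMax = sheetMax
        Next ws

        If globalMax = 0 Then Exit Sub

        ' Second pass: divide every numeric cell by that maximum
        For Each ws In ThisWorkbook.Worksheets
            lastRow = ws.Cells(ws.Rows.Count, "C").End(xlUp).Row
            For Each cell In ws.Range("C2:C" & lastRow)
                If IsNumeric(cell.Value) And Not IsEmpty(cell.Value) Then
                    cell.Value = cell.Value / globalMax
                End If
            Next cell
        Next ws
    End Sub

Or maybe a plain formula like =A1/MAX(FirstSheet:LastSheet!C:C) would do the same job (sheet names made up), if 3D references cope with that many sheets.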


I know this is a lot to ask from a noob to this forum, but I'd appreciate it if anyone could help (or, even better, write something up). It's so much work for my little brain, and it's failed.


Please help!!!

Roy