Hi,
Does anyone know of any VBA macros that can crawl from a starting URL through x levels of linked URLs and save all available PDFs to a chosen folder?
A long time ago I used professional software to do this, either to recreate a whole website or just to download any specific file types I wanted.
Does anyone know of something similar in Excel VBA?
The steps would be:
1) User inputs a starting URL (e.g. https://www.google.com/search?btnG=1...&q=excel%2Bpdf)
2) The crawler would find every URL on that page, enter each one on a sheet, and download all files of a chosen type (e.g. PDFs) from that page
3) Each retrieved URL would then become a new starting URL, and step 2 would be repeated
4) The process would continue for x cycles or until there are no more URLs to scrape (rough sketches of both pieces below).
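
Even just the "save a PDF to a folder" piece would help me get started. Here's my rough, untested guess at what that part might look like, using the URLDownloadToFile API from urlmon (the function name SavePdf, the crude file-name logic, and the folder handling are just placeholders of mine, not working code):

Code:
' Rough sketch of the "save one file" piece only. URLDownloadToFile is
' a real urlmon API; the rest is just my guess at the shape.
#If VBA7 Then
    Private Declare PtrSafe Function URLDownloadToFile Lib "urlmon" _
        Alias "URLDownloadToFileA" (ByVal pCaller As LongPtr, _
        ByVal szURL As String, ByVal szFileName As String, _
        ByVal dwReserved As Long, ByVal lpfnCB As LongPtr) As Long
#Else
    Private Declare Function URLDownloadToFile Lib "urlmon" _
        Alias "URLDownloadToFileA" (ByVal pCaller As Long, _
        ByVal szURL As String, ByVal szFileName As String, _
        ByVal dwReserved As Long, ByVal lpfnCB As Long) As Long
#End If

' Saves one file from url into saveFolder; returns True on success.
Public Function SavePdf(ByVal url As String, ByVal saveFolder As String) As Boolean
    Dim fileName As String
    ' Take the last path segment of the URL as the local file name.
    fileName = Mid$(url, InStrRev(url, "/") + 1)
    SavePdf = (URLDownloadToFile(0, url, saveFolder & "\" & fileName, 0, 0) = 0)
End Function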
So, for example, if I told it to download only PDFs from abousetta.com (which isn't my site, just to be clear), the program would eventually run out of webpages and stop running.
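
And in case it makes the idea clearer, here's a very rough, untested sketch of the crawl loop itself (steps 1-4): breadth-first with a depth limit and a visited list so it eventually stops, per step 4. It uses late binding, a naive regex for links, and skips relative URLs entirely; every name in it is just my placeholder, and it calls the SavePdf sketch above:

Code:
' Very rough sketch of the crawl loop (steps 1-4), not a finished crawler.
Public Sub CrawlForPdfs(ByVal startUrl As String, ByVal maxDepth As Long, _
                        ByVal saveFolder As String)
    Dim http As Object, regEx As Object, matches As Object, m As Object
    Dim visited As Object, queue As Collection, ws As Worksheet
    Dim item As Variant, url As String, link As String
    Dim depth As Long, r As Long

    Set http = CreateObject("MSXML2.XMLHTTP.6.0")      ' fetches each page
    Set regEx = CreateObject("VBScript.RegExp")        ' naive link scraper
    Set visited = CreateObject("Scripting.Dictionary") ' URLs already seen
    Set queue = New Collection
    Set ws = ThisWorkbook.Worksheets(1)                ' step 2: log URLs here

    regEx.Global = True
    regEx.IgnoreCase = True
    regEx.Pattern = "href\s*=\s*[""']([^""']+)[""']"   ' misses relative links etc.

    queue.Add Array(startUrl, 0)
    r = 1

    Do While queue.Count > 0
        item = queue(1): queue.Remove 1
        url = item(0): depth = item(1)
        If Not visited.Exists(url) Then
            visited.Add url, True
            On Error Resume Next
            http.Open "GET", url, False
            http.send
            If Err.Number = 0 Then
                If http.Status = 200 Then
                    ws.Cells(r, 1).Value = url         ' enter the URL on a sheet
                    r = r + 1
                    Set matches = regEx.Execute(http.responseText)
                    For Each m In matches
                        link = m.SubMatches(0)
                        If LCase$(Left$(link, 4)) = "http" Then   ' skip relative links
                            If LCase$(Right$(link, 4)) = ".pdf" Then
                                SavePdf link, saveFolder          ' step 2: grab PDFs
                            ElseIf depth < maxDepth Then
                                queue.Add Array(link, depth + 1)  ' step 3: recurse
                            End If
                        End If
                    Next m
                End If
            End If
            Err.Clear
            On Error GoTo 0
        End If
    Loop
End Sub

Something like CrawlForPdfs "http://www.abousetta.com", 3, "C:\PDFs" is the kind of call I have in mind (the folder path is just an example).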
Any thoughts?
abousetta
P.S. Even if you only know how to handle parts of this problem, that would be helpful. Thanks.