Hi
I have a macro that grabs data from a site (nothing new there). The problem is speed: the macro runs 105 web queries one after the other.
I have already tried to slim this down, so 105 is the smallest it can go.
Each query appears to take around 10 seconds, which puts the overall run time at about 15 minutes.
Is there a smarter way of doing this? Can I chop out unnecessary chunks of the code, or is there a better way of running the query?
This is the code I am using, or a snippet of it; there are 104 more like this:

Sub Update()
Dim PlayerId As String
Dim UNPId As String
PlayerId = "24241061987"
UNPId = "UNP-Blade"
With Worksheets(UNPId).QueryTables.Add(Connection:= _
"URL;https://www.novaworld.com/NWCommunities/charStats.aspx?id=" & PlayerId & "&productid=616065&playerCard=1" _
, Destination:=Sheets(UNPId).Range("B100"))
.Name = UNPId
.BackgroundQuery = True
.RefreshStyle = xlOverwriteCells
.SaveData = True
.WebFormatting = xlWebFormattingNone
.WebTables = "14"
.Refresh BackgroundQuery:=False
End With
End Sub
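For anyone hitting the same wall, here is a minimal sketch of one possible refactor: drive all the queries from a single loop over (sheet name, player id) pairs instead of 105 copied Subs, and refresh with BackgroundQuery:=True so each query downloads asynchronously rather than blocking the loop. The two arrays and the Sub name UpdateAll are hypothetical placeholders, not from the original post.

```vba
' Sketch only: assumes one sheet per player, named after the UNP id.
Sub UpdateAll()
    Dim PlayerIds As Variant, UNPIds As Variant
    Dim i As Long

    ' Hypothetical placeholder arrays -- fill in all 105 real values.
    PlayerIds = Array("24241061987", "11111111111")
    UNPIds = Array("UNP-Blade", "UNP-Other")

    ' Turning these off while the loop runs avoids needless repaints
    ' and recalculation between queries.
    Application.ScreenUpdating = False
    Application.Calculation = xlCalculationManual

    For i = LBound(PlayerIds) To UBound(PlayerIds)
        With Worksheets(UNPIds(i)).QueryTables.Add( _
                Connection:="URL;https://www.novaworld.com/NWCommunities/charStats.aspx?id=" _
                            & PlayerIds(i) & "&productid=616065&playerCard=1", _
                Destination:=Worksheets(UNPIds(i)).Range("B100"))
            .Name = UNPIds(i)
            .BackgroundQuery = True
            .RefreshStyle = xlOverwriteCells
            .SaveData = True
            .WebFormatting = xlWebFormattingNone
            .WebTables = "14"
            ' BackgroundQuery:=True returns immediately instead of waiting
            ' for the download, so the next query can start while this one
            ' is still fetching.
            .Refresh BackgroundQuery:=True
        End With
    Next i

    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
End Sub
```

Note the trade-off: with background refreshes the Sub finishes before the data has landed, so anything that depends on the results needs to run after the refreshes complete (for example, from a later step rather than immediately after the loop).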
Thanks in advance for any guidance.
R