This forum has been very helpful in my work and I finally decided to register to ask for help.
I have an incoming real-time stream of data from a platform. However, the platform only provides 300 rows of data at a time, and as newer rows arrive, the older ones are deleted. I need to save the past rows as well to get a better understanding of the data.
If anyone has run into a similar problem before, please share how you solved it. What I did was write a macro that runs every second: it copies the 300 rows of data, pastes them below the bottom row of my archive sheet, and then deletes the duplicates. This works for now, but the duplicate-removal step sometimes deletes essential data too, because some legitimate rows are identical to earlier ones.
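To make the problem concrete, here is a minimal sketch (in Python rather than VBA) of the append-then-deduplicate idea, assuming each row carries a unique key such as a timestamp or sequence number. Deduplicating on that key instead of on the whole row would avoid deleting legitimately repeated values; the function and column names here are illustrative, not from my actual workbook.

```python
def merge_batch(history, batch, key=lambda row: row[0]):
    """Append the latest 300-row batch to the saved history,
    dropping only rows whose key is already present."""
    seen = {key(row) for row in history}  # keys already archived
    for row in batch:
        if key(row) not in seen:
            history.append(row)
            seen.add(key(row))
    return history

# Example: row (2, "b") overlaps the previous batch and is skipped,
# even though its value "b" also appears elsewhere.
history = [(1, "a"), (2, "b")]
batch = [(2, "b"), (3, "b")]
merge_batch(history, batch)
# history is now [(1, "a"), (2, "b"), (3, "b")]
```

The same logic could be done in VBA by checking a key column before pasting, instead of running Remove Duplicates over all columns afterwards.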
Is there a way to have the data constantly appended so that I end up with one continuous stream instead of only the latest 300 rows?
Thank you!