The engineers did a good job on PowerPivot, but either there is room for improvement here or I have not fully learned the ropes yet.
I have a single CSV file that, while loading, racks up about 90 million rows and then crashes. I have 24 GB of RAM and can't figure out why this is happening.
So I tried another route: creating a smaller file first (splitting the source data roughly as in the sketch below) and then appending the remaining data to it. Unfortunately, the clipboard is not large enough to hold the data.
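In case it helps to see what I mean by splitting, here is a minimal Python sketch of how the source CSV could be broken into smaller files outside of Excel. The file names and the 10-million-row chunk size are just placeholders, not my exact script:

```python
# Rough sketch: split a large CSV into fixed-size chunks, repeating the
# header row in each chunk so every piece can be imported on its own.
import csv

ROWS_PER_CHUNK = 10_000_000          # assumed chunk size (placeholder)
SOURCE = "full_data.csv"             # placeholder input file name
PREFIX = "chunk"                     # output files: chunk_000.csv, chunk_001.csv, ...

with open(SOURCE, newline="") as src:
    reader = csv.reader(src)
    header = next(reader)            # keep the header for every chunk
    chunk_index, row_count, writer, out = 0, 0, None, None
    for row in reader:
        # Start a new output file when the current one is full (or on first row).
        if writer is None or row_count >= ROWS_PER_CHUNK:
            if out:
                out.close()
            out = open(f"{PREFIX}_{chunk_index:03d}.csv", "w", newline="")
            writer = csv.writer(out)
            writer.writerow(header)
            chunk_index += 1
            row_count = 0
        writer.writerow(row)
        row_count += 1
    if out:
        out.close()
```

Even with the data split like this, I still need a way to get the later chunks appended into the same PowerPivot table without going through the clipboard.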
My question is: how can I get this data loaded into PowerPivot... all 150 million rows of it?
My application is scientific in nature and requires all of this data. It is my understanding that the capacity of PowerPivot is only limited by available memory.