Hi Excel-community,
a friend and I are currently working on a dashboard. The data set is very large, which is why she loaded the data using Power Query. If I understand correctly, doing so compresses the data set (please correct me if I am wrong), so it should not be a problem to build a dashboard with about one million rows.
When I started building the dashboard (which mainly involves creating simple pivot tables in a backlog sheet and, based on those, using slicers to create diagrams in a different sheet), the file became slower and slower. I just checked the file size (184 MB), which I think is too big. My concrete questions are the following:
Is it normal for a 184 MB file to be this slow (slow = selecting a different filter takes 40 seconds)?
Is it possible that the file became so large because of all the filters I inserted, or should that not be a problem? (The data set is just one table and is not connected to other add-ins.)
What is most problematic: creating more and more pivot tables (using the same data set), using slicers (= filters), or including diagrams based on the pivot tables and slicers?
How much should the data be compressed after loading it? (The original data is 110 MB.)
What would you suggest as a next step?
I would be very grateful for any tips & tricks; please don't hesitate to ask questions if I haven't explained things well enough.
Best regards, Olivia
A dashboard has already been built for parts of the data; I wanted to build one big dashboard including all of the data. Since we loaded the data with Power Pivot, we expected it to be faster than before (for the previous dashboards we did not use Power Pivot/Power Query), but it is just as slow as before.