Hi MS Support & Community,
For a number of large datasets in our primary Power BI workspaces, we have seen errors over the last couple of days related to insufficient memory in Premium workspaces. How can we check memory usage against the memory limit on the capacity? The Capacity Metrics tool does not appear to monitor memory usage; only CU% is monitored, and we're well below that limit on the capacity.
Ensure you have Large dataset storage format turned on for the dataset @sirahcy, and see this article: Large datasets in Power BI Premium - Power BI | Microsoft Learn. It specifically mentions that a 12GB dataset in a P1 capacity might cause issues, but a 9GB one may too, depending on what else is loaded at the time.
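If you'd rather script it than click through the dataset settings, the storage mode is also exposed through the Power BI REST API. A minimal sketch, assuming an Update Dataset In Group call (the IDs and token below are placeholders you would supply, and you should verify the required permission scopes against the current API docs):

```python
# Minimal sketch: switch a dataset to large dataset storage format via the
# Power BI REST API (Datasets - Update Dataset In Group).
import requests

ACCESS_TOKEN = "<AAD access token with Dataset.ReadWrite.All>"  # placeholder
GROUP_ID = "<workspace id>"    # placeholder
DATASET_ID = "<dataset id>"    # placeholder

url = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"
resp = requests.patch(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"targetStorageMode": "PremiumFiles"},  # large dataset storage format
)
resp.raise_for_status()  # 200 OK means the storage mode change was accepted
```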
Are you using incremental refresh? The entire dataset does not need to be loaded during refresh if you are using incremental refresh on the large fact table(s); only the recent partitions that are due to refresh get processed. See the sketch below for what a partition-scoped refresh call looks like.
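For example, the enhanced refresh (XMLA-based) REST endpoint lets you name the specific table partitions to process, so peak memory is roughly the resident model plus the refreshed partitions rather than two full copies of the model. A rough sketch, with hypothetical table/partition names and placeholder IDs:

```python
# Minimal sketch: enhanced refresh of only selected partitions of a large
# fact table, instead of a full-model refresh.
import requests

ACCESS_TOKEN = "<AAD access token>"  # placeholder
GROUP_ID = "<workspace id>"          # placeholder
DATASET_ID = "<dataset id>"          # placeholder

url = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes"
body = {
    "type": "full",
    "commitMode": "transactional",
    # Only the objects listed here are processed during the refresh.
    "objects": [
        {"table": "FactSales", "partition": "FactSales-2024Q4"},  # hypothetical names
    ],
}
resp = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"}, json=body)
resp.raise_for_status()  # 202 Accepted: the refresh was queued
```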
Thank you @edhans. Yes, we have Large dataset storage format enabled and incremental refresh on the large fact tables. Some of the datasets have actually approached or exceeded 12GB, so we will definitely look into resizing the capacity. We're using incremental refresh (refreshing roughly 60% of a fact table), but we're not sure how much that saves in memory usage compared with the 2x dataset size needed for a full refresh.
I suspect it is the datasets approaching or exceeding that size that are causing the issue. It may be time to move to a P2 capacity, or to remove some history from the models.
We do have ad hoc memory failures on a P2 capacity as well. To check against the max memory per query (6GB), do you look at the I/O on the compute/cluster that processes the Power BI refresh, or at something else?
We tend to scale up the Fabric instance before a refresh, which gives us enough memory for multiple dataset refreshes, then scale back down afterwards. I don't understand why there are such low memory limits in the first place, though.
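For anyone wanting to automate that pattern, here is a rough sketch assuming an Azure-billed capacity managed through Azure Resource Manager. The resource provider path and api-version below are assumptions; verify both for your capacity type before using this:

```python
# Rough sketch of the scale-up-before-refresh pattern via ARM.
import requests
from azure.identity import DefaultAzureCredential

SUBSCRIPTION = "<subscription id>"   # placeholder
RESOURCE_GROUP = "<resource group>"  # placeholder
CAPACITY_NAME = "<capacity name>"    # placeholder
API_VERSION = "2023-11-01"           # assumption: check the current ARM api-version

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
url = (
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
    f"/resourceGroups/{RESOURCE_GROUP}"
    f"/providers/Microsoft.Fabric/capacities/{CAPACITY_NAME}"  # assumed provider path
)

def set_sku(sku_name: str) -> None:
    """PATCH the capacity to a new SKU, e.g. before/after a big refresh window."""
    resp = requests.patch(
        url,
        params={"api-version": API_VERSION},
        headers={"Authorization": f"Bearer {token}"},
        json={"sku": {"name": sku_name, "tier": "Fabric"}},
    )
    resp.raise_for_status()

set_sku("F128")  # scale up for the refresh window
# ... trigger refreshes and wait for them to complete ...
set_sku("F64")   # scale back down afterwards
```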
Thanks. Will discuss with admins to enable this setting.
How large is the dataset, and how large is the capacity? You typically need 2x the size of the dataset to refresh because, temporarily, the full dataset is loaded twice: once for the current copy being served, and once for the refreshed copy that is swapped in.
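For a concrete sense of the arithmetic (the numbers below are just the 9GB model and P1 figures from this thread, used illustratively):

```python
# Back-of-the-envelope check of the "2x during refresh" rule of thumb:
# a full refresh briefly holds the current copy plus the rebuilt copy in memory.
dataset_gb = 9.0          # in-memory model size
capacity_limit_gb = 25.0  # e.g. the P1 memory limit per the Premium docs

peak_refresh_gb = 2 * dataset_gb  # old copy + new copy during the swap
headroom_gb = capacity_limit_gb - peak_refresh_gb

print(f"Estimated peak during full refresh: {peak_refresh_gb:.0f} GB")
print(f"Headroom left for queries/other models: {headroom_gb:.0f} GB")
# ~18 GB peak leaves ~7 GB; other models loaded at the same time eat into this,
# which is why a 9 GB model can still fail on a P1 when the capacity is busy.
```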
Thanks. Our dataset size is 9GB. Looking at all the datasets refreshing around the same time, the total is about 21GB (dataset size x 2), which is still below the P1 capacity memory limit of 25GB. Or is that cutting it too close?