Aggregate Measurements (Entity Store) in D365 F&O
This document gives a detailed rundown of using Aggregate Measurements and the Entity Store for analytics in Dynamics 365 Finance & Operations (F&O).
1. What It Is
The Entity Store is a separate, read-optimised database that lives alongside your main F&O transaction database. Its whole reason for being is to provide a high-performance data source for analytics and BI reporting. Microsoft calls this database the AxDW.
You don't put raw tables into the Entity Store. Instead, developers create special analytical models called Aggregate Measurements in Visual Studio. An Aggregate Measurement is basically a star schema—a central fact table (like "Sales Transactions") surrounded by related dimension tables (like "Customers," "Products," "Dates"). This pre-aggregated, denormalised structure is what makes analytical queries super fast. When you "refresh" the Entity Store, F&O takes these models, gathers the data from the live database, and populates the AxDW database.
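To make the star-schema idea concrete, here is a minimal sketch using Python's built-in sqlite3 module. The table and column names (FactSales, DimCustomer, and so on) are illustrative only; the real AxDW schema is generated from the Aggregate Measurement definition, not hand-written like this.

```python
import sqlite3

# Illustrative star schema: one central fact table surrounded by
# dimension tables. All names here are hypothetical, not the real AxDW.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE DimCustomer (CustomerKey INTEGER PRIMARY KEY, Region TEXT);
CREATE TABLE DimProduct  (ProductKey  INTEGER PRIMARY KEY, Category TEXT);
CREATE TABLE FactSales (
    CustomerKey INTEGER REFERENCES DimCustomer(CustomerKey),
    ProductKey  INTEGER REFERENCES DimProduct(ProductKey),
    SalesAmount REAL
);
INSERT INTO DimCustomer VALUES (1, 'EMEA'), (2, 'APAC');
INSERT INTO DimProduct  VALUES (10, 'Audio'), (11, 'Video');
INSERT INTO FactSales   VALUES (1, 10, 100.0), (1, 11, 250.0), (2, 10, 75.0);
""")

# A typical analytical query: slice a measure by a dimension attribute.
rows = con.execute("""
    SELECT c.Region, SUM(f.SalesAmount)
    FROM FactSales f
    JOIN DimCustomer c ON c.CustomerKey = f.CustomerKey
    GROUP BY c.Region
    ORDER BY c.Region
""").fetchall()
print(rows)  # [('APAC', 75.0), ('EMEA', 350.0)]
```

The same shape is what makes the Entity Store fast at scale: one wide fact table plus small dimension tables keeps these GROUP BY queries cheap.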
2. Architecture & Data Flow
The process is a scheduled, four-stage flow designed for performance:

- Development: A developer designs an Aggregate Measurement in Visual Studio. This involves choosing the key numbers (measures, like "Sales Amount") and the categories to slice them by (dimensions, like "Region" or "Product Category"). They then deploy this model to F&O.
- Refresh: An F&O administrator goes to the Entity Store page in the system and schedules a recurring refresh for the deployed measurements.
- Data Population: When the schedule kicks off, F&O runs a background process. It queries the live transactional database, performs all the necessary joins and calculations defined in the measurement, and pushes the resulting, aggregated dataset into the AxDW database. This database uses in-memory, column-store indexes, which are brilliant for BI queries.
- Reporting: BI tools, especially Power BI, connect directly to the AxDW (Entity Store). Because the data is already pre-aggregated and optimised, reports are extremely fast. This is the technology that powers most of the embedded Power BI dashboards you see in F&O workspaces.
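The Data Population stage can be roughly simulated with sqlite3. In reality this is a generated SQL Server process, and the names here (SalesLine, AggSalesByRegionMonth) are hypothetical stand-ins for a transactional table and an aggregate table.

```python
import sqlite3

# Two separate databases, as in the real architecture: the live
# transactional store and the read-optimised Entity Store (AxDW).
live = sqlite3.connect(":memory:")   # stands in for the F&O database
axdw = sqlite3.connect(":memory:")   # stands in for AxDW

live.executescript("""
CREATE TABLE SalesLine (Region TEXT, InvoiceDate TEXT, LineAmount REAL);
INSERT INTO SalesLine VALUES
    ('EMEA', '2024-01-05', 100.0),
    ('EMEA', '2024-01-20', 250.0),
    ('APAC', '2024-02-02',  75.0);
""")
axdw.execute(
    "CREATE TABLE AggSalesByRegionMonth (Region TEXT, Month TEXT, SalesAmount REAL)"
)

def refresh_entity_store():
    """Simulate one scheduled refresh: aggregate in the live DB,
    then replace the contents of the aggregate table in AxDW."""
    aggregated = live.execute("""
        SELECT Region, substr(InvoiceDate, 1, 7) AS Month, SUM(LineAmount)
        FROM SalesLine
        GROUP BY Region, Month
    """).fetchall()
    axdw.execute("DELETE FROM AggSalesByRegionMonth")  # full, not incremental
    axdw.executemany("INSERT INTO AggSalesByRegionMonth VALUES (?, ?, ?)", aggregated)

refresh_entity_store()
# Reports now read the small, pre-aggregated table, never the live DB.
report = axdw.execute(
    "SELECT Region, Month, SalesAmount FROM AggSalesByRegionMonth ORDER BY Region, Month"
).fetchall()
print(report)  # [('APAC', '2024-02', 75.0), ('EMEA', '2024-01', 350.0)]
```

Note the design point this illustrates: the expensive joins and sums happen once per refresh, so every subsequent report query is a cheap read against the pre-aggregated table.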
3. Prerequisites
- Developer Skills: Creating or modifying an Aggregate Measurement is a developer task. It requires X++ and SQL knowledge and access to Visual Studio; it is not something a business analyst can do.
- Power BI Pro Licences: If you are using Power BI to connect to the Entity Store, users who need to view the reports will typically need a Power BI Pro licence.
- An Understanding of BI Concepts: To design a good measurement, developers need to understand data warehousing concepts like star schemas, facts, and dimensions.
4. Pros (The Good Bits)
- Extremely High Performance: This is the fastest way to run analytics on F&O data. By connecting Power BI in "DirectQuery" mode to the Entity Store, you get lightning-fast reports on massive datasets without having to import and duplicate the data in Power BI.
- The Go-To for Embedded Analytics: It is Microsoft's recommended solution for embedding rich, interactive Power BI reports directly into F&O workspaces, providing users with analytics right where they work.
- Reduces Load on the Transactional Database: Because the heavy lifting of joining and aggregating data is done during the refresh process, the reporting queries from Power BI hit the separate AxDW database, protecting the performance of the main F&O system.
- Simplifies BI for End-Users: It provides a clean, simple, business-friendly data model for report builders, who don't need to understand the complex underlying F&O table structures.
5. Cons (The Not-So-Good Bits)
- Requires a Developer: This is a major hurdle. Any change, no matter how small, requires a developer to modify the model in Visual Studio, create a deployment package, and push it through a full software release cycle. It's not agile.
- Pre-Aggregated Data Only: You only get the data that was designed into the model. You can't drill down to the individual, raw transaction records (unless the developer specifically designed the model that way, which is rare). It's not a tool for general-purpose data exploration.
- Not Real-Time: The data is only as fresh as your last refresh. While you can schedule refreshes frequently, it's still a batch process, so there will always be some latency.
- Database Size and Cost: The AxDW database can grow quite large, especially if you have many complex measurements, adding to your overall database storage costs.
6. Key Implementation 'Gotchas' (Things to Watch Out For)

- DirectQuery vs. Import: When connecting Power BI, you have a choice. DirectQuery gives you near-real-time data (from the last refresh) and is best for embedded reports. Import copies the data into Power BI, which can be faster for disconnected analysis but means you have two copies of the data to manage and refresh.
- Entity Store Refresh is Key: Users often assume that hitting "Refresh" in Power BI gives them live data. It doesn't; it just re-queries the Entity Store. The data is only as fresh as the last scheduled Entity Store refresh from within F&O itself.
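The two refresh layers can be simulated in a few lines of Python (sqlite3, illustrative names only): a "Power BI refresh" only re-reads the store, so it keeps returning stale totals until the Entity Store refresh runs.

```python
import sqlite3

# Toy model of the two refresh layers. A "Power BI refresh" re-queries
# only the Entity Store snapshot; it cannot see the live database.
live = sqlite3.connect(":memory:")
store = sqlite3.connect(":memory:")
live.execute("CREATE TABLE Sales (Amount REAL)")
store.execute("CREATE TABLE AggSales (Total REAL)")

def entity_store_refresh():
    # The only step that moves data from the live DB into the store.
    total = live.execute("SELECT COALESCE(SUM(Amount), 0) FROM Sales").fetchone()[0]
    store.execute("DELETE FROM AggSales")
    store.execute("INSERT INTO AggSales VALUES (?)", (total,))

def power_bi_refresh():
    # Re-reads the store only; never touches the live database.
    return store.execute("SELECT Total FROM AggSales").fetchone()[0]

live.execute("INSERT INTO Sales VALUES (100.0)")
entity_store_refresh()
print(power_bi_refresh())                         # 100.0

live.execute("INSERT INTO Sales VALUES (50.0)")   # new transaction lands
stale = power_bi_refresh()                        # still 100.0
entity_store_refresh()
fresh = power_bi_refresh()                        # 150.0 only now
print(stale, fresh)
```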
- Don't Go Overboard: Be selective about which measurements you deploy and refresh. Each one adds to the refresh time and database size. If nobody is using a report, stop refreshing its underlying measurement.
- It's Not a Data Lake: The Entity Store is a structured, relational data mart. It's not a place to dump all your raw data, as you would with Azure Synapse Link.
7. When Should You Use It?
Aggregate Measurements and the Entity Store are the perfect choice for:

- Embedded Power BI Dashboards: This is the primary and most powerful use case: putting an interactive chart or KPI directly inside an F&O workspace.
- High-Performance, Summary-Level BI: When you need to build very fast Power BI reports for a large audience, based on aggregated data (e.g., "Sales by Month," "Revenue by Region").
- Creating a Simplified Data Source for Power Users: When you want to provide a clean, curated data model for your power users to build their own reports from, without exposing them to the complexity of the full F&O database.
It should NEVER be used when you need to analyse raw, transactional-level detail or when you need a flexible, general-purpose data export tool. It is a purpose-built BI acceleration layer, not a data dump.