Dataverse Storage Capacity Model
The Dataverse Storage Capacity Model defines how business data, documents, and system logs are stored and managed within the Power Platform. It is important because Dataverse storage is license-based and shared across the organization, so uncontrolled growth can impact both cost and system performance. By clearly separating data into database, file, and log storage, the model helps organizations store only active, high-value business data in Dataverse while moving documents and historical information to more cost-effective locations. This approach keeps applications fast, predictable, and scalable, while avoiding unexpected capacity issues and ensuring long-term sustainability of the platform.
We should consider the Dataverse Storage Capacity Model in the following scenarios, especially when designing or scaling Power Platform or Dynamics 365 solutions:
- When building enterprise or long-running applications
If the application will store large volumes of business data (such as cases, transactions, or customer records) over many years, capacity planning is essential to avoid performance and cost issues.
- When working with high-growth data
Scenarios like customer support systems, logging, approvals, or integrations generate data continuously. The capacity model helps control growth and plan retention.
- When attachments or documents are involved
If users upload files, emails, or images, the file capacity model becomes important to decide whether data should stay in Dataverse or be stored in SharePoint.
- When auditing, compliance, or monitoring is required
Enabling audit logs, plugin tracing, or automation increases log storage. Understanding the model helps balance compliance needs with storage limits.
- When multiple environments are used
Since storage is shared at the tenant level, organizations with Dev, UAT, and Production environments must plan capacity to prevent one environment from impacting others.
- When cost control and licensing predictability matter
The capacity model helps avoid unexpected storage purchases and supports informed budgeting decisions.
- When designing archival and data lifecycle strategies
It is critical when deciding what data should remain active in Dataverse and what should be archived to external storage for reporting or historical purposes.
Dataverse storage is governed by a capacity-based model that defines how much data you can store, which pool each type of data counts against, and how consumption is billed.
Microsoft separates storage into three independent capacity types:
- Database Capacity
- File Capacity
- Log Capacity
Each has different measurement units, purchase options, and consumption behavior.
Database Capacity
Purpose
Stores structured relational data required for business applications.
What is stored
- Standard tables (Account, Contact, Case, Opportunity, etc.)
- Custom tables
- Table rows and column values
- Relationships
- Indexes
- Metadata (tables, columns, relationships)
- Calculated / rollup fields (results)
Think of this as your Dataverse relational database.
Measurement
Measured in GB
Counted based on:
- Row data size
- Column data type size
- Index overhead
- Metadata overhead
Large text fields, rich text, and many indexes increase DB usage significantly.
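As a starting point for finding what drives database consumption, row counts per table are a useful proxy (exact per-table sizes appear in the Power Platform admin center under Capacity). Below is a minimal sketch using the Dataverse Web API's RetrieveTotalRecordCount function; the environment URL and bearer token are placeholders you would supply yourself.

```python
# Minimal sketch: check row counts for high-volume tables via the Dataverse
# Web API. ORG_URL and TOKEN are placeholders; acquire a real bearer token
# (e.g., via MSAL / Azure AD) before running.
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<bearer-token>"                       # assumed to be acquired already

headers = {"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"}

# RetrieveTotalRecordCount returns cached row counts for the named tables.
url = (f"{ORG_URL}/api/data/v9.2/"
       "RetrieveTotalRecordCount(EntityNames=['account','incident'])")
resp = requests.get(url, headers=headers)
resp.raise_for_status()

counts = resp.json()["EntityRecordCountCollection"]
# Keys and Values are parallel arrays: logical names and their row counts.
for table, rows in zip(counts["Keys"], counts["Values"]):
    print(f"{table}: {rows:,} rows")
```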
How Database Capacity is Purchased
Database capacity comes from:
- Base capacity included with the first qualifying Power Platform or Dynamics 365 subscription
- Additional capacity accrued with each eligible user license
- Capacity add-ons purchased separately
Capacity is pooled at the tenant level, not per environment.
How Database Capacity is Consumed
- Every row created or updated consumes DB capacity
- Custom tables consume more than standard tables
- Index-heavy tables increase consumption
- Deleting rows frees capacity (after system cleanup)
Governance Best Practices
- Avoid unnecessary custom columns
- Use appropriate data types
- Archive historical records (see the sketch after this list)
- Minimize indexes
- Clean up unused tables
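To illustrate the archiving practice above, here is a hedged sketch that deletes resolved cases older than a cutoff date, on the assumption that they have already been exported to external storage. The environment URL, token, and cutoff date are placeholders; incidents (Case) and its statecode column are standard Dataverse names.

```python
# Hedged sketch of "archive historical records": find resolved cases older
# than a cutoff and delete them, assuming they were already exported
# elsewhere. ORG_URL and TOKEN are placeholders.
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<bearer-token>"
headers = {"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"}

# statecode eq 1 = Resolved for the incident (Case) table.
query = ("incidents?$select=incidentid"
         "&$filter=statecode eq 1 and modifiedon lt 2022-01-01T00:00:00Z"
         "&$top=100")
rows = requests.get(f"{ORG_URL}/api/data/v9.2/{query}", headers=headers)
rows.raise_for_status()

for row in rows.json()["value"]:
    # Deleting rows frees database capacity after the system cleanup cycle.
    rid = row["incidentid"]
    requests.delete(f"{ORG_URL}/api/data/v9.2/incidents({rid})",
                    headers=headers).raise_for_status()
```

For large volumes, a server-side bulk delete job is the more scalable option, since it runs asynchronously on a schedule instead of deleting row by row.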
File Capacity
Purpose
Stores unstructured data and attachments.
What is stored
- Note (Annotation) attachments
- Email attachments (tracked in Dataverse)
- Images
- Files uploaded to file-type columns
Files are stored in Azure Blob Storage (Microsoft-managed).
Measurement
Measured in GB
- Consumption equals the actual binary size of each file
- Dataverse does not compress files, so large files consume their full size
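To see where file capacity is going, you can list the largest Note attachments, which typically dominate consumption. A minimal sketch, assuming a placeholder environment URL and token; filename, filesize, and isdocument are standard columns on the annotation (Note) table.

```python
# Minimal sketch: find the largest Note attachments via the Dataverse Web
# API. ORG_URL and TOKEN are placeholders.
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<bearer-token>"
headers = {"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"}

query = ("annotations?$select=filename,filesize,createdon"
         "&$filter=isdocument eq true"
         "&$orderby=filesize desc&$top=10")
resp = requests.get(f"{ORG_URL}/api/data/v9.2/{query}", headers=headers)
resp.raise_for_status()

for note in resp.json()["value"]:
    size_mb = (note["filesize"] or 0) / (1024 * 1024)
    print(f'{note["filename"]}: {size_mb:.1f} MB (created {note["createdon"]})')
```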
How File Capacity is Purchased
Like database capacity, file capacity is pooled at the tenant level and comes from base subscription entitlements, per-user license accruals, and optional capacity add-ons.
How File Capacity is Consumed
- Uploading a file consumes capacity immediately
- Deleting attachments releases capacity
- Email attachments tracked in Dataverse consume file capacity
SharePoint integration does NOT consume Dataverse file capacity
Governance Best Practices
- Enable SharePoint integration for documents
- Limit attachment size
- Avoid storing large media files
- Periodically clean old attachments (see the sketch below)
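The cleanup practice above can be scripted. This sketch deletes Note attachments older than a retention cutoff, assuming the underlying documents have been migrated to SharePoint or are no longer needed; the URL, token, and cutoff date are placeholders.

```python
# Hedged sketch: delete Note attachments older than a retention cutoff.
# Assumes the documents were migrated to SharePoint or are no longer needed.
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<bearer-token>"
headers = {"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"}

query = ("annotations?$select=annotationid"
         "&$filter=isdocument eq true and createdon lt 2021-01-01T00:00:00Z"
         "&$top=100")
resp = requests.get(f"{ORG_URL}/api/data/v9.2/{query}", headers=headers)
resp.raise_for_status()

for note in resp.json()["value"]:
    # Deleting an annotation releases its file capacity.
    nid = note["annotationid"]
    requests.delete(f"{ORG_URL}/api/data/v9.2/annotations({nid})",
                    headers=headers).raise_for_status()
```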
Log Capacity
Purpose
Stores system-generated operational data.
What is stored
- Audit logs
- Plugin trace logs
- Workflow execution history
- Power Automate run logs (Dataverse-related)
- System jobs
- Email processing logs
This data is mainly for diagnostics and compliance.
Measurement
Measured in GB
- Grows continuously
- Log data is append-only until purged
How Log Capacity is Purchased
Log capacity starts from a base tenant entitlement and can be extended with log capacity add-ons; like the other types, it is pooled at the tenant level.
How Log Capacity is Consumed
- Enabling auditing increases log usage
- Setting the plugin trace level to "All" increases consumption rapidly
- High automation = higher log growth
In practice, log capacity is the most commonly exhausted of the three types.
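Two controls help most in practice: keeping plug-in trace logging off outside of active debugging, and purging audit history past its retention window. Below is a hedged sketch of both via the Web API; the URL, token, and organization row ID are placeholders, and the DeleteAuditData action removes audit records created before the supplied EndDate.

```python
# Hedged sketch of two log-capacity controls: turn plug-in trace logging
# off, and purge audit data older than a cutoff. ORG_URL, TOKEN, and ORG_ID
# are placeholders; both calls require admin privileges.
import requests

ORG_URL = "https://yourorg.crm.dynamics.com"  # hypothetical environment URL
TOKEN = "<bearer-token>"
ORG_ID = "<organization-row-guid>"             # from the organization table
headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/json",
}

# plugintracelogsetting: 0 = Off, 1 = Exception, 2 = All.
requests.patch(
    f"{ORG_URL}/api/data/v9.2/organizations({ORG_ID})",
    headers=headers,
    json={"plugintracelogsetting": 0},
).raise_for_status()

# DeleteAuditData deletes audit records created before EndDate.
requests.post(
    f"{ORG_URL}/api/data/v9.2/DeleteAuditData",
    headers=headers,
    json={"EndDate": "2023-01-01T00:00:00Z"},
).raise_for_status()
```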
Summary:
The Dataverse Storage Capacity Model helps organizations understand how their data is stored and managed within the Power Platform. It clearly separates storage into database capacity for business records, file capacity for documents and attachments, and log capacity for system and audit activities. This model is important because storage is shared and license-based, so managing it properly helps keep applications fast, control costs, and avoid unexpected capacity issues. By using the right storage for the right type of data, organizations can ensure their systems remain efficient, scalable, and reliable over time.