Why Microsoft Fabric Probably Isn’t Worth It for Most.

Microsoft Fabric was announced in May 2023 with a lot of fanfare (and confusion), promising to transform the Power BI experience and consolidate a number of Azure data engineering, data science, and business intelligence functions into a single cohesive experience infused with cutting-edge AI.

We’ve been using the product since its release, so it seemed like a good time to reflect on what Fabric does well, what it doesn’t, and who it may be a good choice for.

What is Microsoft Fabric?

Microsoft Fabric isn’t so much an entirely new product as a new way of working with existing Microsoft technologies. It brings together all of the pieces required to go from various data sources within an organization to a single data lakehouse or warehouse, and then makes that data available for reporting in Power BI or for analysis with AI and machine learning.

Historically, each of these pieces lived in a different part of the Microsoft ecosystem. Data engineers would use tools like Data Factory to build pipelines in Azure, where they would also deploy something like an Azure Synapse Analytics workspace to import and transform data and make it available to a wider audience of users and analysts.

Analysts would then log into PowerBI.com, connect to a data lake or database established by the data engineering group, and build their reports.

Microsoft Fabric changes this dynamic by consolidating the data engineering and analyst functions into a single experience, which makes it easier to navigate and simplifies billing, since you no longer pay for each part of the process separately. The screenshot below shows some of the functions that are now available in a PowerBI.com-style interface.

Example of available solutions bundled into Microsoft Fabric

Tasks range from building data pipelines and deploying data warehouses to building Power BI reports.

The Good Parts of Microsoft Fabric

There are a number of benefits to Microsoft Fabric, most of them focused on ease of use and on reducing the workload of data engineering and IT teams, who would typically spend a significant part of their time deploying various assets and managing access control. This is an area where Microsoft Fabric really shines.

Deploying Resources is Easy

Having the ability to deploy various data assets quickly and efficiently, without having to create new Azure resource groups, is a decent time saver. Many of Fabric’s features save time on the setup and administration side, and this is one of its biggest strengths.

Managing Access with Entra ID and Workspaces

Access is controlled through a workspace model; if you’re familiar with Power BI workspaces, Fabric’s will feel familiar. The difference is that instead of only being able to create new Power BI reports in your workspace, you can also deploy a data lakehouse or warehouse.

Data Lakes are Effortless

Microsoft Fabric runs on a common data architecture based on the data lake concept. Structured and unstructured data can be stored in OneLake, the “OneDrive for data.” OneLake is comparable to Azure Blob Storage, an inexpensive bulk data storage solution for the cloud, and the relevant compute solution then sits on top of that storage.

One Lake data architecture
What is Microsoft Fabric – Microsoft Fabric | Microsoft Learn

By separating file storage in OneLake from the compute solution, Microsoft offers customers an all-in-one platform similar to Snowflake or Google BigQuery, with the added benefit of having storage, compute, and reporting in one place.
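To make the storage model concrete: OneLake exposes an ADLS Gen2-compatible endpoint, so any tool that understands abfss:// paths can address files in a lakehouse directly. The sketch below just builds such a path with the standard library; the workspace and lakehouse names are hypothetical placeholders, not anything from this article.

```python
# Sketch: building a OneLake (ABFS) path for a file in a lakehouse.
# "SalesWorkspace" and "SalesLakehouse" are hypothetical names; swap in your own.
# OneLake exposes an ADLS Gen2-compatible endpoint, so tools that speak
# abfss:// (Spark, azure-storage-file-datalake, etc.) can read these paths.

ONELAKE_HOST = "onelake.dfs.fabric.microsoft.com"

def onelake_path(workspace: str, lakehouse: str, relative_path: str) -> str:
    """Return an abfss:// URI for a file under a lakehouse's Files area."""
    return f"abfss://{workspace}@{ONELAKE_HOST}/{lakehouse}.Lakehouse/Files/{relative_path}"

print(onelake_path("SalesWorkspace", "SalesLakehouse", "raw/orders.parquet"))
```

Because the path format is shared across engines, the same file can be read by Spark, the SQL endpoint, or Power BI without copying data between services.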

Fabric is Constantly Adding Features

It’s clear that Microsoft has grand ambitions for Fabric. They’ve been aggressively adding new features on a monthly or sometimes faster cadence. You can keep track of what’s new by following the Microsoft Fabric Blog | Data Analytics News and Updates or by looking at the most recent Fabric roadmap and release plan, which Microsoft updates semi-annually for most of its products.

The Bad Parts of Microsoft Fabric

Even with all of Microsoft Fabric’s great features and the vision of bringing multiple products under one roof, there’s still a lot of room for improvement. Many of these issues may be resolved in the future, but as it stands they are worth being aware of before you invest a significant amount of time and money into the platform.

Microsoft Asks for a Lot of Trust

Recent Microsoft product releases have felt more like paid beta tests than fully developed software, and Fabric is no exception. There are countless buttons you would expect to exist that simply don’t, and odd limitations that have yet to be addressed. You can get a feel for some of the missing features by viewing the Fabric Idea and Feature Request Forum.

Microsoft touts auto-optimization as a “feature” that saves people from having to configure various data warehouse or data lake settings. On the surface it sounds like a great idea, until you realize there is no way to manually update any of the common settings. It’s one thing to offer auto-optimization; it’s another to not give people the ability to change settings at all.

Fabric is a work in progress. This may be the number one takeaway from our first year of using it. While some of the missing features have been added in updates since the original release, many have not, and there’s no guarantee they will be released in the near future. Even Power BI, which was released in 2015, is only now adding features that have been popular requests for years.

Performance Isn’t Great

Another issue with Microsoft Fabric is that it’s slow, at least in the mid-tier SKU we’ve tested. Whether you’re connecting to a SQL endpoint or using Dataflows Gen2, performance is on the slow side. Dataflows in particular can be almost unusable for larger datasets, or slower than importing the same amount of data into Power BI Desktop and modifying it with Power Query.

Fortunately, Fabric is easily scalable, but that comes at a cost (see Microsoft Fabric – Pricing | Microsoft Azure). Make sure to test not only the existence of features, but also the speed at which they operate. If you’re integrating data across cloud services already in the same Azure region, performance may be better than if you’re working with external sources.
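One low-effort way to run that speed test is a small timing harness you can point at any engine you’re evaluating. The sketch below uses an in-memory SQLite database as a stand-in so it runs anywhere; in practice you would pass it a connection to your Fabric SQL endpoint (for example, one opened with pyodbc) and compare the numbers against your current warehouse.

```python
# Sketch: a tiny timing harness for comparing query latency across engines.
# Uses an in-memory sqlite3 database as a stand-in; in practice you'd pass
# a connection to your Fabric SQL endpoint (e.g. opened via pyodbc) instead.
import sqlite3
import time

def time_query(conn, sql: str, runs: int = 5) -> float:
    """Return the average wall-clock seconds to execute and fetch `sql`."""
    total = 0.0
    for _ in range(runs):
        start = time.perf_counter()
        conn.execute(sql).fetchall()  # fetch everything so I/O is included
        total += time.perf_counter() - start
    return total / runs

# Stand-in data: 10,000 rows in a local table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", [(i, i * 1.5) for i in range(10_000)])

avg = time_query(conn, "SELECT SUM(amount) FROM sales")
print(f"average query time: {avg:.6f}s")
```

Averaging over several runs matters here: the first query against a warm-up-heavy engine (and Fabric SQL endpoints are no exception) is often much slower than steady-state queries.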

Most Companies Don’t Need Data Lakes

One of the biggest buzzwords in data these days is the data lake. It seems like every IT director wants to deploy one as a default solution, even when they work with relatively small amounts of data. To us, this is one of the biggest problems with Fabric. It’s neat that the architecture uses Delta Parquet and blob storage, but from an analytic end user’s perspective it has zero benefit over a structured SQL database.

Those with large amounts of data who can take advantage of data lake features may be better served by a more mature solution like Databricks (The Data and AI Company — Databricks). Companies without large amounts of data may find more traditional data warehouse solutions like Microsoft SQL Server or PostgreSQL to be more cost-effective and usable.

People considering which solution is right for their organization will benefit from understanding the differences between the available technologies and taking an inventory of their actual current and near-term data needs. The following video from Seattle Data Guy breaks down some of the major differences between available technologies.

Non-Fabric data solutions also give people more flexibility than being locked into the Fabric environment. Even with SQL endpoints available to connect other systems to Fabric, it’s hit or miss how well they actually work, which is not what you want from core data infrastructure.
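If you do want to test those SQL endpoints yourself, they speak the standard SQL Server (TDS) protocol, so ordinary SQL Server drivers can connect. The sketch below builds an ODBC connection string with the standard library; the server value is a hypothetical placeholder (copy the real endpoint from your lakehouse or warehouse settings in the Fabric portal), and authentication goes through Entra ID.

```python
# Sketch: building an ODBC connection string for a Fabric SQL endpoint.
# The server value below is a hypothetical placeholder; copy the real one
# from your lakehouse/warehouse settings in the Fabric portal. Fabric SQL
# endpoints speak TDS, so standard SQL Server drivers work; authentication
# is via Entra ID (here, interactive browser sign-in).

def fabric_odbc_conn_str(server: str, database: str) -> str:
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};"
        f"Database={database};"
        "Authentication=ActiveDirectoryInteractive;"
        "Encrypt=yes;"
    )

conn_str = fabric_odbc_conn_str(
    "myworkspace.datawarehouse.fabric.microsoft.com",  # hypothetical endpoint
    "SalesLakehouse",
)
# In practice: import pyodbc; conn = pyodbc.connect(conn_str)
print(conn_str)
```

This is also a quick way to gauge the “hit or miss” factor: if a client tool can take a SQL Server ODBC connection string, it can at least attempt to talk to Fabric, and you can judge reliability and speed from there.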


Final Thoughts

Microsoft is putting a ton of resources into adding features to Microsoft Fabric, and a lot of marketing effort into making it a winner. Features are constantly being added, but at this point customers are being asked to trust that functionality which doesn’t exist today will be added in the future.

This is the biggest problem with Fabric. The vision is there, and it seems like a great idea in theory. In its current state, however, many companies will be better off with more mature data lake technologies like Databricks, or with more traditional SQL databases like Microsoft SQL Server or PostgreSQL.

It will be exciting to see how Fabric evolves over time. We have a feeling it will be a real contender for many organizations, especially those already using Power BI as their primary reporting tool. However, those organizations may be limiting themselves if the market evolves and they want to use non-Microsoft solutions.
