Microsoft Fabric - Ingesting 5GB into a Bronze Lakehouse using Data Factory
In this instalment of Ed Freeman's Microsoft Fabric end-to-end demo series, he shows how to use Data Factory to ingest roughly 5GB of data from an unauthenticated HTTP source into OneLake. Along the way, he explores the differences between Tables and Files in a Fabric Lakehouse and previews the ingested data in the Lakehouse Explorer. The walkthrough covers the Data Factory UI layout, Copy data activity options, configuration of the source and destination, dynamic content for the destination Lakehouse file path, and additional settings, and also looks at an alternative parameterised pipeline, pipeline run details, and Lakehouse Files. If you're interested in learning more about Fabric, check out the introductory content in the related links below.
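As a rough illustration of the "dynamic content" step mentioned above, a date-partitioned destination file path can be built with Data Factory's expression language, which provides `concat`, `formatDateTime`, and `utcnow`. The folder and file names below are hypothetical, not necessarily the ones used in the video:

```json
{
  "fileName": {
    "value": "@concat('bronze/', formatDateTime(utcnow(), 'yyyy/MM/dd'), '/source-data.csv')",
    "type": "Expression"
  }
}
```

Resolved at run time, an expression like this lands each pipeline run's output in a dated subfolder under the Lakehouse Files area, which keeps repeated Bronze-layer loads from overwriting one another.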
Related posts
Microsoft Fabric Machine Learning Tutorial - Part 2 - Data Validation with Great Expectations
This tutorial delves into the intricacies of data validation in the realm of Microsoft Fabric and Great Expectations. It demonstrates how a da...
Storage Utilization in Fabric OneLake | Monitor Your Lakehouse Storage
In this blog post, you will learn how to check the amount of storage being used by your Fabric OneLake in Microsoft Fabric. The author address...
Incrementally loading files from SharePoint to Azure Data Lake using Data Factory
If you're looking to enhance your data platform with useful information stored in files like Excel, MS Access, and CSV that are usually kept i...
OneLake: Microsoft Fabric’s Ultimate Data Lake
Microsoft Fabric's OneLake is the ultimate solution to revolutionizing how your organization manages and analyzes data. Serving as your OneDri...
Working with tables in Microsoft Fabric Lakehouse – Everything you need to know!
The Microsoft Fabric lakehouse provides a powerful platform for working with tables. However, as with any platform, there are various nuances ...
Microsoft Fabric - Creating a OneLake Shortcut to ADLS Gen2
In this informative video, learn how to use Shortcuts in Microsoft Fabric to aid zero-copy referencing of data between OneLake and clouds whil...
Data Modeling for Mere Mortals – Part 3: All we need is a Data Lakehouse?!
In this article, the concept of a data lakehouse is explored, and the question arises - is it the solution that meets all the needs for data m...
What is Databricks Lakehouse and why you should care
Databricks has been making waves in the industry, and it's important to understand its impact on the world of data. At its core, Databricks pr...
Dealing with ParquetInvalidColumnName error in Azure Data Factory
Azure Data Factory and Integrated Pipelines within the Synapse Analytics suite are powerful tools for orchestrating data extraction. It is a c...