🚀 Jump-Start Metadata-Driven DW and Analytics in Fabric
Accelerate your journey to Microsoft Fabric with a proven metadata-driven approach to Data Warehouse and Analytics engineering.
This immersive bootcamp combines architecture deep dives, hands-on labs, and real-world templates so your team can hit the ground running with Fabric—with patterns already used in production.
Ideal for teams with backgrounds in SQL Server, SSIS, ADF, or Synapse who want to modernise quickly and adopt Fabric using industrial-strength practices.
Skip months of trial and error.
Your team will leave with a fully operational metadata-driven DWA (Data Warehouse and Analytics) framework running in Fabric—complete with templates, pipelines, and automation patterns you can adapt immediately.
Designed for experienced data teams:
- Data Architects
- Data Engineers
- BI & DW Developers
- Teams starting a Fabric rollout
- Organisations wanting a structured, scalable DW delivery model
This course is often the launchpad for organizations partnering with Prodata to accelerate their Fabric implementation.
📘 Training Overview
- 📅 Duration: 2 Days (full-time)
- 🕘 Time: 9:00 AM – 5:00 PM
- 📍 Location: In person, or remote via Teams
- 💲 Fee: €6,000 for 4 attendees, +€1,000 per additional attendee
- 🧑‍🏫 Trainer: Bob Duffy, Microsoft MVP and Principal Data Architect
- ✉️ Contact: info@prodata.ie
📚 Course Curriculum
- 01 DWA Architecture
Gain a comprehensive overview of the DWA Framework and its application to Fabric. Dive into internals and lean architecture choices such as LH vs DW options, workspace taxonomy, and deployment options.
- 02 DWA Installation
Step through going from zero artefacts to a ready-to-develop DWA Framework with LH, DW, meta database, artefacts, and sample data. Lab to install the DWA.
- 03 AdventureWorks ETL
Walk through a fully working enterprise example of metadata-driven DW ETL using ERP data based on AdventureWorks, with supplied Notebook and SQL templates. Lab to explore the solution and run it end to end.
- 04 Adding Pipelines
Learn how to build and add your own bespoke templates that conform to and integrate with the DWA Framework. Lab to add new pipelines and explore L1 to L4 concepts.
- 05 Adding Notebooks
Learn how to build and integrate PySpark and Python notebooks into the DWA Framework. Lab to add new notebooks and take them from unit test to metadata-driven automation.
- 06 Implementing CDC with SQL Data Sources
Workshop exploring how to incrementally bring data into your LH using CDC. Lab to use the DWA Framework to bring in additional databases without any coding, using standards like open mirroring and custom CDC.
- 07 Creating Transforms
Learn the different types of transforms that can be integrated into your DWA Framework. Lab to create new transforms using T-SQL and DuckDB.
- 08 Loading Star Schemas
Learn how the DWA Framework automates star schema loading using dynamic T-SQL templates for many industry-standard patterns. Lab to add new data sources into a star schema.
- 09 AI-Driven Document Intelligence
Automate AI and Document Intelligence to extract data from unstructured documents, then operationalize and scale it with the DWA.
- 10 LLM and Gen AI
Build, secure, develop, and integrate all flavours of LLM models into the DWA. Lab to step through examples in OpenAI and Azure with the Chat and Completions APIs.
- 11 Fabric Data Agents
Create your own Fabric Data Agent on top of Semantic Models.
- 12 CI/CD and DevOps
Workshops and labs to get started deploying artefacts from your DEV workspace to UAT and PRD. Also covers using logs to create operational dashboards.
- 13 Semantic Model Refresh
Workshops and sample templates for API-driven refresh within your pipelines.
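To give a flavour of the metadata-driven pattern the curriculum centres on: each source feed is described by a metadata row, and the extract SQL is generated from that row, so onboarding a new source becomes a configuration change rather than new code. The sketch below is purely illustrative—the class, table, and column names are assumptions, not the actual DWA Framework schema.

```python
# Minimal sketch of metadata-driven extract generation.
# All names (FeedMeta, erp.*, lh.*) are hypothetical examples,
# not the real DWA Framework metadata model.
from dataclasses import dataclass
from typing import Optional, List


@dataclass
class FeedMeta:
    source_table: str
    target_table: str
    key_columns: List[str]
    load_type: str                       # "full" or "incremental"
    watermark_column: Optional[str] = None


def build_extract_sql(meta: FeedMeta, last_watermark: Optional[str] = None) -> str:
    """Generate the extract statement for one feed from its metadata row."""
    sql = f"SELECT * FROM {meta.source_table}"
    if meta.load_type == "incremental" and meta.watermark_column and last_watermark:
        # Incremental feeds only pull rows changed since the last run
        sql += f" WHERE {meta.watermark_column} > '{last_watermark}'"
    return sql


# Two example metadata rows: a full load and a watermark-based incremental load
feeds = [
    FeedMeta("erp.Customer", "lh.dim_customer", ["CustomerID"], "full"),
    FeedMeta("erp.SalesOrder", "lh.fact_sales", ["OrderID"], "incremental", "ModifiedDate"),
]

for feed in feeds:
    print(build_extract_sql(feed, last_watermark="2024-01-01"))
```

Adding a third database to the lakehouse then means inserting more rows like these into the meta database; the same template code runs unchanged.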
👥 Who Should Attend
- Data Architects moving to metadata-driven Fabric architectures
- Data Engineers building modern ingestion, ETL, and automation patterns
- DW Developers bringing SQL skills into Fabric’s SaaS environment
- Teams building structured, repeatable Fabric DW solutions
- Anyone seeking to adopt a metadata-driven, scalable, automated DWA Framework
