This project demonstrates an end-to-end ELT (Extract, Load, Transform) pipeline that uses Azure Storage for raw data ingestion, Snowflake as the cloud data warehouse, dbt for data transformation and modeling, and Apache Airflow for orchestration.
Below is the high-level architecture of the pipeline:
- For the datasets, please visit the MovieLens website
Queries to set up the Snowflake warehouse, role, user, database, schema, and tables, and to copy the data from Azure Storage into Snowflake (a sketch is shown below)
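The following is a minimal sketch of those queries, not the project's exact scripts: the warehouse, role, user, database, schema, stage, and table names (MOVIELENS_WH, TRANSFORM_ROLE, DBT_USER, MOVIELENS, RAW, AZURE_STAGE, RAW_MOVIES), the Azure container URL, and the SAS token are all assumed placeholders to adapt to your environment.

```sql
-- Hypothetical names and credentials throughout; adjust to your environment.
USE ROLE ACCOUNTADMIN;

CREATE WAREHOUSE IF NOT EXISTS MOVIELENS_WH WITH WAREHOUSE_SIZE = 'XSMALL';
CREATE ROLE IF NOT EXISTS TRANSFORM_ROLE;
CREATE USER IF NOT EXISTS DBT_USER
  PASSWORD = '<strong-password>'
  DEFAULT_ROLE = TRANSFORM_ROLE
  DEFAULT_WAREHOUSE = MOVIELENS_WH;
GRANT ROLE TRANSFORM_ROLE TO USER DBT_USER;
GRANT USAGE ON WAREHOUSE MOVIELENS_WH TO ROLE TRANSFORM_ROLE;

CREATE DATABASE IF NOT EXISTS MOVIELENS;
CREATE SCHEMA IF NOT EXISTS MOVIELENS.RAW;
GRANT ALL ON DATABASE MOVIELENS TO ROLE TRANSFORM_ROLE;
GRANT ALL ON SCHEMA MOVIELENS.RAW TO ROLE TRANSFORM_ROLE;

-- External stage pointing at the Azure Blob container that holds the raw files
CREATE OR REPLACE STAGE MOVIELENS.RAW.AZURE_STAGE
  URL = 'azure://<storage-account>.blob.core.windows.net/<container>'
  CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>');

-- Example raw table and load from the stage
CREATE OR REPLACE TABLE MOVIELENS.RAW.RAW_MOVIES (
  movie_id INTEGER,
  title    STRING,
  genres   STRING
);

COPY INTO MOVIELENS.RAW.RAW_MOVIES
  FROM @MOVIELENS.RAW.AZURE_STAGE/movies.csv
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"');
```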
Run these commands from your dbt project root folder after developing the models:

- `dbt build`
- `dbt run --select model_name`
- `dbt test`
- `dbt docs generate`
- `dbt docs serve`
- `dbt compile --select path:analyses/`

In the Airflow DAG, use the SnowflakeOperator to copy the data from Azure Storage into Snowflake and a DbtTaskGroup to execute the dbt models (a minimal DAG sketch follows).
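Below is a minimal DAG sketch under a few assumptions: DbtTaskGroup is taken from the Astronomer Cosmos library, and the Airflow connection ID `snowflake_conn`, the dbt profile and target names, the dbt project path, and the Snowflake object names are placeholders rather than the project's actual values.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator
from cosmos import DbtTaskGroup, ProfileConfig, ProjectConfig
from cosmos.profiles import SnowflakeUserPasswordProfileMapping

# Map an Airflow Snowflake connection onto a dbt profile (names are placeholders)
profile_config = ProfileConfig(
    profile_name="movielens",
    target_name="dev",
    profile_mapping=SnowflakeUserPasswordProfileMapping(
        conn_id="snowflake_conn",
        profile_args={"database": "MOVIELENS", "schema": "DEV"},
    ),
)

with DAG(
    dag_id="movielens_elt",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Load step: copy raw files from the Azure external stage into Snowflake
    copy_raw_data = SnowflakeOperator(
        task_id="copy_from_azure_stage",
        snowflake_conn_id="snowflake_conn",
        sql="""
            COPY INTO MOVIELENS.RAW.RAW_MOVIES
            FROM @MOVIELENS.RAW.AZURE_STAGE/movies.csv
            FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"');
        """,
    )

    # Transform step: Cosmos renders the dbt project as a group of Airflow tasks
    dbt_models = DbtTaskGroup(
        group_id="dbt_transformations",
        project_config=ProjectConfig("/usr/local/airflow/dags/dbt/movielens"),
        profile_config=profile_config,
    )

    copy_raw_data >> dbt_models
```

With this layout, the SnowflakeOperator handles the load step, and Cosmos expands the dbt project into individual Airflow tasks inside the task group so each model and test shows up in the Airflow UI.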


