From 9bbd34a2980176fdde4947f6d18c3ed3cd67cd5b Mon Sep 17 00:00:00 2001
From: oceanumeric
Date: Fri, 31 May 2024 20:21:04 +0200
Subject: [PATCH] [update]

---
 _posts/2024-05-02-etl.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/_posts/2024-05-02-etl.md b/_posts/2024-05-02-etl.md
index d306b61..78b59ad 100644
--- a/_posts/2024-05-02-etl.md
+++ b/_posts/2024-05-02-etl.md
@@ -7,9 +7,9 @@ layout: post
 
 We will use the following tools to do ETL:
 
-- postgres: postgres is an open-source relational database management system that is widely used for storing and managing data. We will use postgres to store the data that we extract, transform, and load.
-- dbt: dbt is a command-line tool that enables data analysts and engineers to transform data in their warehouse more effectively. dbt allows you to write SQL queries to transform your data and then run those queries in a reproducible and scalable way.
-- Apache Airflow: Apache Airflow is an open-source platform that allows you to programmatically author, schedule, and monitor workflows. We will use Apache Airflow to orchestrate our ETL process and ensure that it runs smoothly.
+- PostgreSQL: PostgreSQL is an open-source relational database management system that is widely used for storing and managing data. We will use PostgreSQL to store the data that we extract, transform, and load.
+- dbt: dbt is a command-line tool that enables data analysts and engineers to transform data in their warehouse more effectively. dbt allows you to write SQL queries to transform your data and then run those queries in a reproducible and scalable way.
+- Airflow: Apache Airflow is an open-source platform that allows you to programmatically author, schedule, and monitor workflows. We will use Apache Airflow to orchestrate our ETL process and ensure that it runs smoothly.
 - basic cloud services knowledge: we will use basic cloud services such as AWS S3, Google Cloud Storage, Azure Blob Storage, or DigitalOcean Spaces to store the data that we extract, transform, and load.
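
For reference (not part of the diff above): a minimal sketch of how the three tools fit together, with Airflow orchestrating a dbt run against the PostgreSQL warehouse. The DAG id, the schedule, and the dbt project path `/opt/dbt/my_project` are hypothetical; the sketch assumes Airflow 2.4+ with dbt installed on the worker and a dbt profile already pointing at PostgreSQL.

```python
# Minimal Airflow DAG sketch: orchestrate dbt transformations in PostgreSQL.
# Assumes Airflow 2.4+, dbt available on the worker, and a dbt profile
# configured for the PostgreSQL warehouse (all names here are illustrative).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="etl_dbt_postgres",       # hypothetical DAG name
    start_date=datetime(2024, 5, 1),
    schedule="@daily",               # run the ETL once a day
    catchup=False,
) as dag:
    # Transform raw tables into models inside PostgreSQL via dbt.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/my_project",
    )

    # Validate the transformed tables with dbt's built-in tests.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/my_project",
    )

    dbt_run >> dbt_test
```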