diff --git a/_posts/2024-05-02-etl.md b/_posts/2024-05-02-etl.md
index d306b61..78b59ad 100644
--- a/_posts/2024-05-02-etl.md
+++ b/_posts/2024-05-02-etl.md
@@ -7,9 +7,9 @@ layout: post
We will use the following tools to do ETL:
-- postgres: postgres is an open-source relational database management system that is widely used for storing and managing data. We will use postgres to store the data that we extract, transform, and load.
-- dbt: dbt is a command-line tool that enables data analysts and engineers to transform data in their warehouse more effectively. dbt allows you to write SQL queries to transform your data and then run those queries in a reproducible and scalable way.
-- Apache Airflow: Apache Airflow is an open-source platform that allows you to programmatically author, schedule, and monitor workflows. We will use Apache Airflow to orchestrate our ETL process and ensure that it runs smoothly.
+- PostgreSQL: PostgreSQL is an open-source relational database management system that is widely used for storing and managing data. We will use PostgreSQL to store the data that we extract, transform, and load.
+- dbt: dbt is a command-line tool that enables data analysts and engineers to transform data in their warehouse more effectively. With dbt you write your transformations as SQL models and run them in a reproducible, scalable way.
+- Airflow: Apache Airflow is an open-source platform for programmatically authoring, scheduling, and monitoring workflows. We will use Airflow to orchestrate our ETL process; a minimal DAG sketch follows this list.
 - basic cloud services knowledge: we will use a basic cloud object store such as AWS S3, Google Cloud Storage, Azure Blob Storage, or DigitalOcean Spaces to stage the data that we extract, transform, and load.
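+
+To make the orchestration concrete, here is a minimal sketch of an Airflow DAG that wires these tools together. It assumes Airflow 2.4+; the DAG name, schedule, dbt project path, and the placeholder extract and load functions are illustrative assumptions, not a finished pipeline:
+
+```python
+from datetime import datetime
+
+from airflow import DAG
+from airflow.operators.bash import BashOperator
+from airflow.operators.python import PythonOperator
+
+
+def extract_to_storage():
+    # Placeholder (hypothetical): pull data from the source and write it
+    # to object storage (S3, GCS, Azure Blob Storage, or Spaces).
+    ...
+
+
+def load_to_postgres():
+    # Placeholder (hypothetical): copy the extracted files from object
+    # storage into staging tables in PostgreSQL.
+    ...
+
+
+with DAG(
+    dag_id="etl_pipeline",  # hypothetical name
+    start_date=datetime(2024, 5, 1),
+    schedule="@daily",
+    catchup=False,
+) as dag:
+    extract = PythonOperator(task_id="extract", python_callable=extract_to_storage)
+    load = PythonOperator(task_id="load", python_callable=load_to_postgres)
+    # dbt runs the SQL transformations inside the warehouse; the
+    # project path here is an assumption.
+    transform = BashOperator(task_id="transform", bash_command="dbt run --project-dir ~/etl_dbt")
+
+    extract >> load >> transform
+```
+
+Because dbt transforms the data inside PostgreSQL after it has been loaded, the transform task runs last in the DAG.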