Hi there! I'm Ken, a Data Engineer.
Proficient in designing and deploying scalable data engineering infrastructures on cloud platforms like AWS and GCP.
Experienced with BigQuery, Amazon Redshift, MongoDB, and PostgreSQL.
Familiar with Spark, Airflow, Kafka, Python (Pandas, NumPy, PySpark), dbt, and Pytest.
Data Modeling & Warehousing: skilled in designing and implementing data models and data warehouses.
Automation & Deployment: experienced with Git, Docker, Kubernetes, Linux, shell scripting, and automated deployment.
I'm passionate about transforming raw data into actionable insights and building robust, efficient data systems.
An automated ETL pipeline that fetches Amazon deals data, performs cleaning and analysis, and stores results in Google BigQuery. This project showcases my skills in dlt, PySpark, dbt, GCP services (BigQuery, Cloud Run, Cloud Scheduler, GCS), Terraform, and Docker. (https://github.com/garmenty485/Amazon_deals-DE-pipline)
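To give a flavor of the cleaning step in a pipeline like this, here is a minimal Pandas sketch. The function name (`clean_deals`) and the column layout (`title`, `price`, `discount_pct`) are illustrative assumptions, not taken from the repository; the real project uses dlt and PySpark for ingestion and transformation.

```python
import pandas as pd

def clean_deals(raw: pd.DataFrame) -> pd.DataFrame:
    """Deduplicate deals, parse price strings, and drop unusable rows.

    Hypothetical example: column names are assumptions for illustration.
    """
    df = raw.drop_duplicates(subset="title").copy()
    # Strip currency symbols and cast prices to a nullable float type.
    df["price"] = (
        df["price"]
        .astype(str)
        .str.replace(r"[^0-9.]", "", regex=True)
        .replace("", pd.NA)
        .astype("Float64")
    )
    # Keep only rows with a parseable price and an actual discount.
    df = df.dropna(subset=["price"])
    return df[df["discount_pct"] > 0].reset_index(drop=True)

# Tiny synthetic sample (made-up data, not real Amazon deals).
raw = pd.DataFrame({
    "title": ["Echo Dot", "Echo Dot", "Kindle", "Fire TV"],
    "price": ["$29.99", "$29.99", "N/A", "$39.99"],
    "discount_pct": [40, 40, 10, 0],
})
cleaned = clean_deals(raw)
```

In the actual pipeline, logic of this shape would run inside the PySpark/dbt transformation layer before the results land in BigQuery.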
A web game built with the MERN stack, demonstrating my ability to pick up new technologies and apply them in practical projects. It uses Socket.io, Google OAuth 2.0, and MongoDB, and exposes RESTful APIs. (You can play the game here: https://englishbattlebro.onrender.com/)