Core Responsibilities:
Design and maintain data pipelines on GCP using BigQuery.
Develop and manage DBT models for data transformation.
Write optimized SQL queries for ETL and analytics.
Implement scalable, reliable data warehousing solutions.
Automate workflows with Python scripts (see the sketch after this list).
Ensure adherence to data governance, security, and compliance standards.
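
To make the pipeline and automation responsibilities above concrete, here is a minimal Python sketch of one such step: running a SQL aggregation in BigQuery and writing the result to a reporting table with the google-cloud-bigquery client. The project, dataset, table, and column names are placeholders, not references to an actual environment.

from google.cloud import bigquery

PROJECT_ID = "example-project"                                # hypothetical project
SOURCE_TABLE = "example-project.raw.orders"                   # hypothetical source table
DEST_TABLE = "example-project.analytics.daily_order_totals"   # hypothetical destination table

def run_daily_aggregation() -> None:
    client = bigquery.Client(project=PROJECT_ID)

    # Aggregate raw orders into one row per day.
    sql = f"""
        SELECT
            DATE(order_ts) AS order_date,
            SUM(amount)    AS total_amount,
            COUNT(*)       AS order_count
        FROM `{SOURCE_TABLE}`
        GROUP BY order_date
    """

    # Write (or overwrite) the aggregated result into the destination table.
    job_config = bigquery.QueryJobConfig(
        destination=DEST_TABLE,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    client.query(sql, job_config=job_config).result()  # blocks until the job finishes

if __name__ == "__main__":
    run_daily_aggregation()

In practice a step like this would be scheduled by an orchestrator and parameterized rather than hard-coded; the sketch only shows the shape of the work.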
Required Skills:
Strong SQL expertise.
Hands-on experience with BigQuery, DBT, and other GCP services.
Familiarity with Python for automation (a DBT automation sketch follows this list).
Knowledge of data warehousing principles and Git for version control.
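
As an illustration of how the Python, DBT, and Git skills fit together, the sketch below drives a DBT build from a Python script by shelling out to the dbt CLI and failing loudly if the build or its tests break. The project directory and model name are hypothetical placeholders.

import subprocess
import sys

DBT_PROJECT_DIR = "/opt/dbt/analytics_project"   # hypothetical, Git-versioned dbt project
MODEL = "daily_order_totals"                     # hypothetical dbt model

def run_dbt(*args: str) -> None:
    # Run a dbt CLI command against the project and stop on failure.
    cmd = ["dbt", *args, "--project-dir", DBT_PROJECT_DIR]
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(f"dbt command failed: {' '.join(cmd)}")

if __name__ == "__main__":
    run_dbt("run", "--select", MODEL)    # build the model in BigQuery
    run_dbt("test", "--select", MODEL)   # run the model's schema and data tests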
Training & Certifications:
Internal resources point to the Google Professional Data Engineer certification as the benchmark for this role.
Other relevant certifications include Professional Cloud Architect, Cloud DevOps Engineer, and Machine Learning Engineer for advanced roles. [GCP Trainings]