We are seeking a skilled Senior Data Engineer with expertise in Python and Google Cloud Platform (GCP) to design, develop, and maintain robust and scalable ETL data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency.
Requirements
- 7–10 years of hands-on experience in Python for backend or data engineering projects.
- Strong working experience with GCP services, especially Dataflow, BigQuery, Cloud Functions, and Cloud Composer.
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience with version control (Git/GitHub) and knowledge of CI/CD practices.
- Experience with Apache Spark, Kafka, Redis, FastAPI, and Airflow/Cloud Composer DAGs.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
- Experience migrating data from on-premises sources to cloud platforms.
- Bachelor's degree in Computer Science, a related field, or equivalent experience.