We are looking for a Data Engineer to design, build, and optimize our data platform using Airflow, Kubernetes, Google BigQuery, and dbt. This role combines hands-on development with architectural responsibility, ensuring scalability and reliability across our infrastructure.
Requirements
- Experience in designing, building, and maintaining complex data pipelines using Airflow
- Strong understanding of ETL/ELT workflows and modern data engineering principles
- Proven experience deploying and operating workloads on Kubernetes in a production environment (e.g. Deployments, CronJobs, configuration, and basic troubleshooting)
- Experience with BigQuery or another cloud data warehouse for large-scale data processing
- Experience working with cloud infrastructure
- Experience applying DevOps practices in a production context, including Docker, container registries, infrastructure as code with Terraform, and CI/CD
- 3–5 years of experience in data engineering
- Proficiency with Airflow and Kubernetes
- Advanced Python and SQL skills
- Collaborative and pragmatic mindset
- Based in Madrid or open to relocating there
Benefits
- Extra wellbeing days on top of your annual leave allowance
- Up to 3 paid volunteering days each year
- 24/7 confidential Employee Assistance Programme (wellbeing, mental health, legal & financial support)
- Learning & development support via the Frontiers Learning Hub
- Competitive local benefits depending on country (e.g. healthcare and pension/retirement provision)