We are seeking a Senior Data Engineer with experience in GCP, Python, PySpark, SQL, and ETL/ELT pipelines to join our team.
Requirements
- 4-5 years of commercial experience as a Data Engineer
- Strong Python and PySpark skills
- Experience with the Google Cloud Platform (GCP) toolset
- Strong hands-on experience with SQL and query optimization
- Experience developing, testing, and managing ETL/ELT pipelines
- Strong experience with Hadoop
- Understanding of key concepts around Data Warehousing, Data Lakes, and Data Lakehouses
Benefits
- Hybrid work at one of our locations
- Working in a highly experienced and dedicated team
- Benefit package with private medical coverage, sport & recreation package, lunch subsidy, life insurance, etc.
- Online training and certifications aligned with your career path
- Access to e-learning platform
- Mindgram - a holistic mental health and wellbeing platform
- Work From Anywhere (WFA) option to work remotely outside of Poland for up to 140 days per year