Join our team to architect reliable data infrastructure while making a positive societal impact. Work on high-impact, large-scale data engineering projects in Seattle, WA. Our stack includes AWS, Databricks, dbt, Airflow, and Terraform.
Requirements
- 3+ years of experience in data engineering or a similar role
- Proficiency in SQL, Python, and Spark
- Strong understanding of data modeling and architecture principles
- Proficiency in cloud platforms like AWS, GCP, or Azure and IaC tools such as Terraform
- Experience with modern data warehousing solutions (e.g., Snowflake, Redshift, BigQuery, or Databricks)
- Knowledge of at least one orchestration tool (e.g., Airflow, Dagster, Prefect, AWS Step Functions)
- Experience with dbt or similar transformation tools
- Background in implementing data governance and security practices
Benefits
- Significant stock options
- Comprehensive benefits
- Bonus plan
- Commuter benefits
- Excellent office space with complimentary drinks and food options