As a Data Engineer, you will design, develop, and maintain scalable and efficient data pipelines in an AWS environment, centered on our Snowflake instance and using Fivetran, Prefect, Argo, and dbt. You will collaborate with business analysts, analytics engineers, and software engineers to understand data requirements and deliver reliable solutions.
Requirements
- 3+ years of experience as a Data Engineer, a data-adjacent Software Engineer, or a generalist on a small data team, with a focus on building and maintaining data pipelines.
- Strong Python skills, especially in the context of data orchestration.
- Strong understanding of database management and design, including experience with Snowflake or an equivalent platform.
- Proficiency in SQL.
- Familiarity with data integration patterns, ETL/ELT processes, and data warehousing concepts.
- Experience with Argo, Prefect, Airflow, or similar data orchestration tools.
- Excellent problem-solving and analytical skills with a strong attention to detail.
- Ability to bring a customer-oriented and empathetic approach to understanding how data is used to drive the business.
- Strong communication skills.
- Undergraduate and/or graduate degree in math, statistics, engineering, computer science, or a related technical field.
- Experience with our stack: AWS, Snowflake, Fivetran, Argo, Prefect, dbt, and GitHub Actions, along with some ancillary tools.
- Experience with DevOps practices, especially CI/CD.
- Previous experience managing enterprise-level data pipelines and working with large datasets.
- Experience in the energy sector.
Benefits
- Competitive, market-based compensation.
- Flexible leave policy.
- Centrally located office for whenever you need to come in.
- Medical insurance (self plus five family members).
- Annual performance cycle.
- Quarterly team engagement activities, plus rewards and recognition.
- L&D programs to foster professional growth.
- A supportive engineering culture that values diversity, empathy, teamwork, trust, and efficiency.