Weekday is seeking a Data Engineer on behalf of their client to build and maintain robust ETL pipelines, optimize data lakes and warehouses, and ensure data quality and performance. The role involves close collaboration with data science teams and a focus on analytical problem-solving.
Key Responsibilities
- Design and maintain robust ETL pipelines for structured and unstructured data.
- Develop and optimize data lakes and warehouses using Databricks, Azure Synapse, Snowflake, BigQuery, or Redshift.
- Implement both batch and real-time data processing solutions using Spark, Kafka, or Hadoop.
- Translate business requirements into data models, metrics, and dashboards.
- Build interactive dashboards and visualizations using Power BI, Tableau, or Looker.
- Conduct exploratory data analysis and develop reports using Python and SQL.
- Collaborate with data science teams to prepare and deliver structured data for AI/ML models.
- Work closely with engineering, product, and business teams to align on key KPIs and analytics requirements.
- Document data workflows, models, and analysis outputs for transparency and reuse.
- Apply a strong foundation in data modeling (Star and Snowflake schemas) and query optimization.