Design, build, and maintain data solutions, and analyze and interpret data to provide actionable insights that drive business decisions. Work with large datasets, develop reports, and support data governance initiatives.
Responsibilities
- Design, develop, and maintain data solutions
- Build data pipelines and ensure data quality
- Take ownership of data pipeline projects
- Collaborate effectively with global, cross-functional teams
- Develop and maintain data models
- Implement data security and privacy measures
- Leverage cloud platforms to build scalable and efficient data solutions
- Collaborate with Data Architects, Business SMEs, and Data Scientists
- Identify and resolve complex data-related challenges
- Adhere to best practices for coding, testing, and designing reusable code/components
- Explore new tools and technologies to improve ETL platform performance
- Participate in sprint planning meetings and provide estimations on technical implementation
- Design and develop data pipelines leveraging Databricks, PySpark, and SQL
- Engineer solutions for both structured and unstructured data
- Implement automated workflows for data ingestion, transformation, and deployment
- Apply performance optimization techniques
- Build integrations with multiple data sources
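As a rough illustration of the pipeline work described above, the sketch below shows a minimal ingest, validate, and transform step in plain Python. All names here (the `id` and `amount` fields, the quality rule, the function names) are hypothetical assumptions for illustration, not details from this posting; production work of this kind would typically run on Databricks with PySpark and SQL, as the responsibilities note.

```python
# Illustrative sketch only: a minimal ingest -> validate -> transform step
# of the kind a data pipeline performs. Field names ("id", "amount") and
# the data-quality rule are hypothetical, not taken from the posting.

def validate(record: dict) -> bool:
    """Basic data-quality gate: required id present and amount numeric."""
    return "id" in record and isinstance(record.get("amount"), (int, float))

def transform(record: dict) -> dict:
    """Normalize a valid record: keep only known fields, coerce amount to float."""
    return {"id": record["id"], "amount": float(record["amount"])}

def run_pipeline(raw_records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split input into transformed good rows and rejected bad rows."""
    good, bad = [], []
    for rec in raw_records:
        (good if validate(rec) else bad).append(rec)
    return [transform(r) for r in good], bad
```

Separating the quality gate from the transformation keeps rejected rows available for inspection rather than silently dropping them, which is one common way to make "ensure data quality" concrete.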
Benefits
- Competitive salary and benefits package
- 401(k) retirement plan with matching
- Generous paid time off