We are seeking an experienced Data Architect with 10–18 years of expertise in data architecture, engineering, and analytics to design scalable, secure, and high-performing data solutions.
Requirements
- 10–18 years of expertise in data architecture, engineering, and analytics
- Deep knowledge of modern data platforms, cloud technologies, and big data frameworks
- Strong expertise in Databricks
- Proficiency with at least one cloud platform (AWS, Azure, or GCP), including experience integrating Databricks with it
- Solid understanding of data engineering, ETL, and data modelling principles
- Advanced SQL skills for querying, transformation, and analysis
- Proficiency in Python for scripting, automation, and data manipulation
- Strong hands-on experience with Apache Airflow for workflow orchestration
- Familiarity with big data technologies (Spark, Hadoop, etc.)
- Knowledge of data warehousing concepts and best practices
- Proficiency in version control systems (Git)
- Excellent communication and collaboration skills
Benefits
- Opportunity to work in a hybrid, collaborative environment that encourages shared learning and innovation
- Chance to join a team that values flexibility and teamwork
- Potential for career growth and professional development
- Opportunity to work on cutting-edge technologies and methodologies