Snowflake is seeking a Senior Data Architect to join the Applied Field Engineering team, providing technical leadership in designing and architecting the Snowflake Cloud Data Platform within enterprise data architectures and the broader data ecosystem. The ideal candidate will have 10+ years of architecture and data engineering experience, 5+ years in a pre-sales environment, and excellent presentation skills.
Requirements
- 10+ years of architecture and data engineering experience within the Enterprise Data space
- 5+ years of experience in a pre-sales environment (Sales Engineer, Solutions Engineer, Solutions Architect, etc.)
- Outstanding presentation skills to both technical and executive audiences
- Ability to connect a customer’s specific business problems to Snowflake’s solutions
- Broad range of experience within large-scale Database and/or Data Warehouse technology, ETL, analytics and cloud technologies
- Hands-on development experience with technologies such as SQL, Python, Pandas, Spark, PySpark, Hadoop, Hive, and other big data technologies
- Deep understanding of data integration services and tools for building ETL and ELT data pipelines, such as Apache NiFi, Matillion, Fivetran, Qlik, or Informatica
- Familiarity with streaming technologies (e.g., Kafka, Flink, Spark Streaming, Kinesis) and real-time or near-real-time use cases (e.g., CDC)
- Experience designing interoperable data lakehouse architectures and working with Iceberg, Delta, and Parquet
- Strong architectural expertise in data engineering to confidently present and demo to business executives and technical audiences
- Bachelor’s degree required; Master’s degree in computer science, engineering, mathematics, or a related field, or equivalent experience, preferred
Benefits
- Visa Sponsorship
- Four Day Work Week
- Generous Parental Leave
- Retirement Plan