We are seeking an experienced Senior Snowflake Engineer to design, develop, and optimize scalable data solutions on the Snowflake platform. The ideal candidate will have strong expertise in data warehousing, ETL/ELT pipelines, and hands-on experience with Python and Apache Airflow for orchestration.
Key Responsibilities
Design and implement scalable data models and data warehouses using Snowflake
Develop and optimize complex SQL queries, stored procedures, and data pipelines in Snowflake
Build and maintain robust ETL/ELT pipelines using Python
Orchestrate workflows and automate data pipelines using Apache Airflow
Integrate data from multiple sources (APIs, databases, flat files, streaming sources)
Monitor, troubleshoot, and optimize data pipeline performance and cost in Snowflake
Implement data security, governance, and access control best practices
Collaborate with data analysts, data scientists, and business stakeholders
Ensure data quality, consistency, and reliability across systems
Document architecture, processes, and workflows
Required Skills & Qualifications
7+ years of hands-on experience in data engineering or big data development
4+ years of experience working with the Snowflake Data Cloud
Advanced SQL skills (query optimization, performance tuning)
Proficiency in Python for data engineering tasks
Hands-on experience with Apache Airflow (DAGs, scheduling, monitoring)
Experience with ETL/ELT tools and data pipeline design
Knowledge of data warehousing concepts (star schema, snowflake schema, etc.)
Familiarity with cloud platforms such as AWS, Azure, or GCP
Experience with version control systems (Git)
Strong problem-solving and analytical skills
Preferred Qualifications
Experience with cloud-native data services (e.g., S3, Azure Data Lake, BigQuery)
Knowledge of CI/CD pipelines and DevOps practices
Experience with dbt (data build tool)
Understanding of data governance and security frameworks
Exposure to streaming technologies (Kafka, Kinesis)
Education
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
Nice to Have
Snowflake certifications
Experience working in Agile/Scrum environments
Exposure to machine learning pipelines
Key Competencies
Strong communication and stakeholder management skills
Ability to work independently and in a team
Attention to detail and commitment to data quality
Continuous learning mindset
What We Do
A trusted partner for every digital enterprise, bringing value. Jade Global is a global IT consulting company with two decades of industry experience that helps the world's leading businesses and organizations build their digital core, optimize their operations, and accelerate revenue growth. Headquartered in San Jose, California, Jade Global operates offices in 13 locations across North America, the UK, and Asia. Renowned as a trusted "partner of choice" for businesses in the Healthcare & Life Sciences, Hi-Tech, Retail, Manufacturing, and Financial industries, Jade Global has developed 30+ industry-specific solutions. Whether your focus is harnessing or expanding Gen-AI, AI, and digital capabilities, transforming operating models, or accelerating insightful decision-making, we are here to help you gain and maintain a competitive edge with efficient, sustainable models. At Jade Global, it's all about outcomes, your outcomes, and delivering the results you desire, tailored to your unique requirements.
Why Work With Us
We are a Great Place to Work Certified company, and Jade Global takes a people-first approach.