Role & Responsibilities:
- Define and evolve Snowflake platform architecture, standards, and best practices.
- Translate business goals into a pragmatic technical roadmap and delivery plan.
- Lead and mentor data engineers; establish quality bars, review code, and guide execution.
- Manage delivery using Agile rituals; align priorities across data, analytics/BI, and application teams.
- Design, build, and optimize data pipelines on Snowflake with an ELT-first approach (dbt preferred).
- Implement modern data ingestion using tools such as Fivetran, ADF, Glue, or Matillion.
- Set up orchestration and CI/CD for data using Airflow or Dagster and Git-based pipelines.
- Ensure data quality, observability, monitoring, alerting, documentation, and runbooks.
- Apply performance tuning and cost optimization in Snowflake, including query profiling and warehouse sizing.
- Implement security and governance in Snowflake (RBAC, masking, row access policies, auditing, data sharing).
- Facilitate discovery with stakeholders, clarify requirements, and communicate trade-offs and recommendations.
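By way of illustration of the data quality and observability expectations above, here is a minimal Python sketch of a pre-load quality gate for a batch of records. Function and field names are illustrative only, not part of any required stack:

```python
from dataclasses import dataclass, field

@dataclass
class QualityReport:
    """Summary of basic batch-level data quality checks."""
    row_count: int
    null_violations: dict = field(default_factory=dict)  # column -> null count
    duplicate_keys: int = 0

    @property
    def passed(self) -> bool:
        return not self.null_violations and self.duplicate_keys == 0

def check_batch(rows, required_columns, key_column):
    """Run not-null and key-uniqueness checks on a batch of dict records."""
    nulls = {}
    seen, dupes = set(), 0
    for row in rows:
        for col in required_columns:
            if row.get(col) is None:
                nulls[col] = nulls.get(col, 0) + 1
        key = row.get(key_column)
        if key in seen:
            dupes += 1
        seen.add(key)
    return QualityReport(row_count=len(rows), null_violations=nulls, duplicate_keys=dupes)
```

In practice a gate like this would run inside the orchestrator (Airflow/Dagster task) or as a dbt test, with failures routed to the team's alerting channel.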
Hard Skills - Must have:
- Deep hands-on Snowflake experience: warehouses, databases/schemas, stages, file formats, external tables, Snowpipe, Streams and Tasks, Time Travel/Fail-safe, query performance, micro-partitions, clustering, and cost management.
- Strong SQL and Python for data engineering use cases.
- ELT-first development with dbt or an equivalent modeling/testing/documentation framework.
- Experience with modern ingestion and orchestration (Fivetran, ADF, Glue, Matillion, Airflow, or Dagster).
- Cloud experience (AWS preferred; Azure or GCP considered), including storage, IAM, networking basics, and secrets management.
- CI/CD for data with Git-based workflows and environment promotion.
- Familiarity with Infrastructure as Code (e.g., Terraform Snowflake provider).
- Solid understanding of dimensional modeling and data quality practices.
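To give a flavor of the dimensional modeling skills listed above, the sketch below shows a Type 1 slowly-changing-dimension merge (overwrite on match, insert with a new surrogate key otherwise) in plain Python. Names and data shapes are hypothetical; in a Snowflake/dbt stack this logic would typically live in a `MERGE` statement or an incremental dbt model:

```python
def upsert_dimension(dimension, source_rows, natural_key):
    """Type 1 SCD merge: overwrite matching members, insert new ones.

    `dimension` maps surrogate key -> attribute dict; natural keys are
    assumed unique within `source_rows`.
    """
    by_natural = {row[natural_key]: sk for sk, row in dimension.items()}
    next_sk = max(dimension, default=0) + 1
    for row in source_rows:
        sk = by_natural.get(row[natural_key])
        if sk is None:            # new member: assign the next surrogate key
            dimension[next_sk] = dict(row)
            by_natural[row[natural_key]] = next_sk
            next_sk += 1
        else:                     # existing member: overwrite attributes in place
            dimension[sk].update(row)
    return dimension
```

A Type 2 variant would instead close out the current row (effective-date columns) and insert a new versioned row rather than updating in place.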
Nice to have/It's a plus:
- Snowpark (Python), UDFs or stored procedures, and Dynamic Tables.
- Streaming or CDC with Kafka, Kinesis, or Debezium; event-driven patterns.
- BI or semantic layer exposure (Power BI, Tableau, Looker); metrics layer concepts.
- Experience with Azure Synapse, Microsoft Fabric, or Databricks in lakehouse contexts.
- Security and compliance exposure (PII handling, encryption, auditing, regulated environments).
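For context on the CDC experience mentioned above, the sketch below applies Debezium-style change events (`op` of `c`/`u`/`d` with `before`/`after` payloads) to an in-memory table keyed by `id`. This is a simplified illustration, not Debezium's API; real pipelines would consume these events from Kafka or Kinesis and merge them into Snowflake:

```python
def apply_cdc_events(state, events):
    """Apply Debezium-style change events to a table keyed by 'id'.

    'c' (create) and 'u' (update) upsert the 'after' image;
    'd' (delete) removes the row identified by the 'before' image.
    """
    for ev in events:
        if ev["op"] in ("c", "u"):
            row = ev["after"]
            state[row["id"]] = row
        elif ev["op"] == "d":
            state.pop(ev["before"]["id"], None)
    return state
```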
Soft Skills:
- Proven leadership with coaching, feedback, objective setting, and conflict resolution.
- Excellent stakeholder communication; able to facilitate discovery and manage expectations.
- Ownership mindset with proactive risk identification, prioritization, and mitigation.
- Structured problem-solving; comfortable with ambiguity and change.
What We Do
Allata (pronounced a-ley-ta) is a strategy, architecture, and enterprise-level application development company focused on helping clients enhance or scale business opportunities, create efficiencies, and automate processes through custom technologies.
We are building a different kind of firm – focused on doing exciting, transformational work for great clients, and on bringing together caring, dedicated people to make our clients' goals a reality. Our vision is to build an energized group of talented professionals who can stand strong on their own but work better as a networked team.
We enable business agility at the intersection of people, process, and technology. We provide solutions and expert services that help businesses become more nimble, transformative, and disruptive in their industries. We define vision, strategy, and value-creation models to shape strategic product design and to manage and transform enterprise delivery.
Just as strongly as we care about our clients, we feel that it is important to give back to the community and non-profits that we are passionate about. Every month, Allata donates 2% of our net income to a charitable cause our team believes in.
We live by our mantra:
Family comes first, clients are king, we take great care of our people.