Job Title: Information Technology Senior Data Engineer
FLSA Status: Exempt
Reports To: Manager, Data Engineering
Schedule: Full-time
Effective Date: 2025-06-01
Salary: $119,000 – $158,000 annually
Location: Lynnwood, WA
Summary
A senior data engineer designs, builds, and maintains the infrastructure and pipelines that enable the collection, storage, and processing of large datasets. The senior role takes on more challenging and ambiguous problems that draw on the person's additional experience and capabilities, and is expected to draw requirements out of less technical colleagues. Senior data engineers ensure data is reliable, accessible, and optimized for analytics and business intelligence. Their role often involves working with databases, ETL processes, and cloud platforms to support data-driven decision-making.
Core Responsibilities:
- Mentoring Data Engineers: Emphasizes teaching best practices and setting standards for the team as a whole, and plays a prominent role in code reviews to ensure the team follows those practices and standards across the portfolio of projects in flight.
- Work Breakdown: Takes ambiguous problem statements, breaks them down into their logical components, and provides accurate estimates of the effort required to complete them, in collaboration with the non-technical teams making requests.
- Data Pipeline Development: Design, build, and maintain scalable and reliable data pipelines to collect, process, and store data from various sources. The senior role takes on the more challenging pipelines, leaving simpler problems for the data engineer to complete and learn from.
- Data Modeling: Familiar with OLTP and OLAP modeling and when to use each. Capable of working with flat files, tables, and JSON to transform data into easy-to-use structures.
- Collaboration with Data Consumers: Work closely with data analysts, data scientists, and business teams to understand data needs and deliver appropriate solutions.
- Tooling and Automation: Develop tools and scripts to automate repetitive tasks, improve data workflows, and support continuous integration and deployment of data solutions. Familiar with source control tools and typical software development lifecycle.
Daily Operations
A typical day in the life of our data engineers includes the following:
- Morning Standups & Syncs: Participate in daily standup meetings with data teams and stakeholders to align on priorities, blockers, and progress updates.
- Pipeline Monitoring & Maintenance: Check the health of data pipelines, troubleshoot failures, and ensure data is flowing correctly across systems.
- Data Modeling & Architecture: Design or refine data models and schemas to support new analytics or application requirements.
- ETL Development: Build or update ETL (Extract, Transform, Load) processes to integrate new data sources or improve performance.
- Collaboration with Analysts & Scientists: Work closely with data analysts and data scientists to understand data needs and deliver clean, well-structured datasets.
- Documentation & Code Reviews: Document data workflows, update technical specs, and review code contributions from peers to maintain quality and consistency.
Zumiez Culture:
Partners with others to ensure Zumiez creates an empowered, fair & honest, teaching & learning-based, competitive, and fun work environment that recognizes the contributions of our employees, including:
- Anchors all interactions and practices around Zumiez’ Cultural Values.
- Partners with data analysts, BI engineers, and data scientists to ensure components are secure, fast, stable, and easy to support.
- Seeks continual self-improvement through independent, relevant knowledge gathering and through internal and external training opportunities.
Attributes:
- Humble, curious, and a voracious learner
- Forward thinking, creative, and collaborative
- Approachable, calm, and confident
- High degree of emotional intelligence
- Precise and effective in verbal and written communication
- Embraces risk, hates the status quo and rules, and fosters the idea that fair is almost never equal
- Seeks creative solutions, dives into the unknown, and feels comfortable out on limbs
- Thrives in the complexity of working through influence without authority
- Natural problem solver who can pinpoint where in the technology stack an incident occurs.
Upcoming Areas of Contribution:
The data engineering team is on a journey to migrate a legacy on-premises solution to a cloud-native solution. This long-term project consists of the following main tasks:
- Data Inventory and Assessment - Audit existing data assets, schemas, and ETL processes to determine what should be migrated, transformed, or deprecated.
- ETL/ELT Pipeline Modernization - Rebuild or refactor legacy ETL pipelines using cloud-native tools (e.g., Azure Data Factory, AWS Glue, or Google Cloud Dataflow) to support scalable and efficient data movement.
- Data Quality and Validation Frameworks - Implement automated validation checks to ensure data integrity during and after migration, including schema matching, null checks, and reconciliation reports.
- Security and Compliance Configuration - Set up identity and access management, encryption, and audit logging to meet enterprise security standards and regulatory requirements.
- Performance Optimization and Cost Monitoring - Tune cloud resources for performance and cost-efficiency, including partitioning strategies, query optimization, and usage monitoring dashboards.
Preferred Qualifications & Experience:
- Bachelor of Science in Computer Science, Computer Engineering, or equivalent
- 6–10 years of professional experience in a data engineering role
- Proficiency in SQL and Data Modeling - Strong command of SQL for querying and manipulating data, along with experience designing normalized and denormalized data models.
- Experience with ETL/ELT Tools - Hands-on experience building and maintaining data pipelines using tools like Apache Airflow, Azure Data Factory, or AWS Glue.
- Cloud Platform Expertise - Familiarity with cloud services (e.g., AWS, Azure, GCP) for data storage, processing, and orchestration in a scalable environment.
- Programming Skills - Proficiency in languages like Python, Scala, or Java for data transformation, automation, and integration tasks.
- Data Warehousing Knowledge - Experience with modern data warehouse solutions such as Snowflake, BigQuery, Redshift, or Azure Synapse.
- Proficient in Software Development Lifecycle (SDLC) - Familiar with source control tools, automated testing, and continuous integration/deployment.
Physical Demands and Work Environment:
- Ability to sit at a workstation in an office environment for extended periods of time and work on a PC without limitations.
- Ability to move about, sit, bend, and squat in an office environment to access files and gather information.
Benefits:
- Salary Range: $119,000 – $158,000
- Medical, Dental, & Vision Insurance, following an initial wait period
- Matched 401k after meeting qualifications
- Paid Parental Leave
- Sick Time Eligible
- Life Insurance
- Paid Vacation
- Bonus Potential
- Stock Purchase Program
- Open, casual, pet-friendly office environment
- Employee Discount on Zumiez product
- On-site skate park, on-site cafeteria
What We Do
Zumiez is a leading specialty retailer of apparel, footwear, accessories, and hardgoods for young people who want to express their individuality.