Senior Data Engineer

Reposted 3 Days Ago
Salt Lake City, UT
In-Office
Senior level
Other • Real Estate
The Role
The Sr. Data Engineer leads data projects, architecting scalable data pipelines, mentoring junior engineers, and ensuring data governance compliance while integrating with Microsoft Fabric.
Summary Generated by Built In

*** PLEASE NOTE: This is a hybrid role, requiring this person to work at our corporate headquarters in Salt Lake City, UT. We are unable to sponsor or take over sponsorship of an employment visa at this time. ***

Job Summary
The Sr. Data Engineer serves as a technical expert within the team, owns critical data systems, mentors others, and drives reliability and excellence in modern data engineering. The role focuses on optimizing data architectures for scalability, integrating advanced tools beyond traditional warehousing, building Python proficiency while supporting Fabric migration efforts, and applying Kimball dimensional modeling expertise to deliver robust, performant data solutions. The engineer designs, codes, tests, debugs, and documents complex databases.
Primary Responsibilities

  • Architect scalable data pipelines and dimensional models across hybrid environments, applying Kimball methodology (e.g., bus architecture, star/snowflake schemas, fact table granularity, slowly changing dimensions, surrogate keys).

  • Lead technical execution of data projects, including migrations to Microsoft Fabric (e.g., refactoring Synapse Pipelines to Fabric equivalents while preserving dimensional integrity).

  • Mentor Associates and Mid-level engineers; review pipeline designs, Python code, dimensional models, and implementations.

  • Proactively identify and resolve data performance, quality, security, or scalability issues.

  • Ensure adherence to data governance standards, security practices (e.g., encryption, access controls), and compliance requirements.

  • Break down complex data initiatives into actionable plans, incorporating Kimball principles and Fabric components (e.g., Dataflows, Notebooks, Lakehouse).

  • Implement and maintain Git-based workflows for data pipelines, notebooks, and transformations, including branching strategies for safe development.

  • Configure and execute promotions in Fabric Deployment Pipelines, handling environment-specific rules and content.

  • Conduct data quality and pipeline tests during the development cycle, ensuring changes are reliable before cross-environment deployment.

  • Support migration-related CI/CD activities, such as refactoring Synapse Pipelines to Fabric equivalents with version-controlled artifacts.

  • Independently interpret business requests into technical requirements through direct engagement with requestors.

  • Deliver end-to-end solutions for complex or high-impact requests, including proactive suggestions for improvements.

  • Provide technical guidance during requirement refinement and feedback sessions with requesting teams.

  • Design and implement archive, recovery, and load strategies.

  • Determine database structural requirements by analyzing client or internal operations, applications, and programming.

  • Review objectives with clients or internal users and evaluate current systems.

  • Coordinate new data development, ensuring consistency and integration with the existing warehouse structure.
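To illustrate the Kimball concepts named in the responsibilities above (surrogate keys and Type 2 slowly changing dimensions), here is a minimal sketch using Python's built-in sqlite3. The table, column, and sample values are illustrative assumptions, not taken from this posting.

```python
import sqlite3

# Illustrative dimension table with a surrogate key and Type 2
# slowly-changing-dimension (SCD2) history tracking.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_sk  INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
        customer_id  TEXT,                               -- natural/business key
        city         TEXT,                               -- tracked attribute
        is_current   INTEGER,                            -- 1 = active version
        valid_from   TEXT,
        valid_to     TEXT
    )
""")

def scd2_upsert(conn, customer_id, city, as_of):
    """Expire the current row if the tracked attribute changed, then insert a new version."""
    cur = conn.execute(
        "SELECT customer_sk, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,))
    row = cur.fetchone()
    if row and row[1] == city:
        return  # no change: keep the current version
    if row:
        conn.execute(
            "UPDATE dim_customer SET is_current = 0, valid_to = ? WHERE customer_sk = ?",
            (as_of, row[0]))
    conn.execute(
        "INSERT INTO dim_customer (customer_id, city, is_current, valid_from, valid_to) "
        "VALUES (?, ?, 1, ?, '9999-12-31')", (customer_id, city, as_of))

scd2_upsert(conn, "C1", "Salt Lake City", "2024-01-01")
scd2_upsert(conn, "C1", "Provo", "2024-06-01")  # city change -> new version
history = conn.execute(
    "SELECT city, is_current, valid_from, valid_to FROM dim_customer ORDER BY customer_sk"
).fetchall()
```

The same pattern scales to MERGE-based implementations in SQL Server or Fabric Lakehouse tables; the surrogate key lets fact tables join to the dimension version that was current at load time.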

Key Tools & Technologies

  • Advanced SQL for querying, transformations, performance tuning, and Kimball dimensional modeling.

  • Python for scripting, automation, custom logic, and Fabric notebooks.

  • Azure Synapse Pipelines, SQL Server, and Azure Functions.

  • Microsoft Fabric (Lakehouse, OneLake, Pipelines, Notebooks, Dataflows).

  • Git integration and Fabric Deployment Pipelines for version control and CI/CD workflows.
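The data quality and pipeline testing responsibilities above can be sketched as lightweight checks of the kind a Python notebook might run before promoting content between environments. The check names, sample rows, and gate structure here are illustrative assumptions, not a documented Fabric API.

```python
# Illustrative pre-deployment data-quality checks: each check returns
# (name, passed) so a pipeline can gate promotion on the results.
def check_not_empty(rows):
    return ("not_empty", len(rows) > 0)

def check_no_null_keys(rows, key):
    return ("no_null_keys", all(r.get(key) is not None for r in rows))

def check_unique_keys(rows, key):
    keys = [r[key] for r in rows]
    return ("unique_keys", len(keys) == len(set(keys)))

def run_quality_gate(rows, key):
    """Run all checks; deployment proceeds only if every check passes."""
    results = [
        check_not_empty(rows),
        check_no_null_keys(rows, key),
        check_unique_keys(rows, key),
    ]
    failed = [name for name, ok in results if not ok]
    return {"passed": not failed, "failed_checks": failed}

sample = [{"order_id": 1, "amount": 10.0}, {"order_id": 2, "amount": 5.5}]
gate = run_quality_gate(sample, "order_id")
```

In practice such checks would run against staged tables in the development workspace, with the gate result deciding whether a deployment pipeline promotes the content to test or production.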

Job Specifications

  • Demonstrates proficiency in all areas of data engineering with advanced in-depth specialization in dimensional modeling, pipeline architecture, and modern cloud platforms.

  • Participates in developing technical/business approaches and new or enhanced technical tools, including CI/CD best practices.

  • Has advanced knowledge of scalable data pipelines, lakehouse architectures, and high-volume processing in Azure Synapse and Microsoft Fabric environments.

Education and Experience

  • Typically requires 5+ years of related experience and a bachelor’s degree (or equivalent experience).

  • Strong hands-on experience with Microsoft Azure data services, SQL Server, Python, and data modeling (Kimball methodology preferred).

  • Experience supporting or leading migrations to modern platforms like Microsoft Fabric is highly desirable.

Work Environment & Physical Requirements
Performs sedentary work in an office environment with limited lifting (less than 10 pounds) or walking required. Close visual acuity required to perform work at computer terminal. No exposure to adverse environmental conditions. Requires repetitive typing motion, talking, hearing, grasping and feeling.
Disclaimer
The job description outlines the general nature and scope of work employees perform in this role. It's not intended to be an exhaustive list of all duties, responsibilities, or qualifications required for the position. The company reserves the right to modify, revise, or update the job description to meet business needs.

If you are a current Extra Space employee, please apply through Jobs Hub in Workday.

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Applications Deadline: Applications will be accepted until the position is filled.

The Company
HQ: Salt Lake City, UT
2,400 Employees
Year Founded: 1977

What We Do

Extra Space Storage offers climate-controlled self-storage units located all over the U.S.

It’s not easy being stuck in a job. Join a company that cares about you.

We’ve earned multiple “Best Places to Work” awards (including from Forbes and Glassdoor). We’re an S&P 500 company that hasn’t stopped growing since our founding in 1977. Self-storage is our product; helping people is our passion.

Grow with us.

