Sr Data Engineer (Hybrid - Boston, MA)

Reposted 18 Days Ago
Boston, MA, USA
Hybrid
95K-149K Annually
Senior level
Cloud • Hardware • Internet of Things
The Role
The Sr Data Engineer will design and optimize data pipelines, ensure data quality, facilitate collaboration among teams, and evaluate new technologies to enhance data engineering capabilities.

Join a high-performing, tight-knit team at a fast-growing company using the Internet of Things (IoT) to transform how organizations maintain compliance, enhance safety, and reimagine operations. SmartSense by Digi and Jolt are trusted by some of the world’s most recognizable brands including CVS Health, Walgreens, Walmart, McDonald’s, Jack in the Box, Hartford HealthCare, and Children’s Minnesota to protect their operations and the people they serve. We’re looking for team-oriented change agents who want to help shape the future of IoT.

 
Position
Data Services team members are passionate about data products, engineering data flows, storage, and enabling predictive analytics. We take pride in building and delivering the tools, infrastructure, and frameworks that turn our data into business insight and increase its value to our customers. As good stewards of our data, we contribute to every aspect of data handling, whether the data comes from monitoring data flows, our field sensors, PII (Personally Identifiable Information), or internal processes such as our supply chain.
 
What We Offer
In this Sr. Data Engineer role, you will contribute to strategic data engineering solutions that move data from raw to cold storage, through ETL (Extract, Transform, Load) pipelines, and into data sets used to train ML (Machine Learning) models. You will collaborate with our Data Scientists, Business Analysts, and Machine Learning Engineers to produce quality data flows, transformations, and cleansing that improve data products for the customer. You will facilitate the democratization of data so data scientists can experiment and train machine learning models and business analysts can support the enterprise. You bring enthusiasm and drive to deliver data products that exceed expectations, a passion for data engineering, and an eagerness to learn. This is an exciting opportunity for an engineer ready to move this enterprise forward on our data maturity path toward predictive analytics. Join us on our data journey.
 
What You Will Do
  • Join a tightly knit team solving hard problems the right way
  • Understand the various sensors and environments critical to our customers’ success
  • Know the data flows and technology that are currently in use to transform raw data into analytic products
  • Build relationships with the awesome team members across other functional groups
  • Learn our code practice, work in our code base, write tests, and collaborate with us in our workflows
  • Contribute to onboarding processes and recommend improvements to them
  • Demonstrate your capabilities by defining, implementing, and delivering data products for your user stories and tasks
  • Contribute to systems and processes that implement and automate quality checks on data pipeline deliverables
  • Implement data quality tests, improve inefficient tooling, and adopt transformative new technologies while maintaining operational continuity
  • Improve the quality of our data and pipelines by working closely with the product team and stakeholders to understand how our products are used
  • Identify opportunities to improve our infrastructure, operational performance, and data pipeline deliverables
  • Evaluate new technologies and build proof-of-concept systems to enhance Data Engineering capabilities and data products
  • Contribute to improving the efficiency of our pipeline scripts, automation, and general data operations
  • Demonstrate command and accountability for the design and implementation of new features
  • Develop and support data operations and efficiencies in production
  • Model data for new and existing capabilities while advancing the maturity of our data
  • Influence your peers through excellence in code reviews and in delivering high-quality data products
  • Deliver operational data from the data platform to software and analytic teams producing aggregate metrics from real time data streams
  • Establish a reputation for reliability in data contextualization and troubleshooting with the team
  • Improve the velocity of development of data ingestion, orchestration, fusion, transformation, and data analysis
  • Deliver infrastructure required for optimal extraction, transformation, and data loading in predictive analytic contexts
  • Transform ETL development with optimizations for efficient storage, retention policies, access, and computation while accounting for cost
  • Contribute to the strategic maturity of all our operations and delivery of product requests
  • Define orchestrations of data transformations that distill information to highly valuable signals for ML models
  • Collaborate with your teammates to deliver a data analytics and AI platform for advanced analytic data product development
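The pipeline work described above, distilling raw sensor readings into signals an ML model can consume, can be sketched in plain Python. This is an illustrative sketch only: the sensor names, schema, and 8 °C threshold are hypothetical and not from this posting.

```python
from statistics import mean

# Hypothetical raw telemetry rows: (sensor_id, temperature_c) readings
# collected from field sensors over a reporting window.
RAW_READINGS = [
    ("fridge-01", 3.9), ("fridge-01", 4.2), ("fridge-01", 11.8),  # excursion
    ("fridge-02", 2.7), ("fridge-02", 3.1),
]

def transform(readings, threshold_c=8.0):
    """Distill raw readings into per-sensor features for downstream models."""
    by_sensor = {}
    for sensor_id, temp in readings:
        by_sensor.setdefault(sensor_id, []).append(temp)
    features = {}
    for sensor_id, temps in by_sensor.items():
        features[sensor_id] = {
            "mean_c": round(mean(temps), 2),   # average temperature
            "max_c": max(temps),               # worst reading in the window
            "excursions": sum(1 for t in temps if t > threshold_c),
        }
    return features

features = transform(RAW_READINGS)
```

In a real pipeline the same extract-transform-load shape would run inside an orchestrated job against the warehouse rather than in-memory lists.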
Within 1 Month, you’ll
  • Join a tightly knit team solving hard problems the right way
  • Understand the various sensors and environments critical to our customers’ success
  • Learn the data models and flows that are currently in use to transform raw data into analytic products
  • Build relationships with the awesome team members across other functional groups
  • Learn our code practice, work in our code base, write tests, and collaborate with us in our workflows
  • Contribute to onboarding processes and recommend improvements to them
 
Within 3 Months, you’ll
  • Demonstrate your capabilities by defining, implementing, and delivering data products for your user stories and tasks
  • Implement data quality tests, support existing pipelines and procedures, and optimize the warehouse
  • Work closely with the product team and stakeholders to understand how our products are used
  • Identify opportunities to improve our infrastructure, operational performance, and data pipeline deliverables and influence us all to be better
Within 6 Months, you’ll
  • Evaluate new technologies and build proof-of-concept systems to enhance Data Engineering capabilities and data products
  • Contribute to improving the efficiency of our automation and general data operations
  • Design and implement new features and be accountable for their performance
  • Deliver high quality operational data
  • Generate high quality documentation and detailed analysis
  • Articulate conceptual, logical, and physical data models in Confluence
  • Join the on-call rotation for your team, supporting product services and responding to incidents
Within 12 Months, you’ll
  • Establish a reputation as a partner in data analysis and contextualization, with clear articulation of our data space for targeted internal audiences
  • Deliver infrastructure required for optimal extraction, transformation, and data loading in predictive analytic contexts
  • Transform ETL development with optimizations for efficient storage, retention policies, access, and computation while accounting for cost
  • Contribute to the strategic maturity of our operations and delivery of product requests
  • Be a key player in the design and delivery of the data pipelines and engineering infrastructure that support machine learning systems at scale
  • Collaborate with your teammates to advance our architecture in support of the predictive analytics roadmap
 
Who You Are and What You Bring
  • Bachelor’s or master’s degree in a technical or quantitative field.
  • 5+ years of hands-on Data Engineering experience, delivering production-grade solutions at scale.
  • Expert in Snowflake, with proven ability to design, optimize, and deploy high-quality solutions for large-scale environments.
  • Advanced SQL and Python skills, including writing efficient, reusable, and well-documented code.
  • Proven experience building and maintaining ETL/data pipelines, including orchestration, monitoring, and optimization for performance and cost.
  • Strong knowledge of data warehousing, data lakes, and relational/non-relational databases.
  • Experience with managed cloud services (AWS or GCP) and implementing secure, scalable data solutions.
  • Experience delivering and articulating data models to support enterprise and data product needs.
  • Proficiency in DBT, including authoring transformations and automated tests.
  • Experience implementing automated testing frameworks (unit tests, integration tests, data-quality checks) for data pipelines.
  • Strong Git and Agile/Scrum experience, including code reviews and collaborative workflows.
  • Excellent communication skills to articulate complex technical concepts simply and collaborate effectively across teams.
  • Experience participating in design and code reviews and communicating feedback respectfully.
  • Experience authoring stories and bugs independently and in team grooming sessions.
  • Core technologies: SQL, Python, JavaScript, Snowflake, RESTful, Atlassian, DBT, Git, AWS  
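The automated data-quality checks called for above can be as simple as not-null and range assertions over a batch. A minimal sketch in plain Python follows (in practice this logic would more likely live in dbt tests or a pytest suite); all column names and bounds are hypothetical.

```python
# Hypothetical schema for a batch of sensor readings.
REQUIRED_COLUMNS = {"sensor_id", "reading_ts", "temperature_c"}

def quality_report(rows):
    """Summarize schema, not-null, and range violations for a batch of rows."""
    missing_cols = null_ids = out_of_range = 0
    for row in rows:
        if not REQUIRED_COLUMNS <= row.keys():
            missing_cols += 1        # row is missing a required column
            continue
        if row["sensor_id"] is None:
            null_ids += 1            # not-null check failed
        if not (-40.0 <= row["temperature_c"] <= 85.0):
            out_of_range += 1        # outside a plausible sensor range
    return {
        "missing_cols": missing_cols,
        "null_ids": null_ids,
        "out_of_range": out_of_range,
        "passed": missing_cols == 0 and null_ids == 0 and out_of_range == 0,
    }

batch = [
    {"sensor_id": "s1", "reading_ts": "2024-01-01T00:00Z", "temperature_c": 4.0},
    {"sensor_id": None, "reading_ts": "2024-01-01T00:05Z", "temperature_c": 3.5},
    {"sensor_id": "s2", "reading_ts": "2024-01-01T00:10Z", "temperature_c": 120.0},
]
report = quality_report(batch)
```

Gating pipeline promotion on a report like this is one common way to automate quality on data pipeline deliverables.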
Desired But Not Required
  • Experience using GenAI tools (e.g., Windsurf, Claude, Copilot, Cursor) to accelerate development and improve data workflows.
  • Proven ability to build REST APIs using Python web frameworks such as FastAPI.
  • Familiarity with the Data Science lifecycle, including Machine Learning DataOps and supporting ML model training pipelines.
  • Hands-on experience with orchestration tools such as Airflow or Luigi for managing complex data workflows.
  • Knowledge of data governance practices, including handling PII and implementing secure data access paradigms.
  • Experience working with time-series telemetry data, including aggregation and optimization for analytics.
  • Snowflake SnowPro Certification is a plus; familiarity with other cloud data platforms or Lakehouse architectures is desirable.
  • Experience integrating with BI platforms and building data workflows for analytics and reporting.
  • Background in supporting production environments, including participation in on-call rotations and incident response.
  • Experience working with Kubernetes or another container-orchestration system.
  • Experience deploying data pipelines and data models to a production environment.
  • Experience operating and monitoring production data pipelines.
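The time-series telemetry work mentioned above often comes down to downsampling: bucketing raw readings into fixed windows so high-rate sensor data stays cheap to store and query. A minimal sketch, with illustrative data and a hypothetical five-minute window:

```python
from collections import defaultdict

def downsample(points, window_s=300):
    """Aggregate (epoch_seconds, value) points into per-window means."""
    buckets = defaultdict(list)
    for ts, value in points:
        buckets[ts - ts % window_s].append(value)  # floor to window start
    return {start: sum(vals) / len(vals) for start, vals in sorted(buckets.items())}

# Two readings in the first 5-minute window, two in the second.
points = [(0, 2.0), (60, 4.0), (310, 6.0), (320, 8.0)]
agg = downsample(points)
```

The same aggregation would typically be pushed down into the warehouse (e.g., a windowed GROUP BY) rather than done in application code, with retention policies keeping raw data only as long as needed.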
     

*Please note that we are unable to provide visa sponsorship for this position. This includes, but is not limited to, work visas, employment-based visas, or residency sponsorship. Candidates must have valid work authorization in the United States at the time of application. Visa applications of any kind will not be considered.

Digi International offers a distinctive Total Rewards package including a short-term incentive program, a new hire stock award, paid parental leave, open (uncapped) PTO, and a hybrid work environment, in addition to our competitive medical, health & wellbeing, and compensation offerings.

The anticipated base pay range for this position is $95,000 – $149,000. Pay ranges are determined by role, job level, and primary job location. The range displayed reflects the reasonable range we anticipate paying for this position and reflects the cost of labor within several U.S. geographic markets. The specific salary offered within the range will depend on various factors including, but not limited to, the candidate's relevant and prior experience, education, skills, and primary work location. It is not typical for an individual to be hired at or near the top of the range for their role, and compensation decisions depend on the facts and circumstances of each position. Pay ranges are typically reviewed and updated annually.

At Digi, we embrace diversity and inclusion among our teammates. It is critical to our success as a global company, and we seek to recruit, develop, and retain the most talented people from a diverse candidate pool. We are committed to providing an environment of respect where equal employment opportunities are available to all applicants and teammates.

Equal Opportunity Employer
This employer is required to notify all applicants of their rights pursuant to federal employment laws. For further information, please review the Know Your Rights notice from the Department of Labor.

The Company
Hopkins, MN
920 Employees
Year Founded: 1985

What We Do

Digi International (Digi) is a leading global provider of mission-critical and business-critical machine-to-machine (M2M) and Internet of Things (IoT) connectivity products and services. We help our customers create next-generation connected products and deploy and manage critical communications infrastructure in demanding environments. Our embedded modules and off-the-shelf routers, gateways, and network products are designed for relentless reliability and deliver unquestioned performance and security. Our cloud-based software and professional services help customers put their connected products and assets to work across a broad range of mission-critical industry applications. Founded in 1985, we have helped our customers connect more than 100 million things, and counting.
