Senior Software Engineer - DBT

Hiring Remotely in Sri Lanka
Remote
Senior level
Food • Logistics
The Role
The Senior Software Engineer will design, implement, and maintain data pipelines and ETL processes on Google Cloud Platform while optimizing data workflows and ensuring data quality. Responsibilities include collaborating with stakeholders, automating processes, and documenting projects.
JOB DESCRIPTION

Senior Software Engineer - Google Cloud Platform, Data Build Tool (DBT), Dataform Developer 

 

The Big Picture 

Sysco LABS is the Global In-House Center of Sysco Corporation (NYSE: SYY), the world’s largest foodservice company. Sysco ranks 56th in the Fortune 500 list and is the global leader in the trillion-dollar foodservice industry.  

 

Sysco employs over 75,000 associates, has 337 smart distribution facilities worldwide and over 14,000 IoT-enabled trucks serving 730,000 customer locations. For fiscal year 2025, which ended June 29, 2025, the company generated sales of more than $81.4 billion.

Sysco LABS Sri Lanka delivers the technology that powers Sysco’s end-to-end operations. Sysco LABS’ enterprise technology spans the entire foodservice journey, enabling the sourcing of food products, merchandising, storage and warehouse operations, order placement and pricing algorithms, the delivery of food and supplies to Sysco’s global network, and the in-restaurant dining experience of the end customer.

 

The Opportunity: 

We are on the lookout for a Senior Software Engineer - Google Cloud Platform, Data Build Tool (DBT), Dataform Developer to join our growing Data Management Service team. The ideal candidate is a highly skilled and innovative engineer with hands-on expertise in Google Cloud Platform (GCP) and in Data Build Tool (DBT) or Dataform. The candidate will play a central role in architecting, developing, and maintaining data workflows, transforming raw datasets into actionable insights, and enabling scalable analytics using cutting-edge cloud technology and modern data modeling best practices.

 

Responsibilities:  

  • Designing, Developing, and Maintaining Data Pipelines: Build robust, scalable, and efficient ETL (Extract, Transform, Load) pipelines on Google Cloud Platform utilizing tools such as Cloud Dataflow, Dataproc, BigQuery, and Cloud Composer. 

  • Implementing Data Modeling with Data Build Tool (DBT) or Dataform: Develop and manage data models using Data Build Tool (DBT) or Dataform, ensuring clean, reliable, and maintainable transformation logic that aligns with business requirements and analytics use cases (a model sketch follows this list)

  • Data Integrations: Integrate diverse data sources, including internal systems, third-party APIs, and external databases, into unified datasets available on the GCP ecosystem (BigQuery) 

  • Data Quality Assurance: Implement and monitor data quality checks, conduct root cause analysis for data inconsistencies, and proactively resolve issues to maintain data integrity (a test sketch follows this list)

  • Collaboration: Work closely with data analysts, data scientists, product managers, and business stakeholders to understand data needs and deliver timely solutions 

  • Performance Optimization: Monitor and optimize the performance of data pipelines and queries, ensuring efficient resource utilization and cost management on GCP 

  • Documentation: Create and maintain comprehensive documentation for data models, ETL processes, data transformation projects, and cloud architecture to support maintainability and team knowledge sharing

  • Security and Compliance: Adhere to best practices regarding data security, privacy, and compliance, implementing necessary controls and safeguarding sensitive data throughout the pipelines 

  • Automation and CI/CD: Implement CI/CD pipelines for DBT projects and other data processes to ensure robust deployments and version control 

  • Continuous Improvement: Stay abreast of evolving GCP offerings, DBT features, and industry trends to continuously improve and innovate data engineering practices 
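
To ground the modeling and pipeline work described above, here is a minimal sketch of an incremental DBT model targeting BigQuery. All model, table, and column names (stg_orders, fct_orders, order_id, and so on) are hypothetical placeholders, not actual Sysco schemas.

-- models/marts/fct_orders.sql (hypothetical DBT model; illustrative names only)
{{
  config(
    materialized='incremental',
    unique_key='order_id',
    partition_by={'field': 'order_date', 'data_type': 'date'}
  )
}}

select
    order_id,
    customer_id,
    order_date,
    sum(line_amount) as order_total
from {{ ref('stg_orders') }}
{% if is_incremental() %}
  -- On incremental runs, process only rows newer than the latest already loaded
  where order_date > (select max(order_date) from {{ this }})
{% endif %}
group by order_id, customer_id, order_date

Materializing the model incrementally and partitioning it by date keeps both DBT run times and BigQuery scan costs manageable as the table grows.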
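
Likewise, the data quality checks mentioned above can be expressed as DBT tests. Below is a minimal sketch of a singular test against the same hypothetical fct_orders model; DBT runs the query and the test passes only when it returns zero rows.

-- tests/assert_order_total_non_negative.sql (hypothetical singular test)
-- Returns the rows that violate the rule; an empty result means the test passes
select
    order_id,
    order_total
from {{ ref('fct_orders') }}
where order_total < 0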

 

Requirements: 

  • A Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Engineering, or a related technical field 

  • Proficiency in developing data models and transformation logic using DBT in cloud environments or Dataform on GCP

  • Strong SQL development skills, with the ability to write complex, performance-optimized queries (a BigQuery sketch follows this list)

  • Solid understanding of ETL/ELT architecture, data warehousing concepts, and data pipeline orchestration 

  • Experience with version control systems (e.g., Git), and strong familiarity with CI/CD pipelines for deploying data projects 

  • Expertise in scripting or programming languages such as Python, especially for developing custom data workflows 

  • Knowledge of data quality frameworks, testing, and monitoring strategies 

  • Excellent communication and collaboration skills, with the ability to translate technical processes into business-friendly language 

  • Proven problem-solving skills and an analytical mindset 

  • Self-motivated, detail-oriented, and able to work independently as well as in a team environment 

  • 1+ years of hands-on experience working with Google Cloud Platform services, especially BigQuery, Cloud Storage, Cloud Composer, Dataflow, and Dataproc 

  • 1+ years of experience as a SQL Developer

  • Experience implementing and maintaining data lakes and data warehouses in the cloud 

  • Working knowledge of workflow orchestration tools such as Apache Airflow, Google Cloud Composer, or Prefect 

  • Exposure to business intelligence tools, such as Looker, Tableau, or Power BI for data visualization and reporting 

  • Hands-on experience with data governance, cataloging, and metadata management platforms

  • Familiarity with streaming data platforms (e.g., Pub/Sub, Kafka) and real-time data processing in GCP 

  • Background in Agile or Scrum methodologies for project management and delivery 
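
As a flavor of the performance-optimized BigQuery SQL referenced in the list above: partitioning and clustering a table lets queries prune data and scan fewer bytes. The dataset, table, and column names below (analytics.fct_deliveries and so on) are hypothetical.

-- Hypothetical BigQuery DDL: partition by date and cluster by customer
create table if not exists analytics.fct_deliveries
(
  delivery_id   string,
  customer_id   string,
  delivery_date date,
  pallet_count  int64
)
partition by delivery_date
cluster by customer_id;

-- Filtering on the partition column lets BigQuery prune partitions,
-- cutting bytes scanned and therefore query cost
select
  customer_id,
  sum(pallet_count) as total_pallets
from analytics.fct_deliveries
where delivery_date between '2025-01-01' and '2025-01-31'
group by customer_id;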

 

 

Benefits:   

  • US dollar-linked compensation   

  • Performance-based annual bonus   

  • Performance rewards and recognition   

  • Agile Benefits - special allowances for Health, Wellness & Academic purposes   

  • Paid birthday leave   

  • Team engagement allowance   

  • Comprehensive Health & Life Insurance Cover - extendable to parents and in-laws   

  • Overseas travel opportunities and exposure to client environments   

  • Hybrid work arrangement   

   

Sysco LABS is an Equal Opportunity Employer.  

 

Top Skills

Apache Airflow
BigQuery
Cloud Composer
Cloud Dataflow
Data Build Tool
Dataform
Dataproc
Git
Google Cloud Platform
Looker
Power BI
Python
SQL
Tableau

