Senior Data Engineer

Posted 3 Days Ago
4 Locations
Remote or Hybrid
Senior level
Payments
The Role
The Senior Data Engineer will design, build, and maintain data structures and pipelines using SQL Server and Snowflake, collaborating with team members to support analytics and operational data solutions while ensuring security and compliance.
The FreedomPay Commerce Platform is the technology of choice for many of the largest companies across the globe in retail, hospitality, lodging, gaming, sports and entertainment, foodservice, education, healthcare, and financial services. FreedomPay’s technology has been purpose-built to deliver rock-solid performance in the highly complex environment of global commerce. The company maintains a world-class security environment and was the first in North America to earn the coveted validation by the PCI Security Standards Council for Point-to-Point Encryption with EMV. FreedomPay’s robust solutions across payments, security, identity, and data analytics are available in-store, online, and on mobile, and are supported by rapid API adoption. The award-winning FreedomPay Commerce Platform operates on a single, unified technology stack across multiple continents, allowing enterprises to deliver a consistent, repeatable experience on a global scale. FreedomPay is a fast-paced, high-growth company with a great culture, competitive benefits and compensation, and a business casual atmosphere.

FreedomPay is experiencing explosive growth and seeks a Senior Data Engineer with expert-level database skills and a passion for technical leadership. 

This role is with FreedomPay’s small but exciting Database Engineering team. This team of data engineers designs, builds, and maintains all data-related aspects of our products, directly or through collaboration with other development teams. This work includes the design, creation, and maintenance of data structures, database objects, ETL processes, and reporting visualizations. The primary technology stack runs on Microsoft SQL Server 2019 and 2022, as well as Snowflake, CosmosDB, Redis, and additional modern database technologies.

In this role, you will collaborate with team members to design, build, and operate data pipelines and data products across SQL Server and Snowflake. You will contribute to ELT/ETL development, orchestration, monitoring, and cost/performance optimization, and you will partner with analytics and application teams to deliver trusted, well-governed datasets and self-service reporting. 

 Responsibilities

    • Design and evolve data models and database objects for operational and analytical workloads in Microsoft SQL Server and Snowflake (schemas, roles, warehouses, performance and cost optimization). 

    • Build and maintain ELT/ETL pipelines (batch and near-real-time), leveraging Snowflake capabilities (Snowpipe, Streams/Tasks) and orchestration tools (e.g., Airflow or Azure Data Factory) as appropriate. 

    • Implement and support data streaming and event-driven ingestion patterns using technologies such as Kafka and Azure Event Hubs (topics/streams, schemas, consumers, and replay strategies). 

    • Leverage Redis and other low-latency data stores for caching and real-time access patterns; partner with application teams to define fit-for-purpose SLAs and data freshness targets. 

    • Develop and maintain curated datasets and self-service analytics in Sigma Computing (workbooks, datasets, governance and performance), and support legacy reporting where needed (e.g., SSRS).

    • Collaborate with engineering, analytics, and product teams to deliver data solutions that meet business requirements. 

    • Automate deployments using Git-based workflows and CI/CD (e.g., Azure DevOps), including database migration/versioning (Flyway). 

    • Use Claude Code (AI-assisted development) to accelerate data pipeline delivery (design, implementation, refactoring, documentation, and troubleshooting) while adhering to security, quality, and SDLC standards. 

    • Participate in Agile ceremonies and contribute to continuous improvement of data engineering processes and standards. 

    • Establish data quality, testing, and observability (e.g., unit/integration tests for pipelines, data validation, lineage, alerting, SLAs) to ensure reliable delivery. 

    • Partner with engineering, analytics, and product teams to define and deliver data products (source-to-target mappings, contracts, SLAs), enabling trustworthy analytics and operational use cases. 

    • Ensure data security, governance, and compliance across platforms (PII handling, encryption, auditing, retention), including Snowflake RBAC, secure data sharing, and access controls. 

    • Troubleshoot and resolve performance, reliability, and scalability issues across data platforms; instrument pipelines with logging/metrics and on-call friendly runbooks. 

Qualifications

    • Strong understanding of modern data engineering practices and tools (cloud data platforms, orchestration, testing/observability, DataOps, and AI-assisted development with Claude Code). 

    • Strong English reading and writing communication skills, with an ability to express and understand complex technical concepts. If other languages are required, that will be explicitly noted during the recruitment process. 

    • Strong analytical, problem-solving, and conceptual skills. 

    • Hands-on experience with Snowflake and integrating it into production data pipelines. 

    • Experience enabling governed self-service analytics with Sigma Computing (datasets, workbooks, access controls, and performance best practices). 

    • Experience with streaming/event platforms such as Kafka or Azure Event Hubs, including schema/versioning considerations and operational support. 

    • Proficiency with Python for data engineering automation and/or building pipeline components; experience with orchestration (Airflow and/or Azure Data Factory) is strongly preferred. 

    • Experience using Claude Code to develop, test, and iterate on data pipeline solutions (e.g., generating boilerplate, improving SQL/Python, and speeding up root-cause analysis) with appropriate human review. 

    • Ability to work in teams and strong interpersonal skills. 

    • Ability to work under pressure and meet tight deadlines.  

    • Ability to anticipate potential problems and determine and implement solutions. 

Education/Experience

    • Relevant training in the principles and techniques of database development and modeling; familiarity with systems concepts, design, and standards; and the ability to provide expertise in software usage, functionality, performance, security, aesthetics, resilience, reuse, comprehensibility, and economic and technological tradeoffs. 

    • Bachelor’s degree in Computer Science, Software Engineering, MIS, or related discipline; or equivalent practical experience. 

    • 7+ years of experience in data engineering and/or database engineering, including building and operating production data pipelines. 

    • Experience working within an Agile Scrum or SAFe (Scaled Agile) software development environment. 

    • Strong written and verbal interpersonal communication skills in English. 

     

Technical Expertise

    • Expert in designing, optimizing, and scaling relational and cloud-native data platforms (SQL Server, Snowflake, Cosmos DB, Redis). 

    • Experience with streaming architectures and tooling (Kafka/Event Hubs/Kinesis), including delivery semantics, late/out-of-order events, and operational monitoring. 

    • Strong proficiency in Python and SQL for building pipeline components, automation, and data transformations; familiarity with modern ELT patterns and reusable frameworks. 

    • Experience with orchestration and DataOps, Git-based workflows, and CI/CD. 

    • Data quality and observability experience 

    • Analytics enablement experience with Sigma Computing, including modeling curated datasets for performance and supporting governed self-service. 

    • Comfortable using Claude Code for AI-assisted development to improve engineering velocity and consistency across SQL/Python codebases, tests, and documentation. 

    • Proven track record of architecting multi-terabyte, high-performance data solutions 

    • Advanced proficiency in T-SQL, query optimization, indexing strategies, and database security 

    • Experience implementing DataOps, CI/CD, and automated testing for database deployments 

    • Five or more years of experience developing the full range of objects in Microsoft SQL Server 2019 (or later) databases using SQL and T-SQL. 

    • Five or more years of experience creating and maintaining ETL packages with SQL Server Integration Services (SSIS); at least three years of experience using SSIS with SQL Server 2019 (or later) preferred. 

    • Five or more years collaborating with team members to resolve SQL, T-SQL, SSIS, and SSRS performance tuning problems. 

    • Working knowledge of deployments utilizing Database Projects, ISPACs, and DACPACs preferred. 

    • Basic applied (non-academic) C# programming experience a plus. 

    • Familiarity or experience with Azure SQL and Azure Data Factory is a plus. 

    • Experience with version control, preferably Git. 

    • Experience with work management tools such as Jira or Azure DevOps 

    • Proficient in the SDLC and able to follow a process through its full lifecycle. 

Attributes

    • Strong collaborative approach to working with team members inside and outside of the database development group to develop solutions and solve problems. 

    • Professional, positive, and self-motivated approach.  

    • Strong relationship-building, written and verbal communication skills at all levels.  

    • Ability to work independently and as part of a team, managing multiple tasks and priorities.  

    • Commitment to continuous learning and adapting to new technologies and best practices. 

    • Exceptional communication skills, able to articulate complex concepts to technical and non-technical audiences 

    • Commitment to continuous learning and driving organizational excellence 

As the fastest-growing commerce company in the industry, we offer the opportunity for tremendous upward mobility within the company, as well as development and professional growth opportunities. FreedomPay's full-time roles provide exceptional benefits, including medical, prescription, dental, and vision coverage; life insurance; retirement plans with company match; a commission sharing plan; a flexible hybrid working environment; and great parental and other leave programs. All positions must be able to successfully pass a background check as well as a credit check.

FreedomPay is an Equal Opportunity Employer, including Disability/Veterans. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or status as a protected veteran.

Top Skills

Airflow
Azure Data Factory
Azure Event Hubs
Claude Code
CosmosDB
Git
Kafka
Microsoft SQL Server 2019
Microsoft SQL Server 2022
Python
Redis
Snowflake
SSIS
T-SQL

The Company
Philadelphia, PA
259 Employees
Year Founded: 2000

What We Do

The FreedomPay Commerce Platform is the best way for merchants to simplify complex payment environments. Validated by the PCI Security Standards Council for Point-to-Point Encryption (P2PE) along with EMV, Tokenization, Contactless and DCC capabilities, global leaders in retail, hospitality, gaming, education, healthcare and financial services trust FreedomPay to deliver unmatched security and advanced value added services.

