Data Engineer

Sorry, this job was removed at 12:20 a.m. (CST) on Friday, Jun 27, 2025
Remote (Hiring in United States)
$136K–$160K Annually
Blockchain • Fintech • Internet of Things • Cryptocurrency • Web3
Paxos is building the open financial system.
The Role

About Paxos 

Today’s financial infrastructure is archaic, expensive, inefficient and risky — supporting a system that leaves out more people than it lets in. So we’re rebuilding it.

We’re on a mission to open the world’s financial system to everyone by enabling the instant movement of any asset, any time, in a trustworthy way. For over a decade, we’ve built blockchain infrastructure that tokenizes, custodies, trades and settles assets for the world’s leading financial institutions, like PayPal, Venmo, Mastercard and Interactive Brokers. 

About the team 

The Data Engineering team at Paxos builds and maintains the data infrastructure that powers our engineering partners and analytics teams. Our mission is to provide scalable, reliable, and secure data access to support critical business needs, including product analytics, business intelligence, client and regulatory reporting, and operational efficiencies. By continuously enhancing our data platform, we drive innovation, optimize costs, and unlock new revenue opportunities.

You’ll be working closely with talented colleagues such as Joe, Chris and Mike, collaborating across engineering, security, and analytics teams.

About the role

We are looking for a Data Engineer to help scale and optimize our data platform. You will play a key role in ensuring efficient data management, enforcing governance policies, and driving automation across our cloud infrastructure. This role is ideal for someone passionate about data architecture, security, and performance optimization in data warehouses (Snowflake/Redshift/BigQuery) and/or cloud infrastructure (AWS/GCP/Azure).

Your responsibilities will include managing and scaling our Snowflake environment, implementing role-based access controls (RBAC), and leveraging AWS services and Infrastructure as Code (IaC) to streamline operations. You’ll work with a cutting-edge tech stack, including:

  • AWS: EKS/Kubernetes, MSK/Kafka, RDS/Postgres, Lambda, Glue, S3
  • Snowflake: RBAC, Data Governance & Security, Account Management, Warehouse Optimization
  • Infrastructure as Code: Terraform/Pulumi
  • Data Tooling: DBT, Airbyte/Fivetran, Debezium, Dagster/Airflow, Acryl/DataHub, Monte Carlo, Looker

This is a pivotal moment for our team, and your contributions will directly shape the future of our data ecosystem.

What you’ll do 

  • Shape the future of our data infrastructure by designing, scaling, and optimizing our data architecture and cloud infrastructure.
  • Collaborate across teams to define and enforce data governance, security policies, and access controls.
  • Develop and maintain scalable data pipelines and ELT frameworks using AWS-native services, DBT and Snowflake.
  • Oversee Snowflake account management, ensuring resource optimization, RBAC implementation, and compliance with best practices.
  • Automate and streamline data ingestion, monitoring, and access management across AWS and Snowflake.
  • Implement Infrastructure as Code (IaC) using Terraform to efficiently manage AWS and Snowflake resources.
  • Ensure data quality and reliability by designing and implementing validation frameworks and automated testing.
  • Advocate for modern data tooling and architectures, driving efficiency and scalability.
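To give a flavor of the validation-framework work described above, here is a minimal, self-contained sketch of two common data-quality checks (in practice a team like this would likely lean on dbt tests or Monte Carlo; the rows and column names below are hypothetical):

```python
# Minimal sketch of pipeline data-quality checks on rows represented as dicts,
# e.g. the output of an extract step. Column names are hypothetical.

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

rows = [
    {"id": 1, "amount": 10.0},
    {"id": 2, "amount": None},
    {"id": 2, "amount": 5.0},
]
assert check_not_null(rows, "amount") == [1]  # row 1 has a null amount
assert check_unique(rows, "id") == [2]        # id 2 is duplicated
```

Frameworks like dbt express the same checks declaratively (`not_null` and `unique` tests in a model's YAML), which is usually the maintainable choice at scale.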

About you 

You have 3+ years of experience in data engineering, with a background in:

  • Cloud Infrastructure & Security – Familiarity with AWS services, including EKS/Kubernetes, MSK/Kafka, RDS/Postgres, Lambda, Glue, and S3, along with security best practices (IAM, KMS, ASM).
  • Infrastructure as Code (IaC) – Proficient in Terraform to manage AWS and Snowflake resources.
  • Cloud Data Warehousing – Hands-on experience with Snowflake, Redshift, BigQuery, or Azure Synapse, including account management, RBAC, data governance & security, and warehouse optimization.
  • SQL & Performance Optimization – Strong skills in query optimization and performance tuning within cloud-based data warehouses.
  • Programming & Automation – Proficiency in Python (or similar) for automation, data pipeline development, and workflow orchestration.
  • Workflow Orchestration – Experience with Dagster, Airflow, AWS Step Functions, or Azure Data Factory to manage data workflows.
  • ETL/ELT Design & Maintenance – Experience in building and maintaining scalable, reliable data pipelines.
  • Data Governance & Compliance – Understanding of security, governance, and compliance best practices for handling sensitive data.
  • Modern Data Tooling – Experience with tools like DBT, Airbyte, Debezium, Dagster, Acryl, Monte Carlo, and Looker to enhance data reliability and observability.

This is a unique opportunity to influence the future of our data infrastructure, work with cutting-edge cloud technologies, and drive meaningful impact across Paxos.

Important Notice for Paxos Applicants

We’ve become aware of fraudulent accounts posting as Paxos recruiters on LinkedIn and other platforms. These scammers attempt to deceive applicants into paying for job opportunities or providing personal financial information. 

To verify a legitimate Paxos recruiter: 

  • We only use @paxos.com email addresses
  • We never ask for payment or financial details to apply, interview, or work here
  • For technical roles, we do not perform a coding interview without prior screening by our engineering team 

Thanks for your interest in Paxos! 

Pay and benefits

Paxos offers a competitive total compensation and benefits package, including equity and bonuses based on both your individual performance and company performance. Eligibility for bonuses is dependent on job level, and actual salary within the range depends on your skills, experience, and qualifications.

Expected range for the base salary component for candidates located within the United States is:
$136,191–$160,225 USD


The Company
HQ: New York, NY
350 Employees
Year Founded: 2012


Why Work With Us

As a company, we want to have a positive impact on the world and each other. We obsess over the right answers to the right questions. We are relentlessly self-improving. We communicate directly and honestly with each other. And we drive results, even outside our roles.
