Software Engineer - Data Platform - AlgoQuant Asset Management

Location: Remote, Global
Team: Technology
Reporting to: Head of Data and Machine Learning

About AlgoQuant

At AlgoQuant, we’re building the future of digital asset management — grounded in rigorous research, world-class technology, and a relentless focus on performance. We began as a proprietary trading firm, developing sophisticated algorithmic strategies and operating in some of the most complex and fast-moving markets. That DNA remains at our core, but today we are evolving into a globally distributed investment management business.

Our quantitative environment empowers innovation by combining vast data capabilities, disciplined model development, and highly automated execution. Risk management is embedded in every layer of our systems and decision-making. Technology isn’t just an enabler for us; it’s a core competency and a strategic edge.

Role Overview

We are seeking an experienced Software Engineer to design, build, and maintain the next-generation data infrastructure that powers AlgoQuant’s quantitative research. This role is foundational to the creation of our internal Data Lake, which will enable a world-class research environment for our quantitative researchers.

You will work closely with our Head of Data and Machine Learning to architect a robust, scalable data ecosystem that integrates seamlessly with our research and trading workflows. This is a hands-on engineering role that combines deep technical expertise with collaboration across research, execution, and technology teams.

Key Responsibilities
  • Market Data Infrastructure
    • Design, build, and maintain real-time and batch data processing pipelines for market, alternative, and on-chain data sources.
    • Ensure high availability and low latency across critical ingestion and transformation processes.
  • Data Lake and Processing Platform
    • Develop and evolve our internal Data Lake and the surrounding data processing ecosystem (see the ingestion sketch after this list).
    • Implement modern lakehouse technologies to enable scalable, queryable, and versioned data storage.
  • Data Quality & Monitoring
    • Build validation, monitoring, and alerting systems to guarantee the accuracy, consistency, and completeness of data.
    • Establish robust data quality frameworks and observability tooling across the pipeline stack.
  • Core Libraries & Frameworks
    • Develop and maintain internal Python and C++ libraries for feature calculation, data processing, backtesting, and ML inference (a small illustrative example also follows this list).
    • Promote code reuse, performance optimization, and reproducibility across teams.
  • Collaboration & Stakeholder Support
    • Work closely with quant researchers, traders, and the execution team to understand data requirements and support their workflows.
    • Translate research and trading needs into reliable, production-grade data infrastructure.
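
For illustration, the ingestion and lakehouse work described above could look roughly like the sketch below: a batch job that reads raw trade files, applies simple quality gates, and appends the result to an Apache Iceberg table with PySpark. Every name here (catalog, bucket, table, columns, threshold) is a hypothetical placeholder rather than AlgoQuant’s actual stack.

```python
"""Minimal sketch: batch-ingest raw trade files into an Iceberg table.

All names below are hypothetical; a production pipeline would add
partitioning, schema evolution, idempotent re-runs, and real alerting.
"""
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("trades-daily-ingest")
    # Assumes an Iceberg catalog named "lake" is available to the cluster.
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "hadoop")
    .config("spark.sql.catalog.lake.warehouse", "s3://example-bucket/warehouse")
    .getOrCreate()
)

# Read one day of raw trade records (hypothetical path and layout).
raw = spark.read.parquet("s3://example-bucket/raw/trades/date=2024-01-01/")
n_raw = raw.count()

# Basic quality gates: de-duplicate and reject non-positive prices/sizes.
clean = (
    raw.dropDuplicates(["exchange", "symbol", "trade_id"])
       .filter((F.col("price") > 0) & (F.col("size") > 0))
)

# Fail loudly if more than 1% of rows were filtered out (monitoring hook).
dropped = n_raw - clean.count()
if n_raw > 0 and dropped > 0.01 * n_raw:
    raise ValueError(f"Quality gate failed: {dropped}/{n_raw} rows rejected")

# Append into a versioned, queryable Iceberg table (DataFrameWriterV2 API).
clean.writeTo("lake.market_data.trades").append()
```

The same shape extends naturally to streaming variants (for example a Kafka source with Structured Streaming) and to alternative or on-chain data feeds.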
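
In the same spirit, the shared libraries mentioned under Core Libraries & Frameworks might expose small, reusable building blocks like the purely illustrative rolling-VWAP helper below; in practice the hot paths would often be pushed down into C++ or vectorised kernels.

```python
"""Illustrative shape of a shared feature-calculation helper (hypothetical names)."""
import pandas as pd


def rolling_vwap(trades: pd.DataFrame, window: str = "1min") -> pd.Series:
    """Volume-weighted average price over a rolling time window.

    Expects `trades` to be indexed by timestamp and to carry
    'price' and 'size' columns.
    """
    notional = (trades["price"] * trades["size"]).rolling(window).sum()
    volume = trades["size"].rolling(window).sum()
    return (notional / volume).rename("vwap")
```
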
Requirements
  • Experience: 10+ years in software engineering or data infrastructure development.
  • Languages: Expert-level Python and C++.
  • Distributed Data Systems: Proven experience with Spark, Flink, Slurm, Dask, or similar frameworks.
  • Data Lakehouse Technologies: Hands-on with Apache Iceberg, Delta Lake, or equivalent systems.
  • Messaging & Streaming: Strong experience with Kafka or similar streaming platforms.
  • Infrastructure: Proficient with Linux, Kubernetes, Docker, and workflow orchestrators such as Airflow (a minimal DAG sketch follows this list).
  • Machine Learning Exposure: Familiarity with PyTorch, TensorFlow, or model inference frameworks.
  • Cloud Platforms: Experience deploying and maintaining systems on AWS, GCP, or Azure.
  • AI Engineering Tools: Experience using Claude Code, GitHub Copilot, Codex, or similar AI-assisted coding tools.
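
To make the orchestration requirement above concrete, a daily ingestion job could be scheduled with an Airflow DAG along these lines; the DAG id, schedule, and ingest callable are hypothetical, and the sketch assumes Airflow 2.4 or later.

```python
"""Minimal sketch of a daily Airflow DAG; all names are placeholders."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_trades(ds: str, **_) -> None:
    # Placeholder: in practice this would kick off the Spark/Iceberg job
    # sketched earlier for the execution date `ds`.
    print(f"Ingesting trades for {ds}")


with DAG(
    dag_id="market_data_daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # `schedule` requires Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="ingest_trades", python_callable=ingest_trades)
```
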
Nice to Have
  • Experience working with financial market data, cryptocurrency exchange APIs, on-chain data, or alternative data sources.
  • Familiarity with quantitative research environments or systematic trading systems.
  • Contributions to open-source data or ML infrastructure projects.
What Success Looks Like
  • The Data Lake and surrounding platform become the trusted foundation of AlgoQuant’s research environment.
  • Data pipelines and libraries are robust, scalable, and observable, enabling fast iteration for researchers.
  • Researchers and traders have reliable, self-service access to clean, validated data.
  • Collaboration between research, trading, and infrastructure teams results in accelerated innovation and strategy deployment.
Why Join AlgoQuant
  • Help build the data backbone of a cutting-edge digital asset investment platform.
  • Collaborate directly with world-class engineers, researchers, and quants.
  • Be part of a fully remote, high-performance culture that values innovation, autonomy, and continuous learning.
  • Shape the future of data infrastructure in a company where technology drives alpha.
