Our stack: Python 3 (asyncio), Airflow, Docker, PostgreSQL/TimescaleDB, GitLab, Metabase, Streamlit, RabbitMQ, Kubernetes, AWS (S3, EC2) and we're also playing around with ClickHouse, Apache Iceberg, DuckDB, and Trino.
We're Quantlane, a proprietary trading firm built for traders. Our goal is to create a best-in-class ecosystem that helps traders unlock their full potential. A key part of that is our data platform — providing all kinds of data that traders (and their algos) rely on to make trading decisions.
👨‍💻 How we work
Our development team has 16 people. We work in an agile environment, develop iteratively, and take testing seriously. We have dedicated communities focused on different domains — Data, DevOps, Execution Platform, On-call/Support, and Trading Tooling, among others.
📊 Who we're looking for
This role is for our Data team, which currently has five Data Engineers and one Data Scientist. We tackle interesting data challenges here — our goal is to make sure data is available in the right quality, format, and place at the right time.
🔥 Want to help build a data platform that has a real impact on our traders' success? Come join us! 🚀
What your day-to-day will look like:
Building Python applications that process data (pandas, asyncio)
Contributing to the design and improvement of the data platform
Developing backend applications in Python (asyncio, RabbitMQ)
Deploying those applications to production (Airflow, Kubernetes)
Talking to traders and analysts (your internal customers) about the features they need
Taking turns with the rest of the team on platform support
What we offer:
Snacks, 5 weeks of vacation, 5 sick days, and a Multisport card
Brand-new, modern offices in Karlín
Hybrid setup — ideally 3 days on-site
Regular sessions with a company therapist / coach
English lessons with a native speaker
Internal training and contributions toward technical conferences
A bit about you:
You have experience working with data lake, lakehouse, and streaming-data concepts
You've built applications that process data — both historical and real-time
You know software development best practices (Git, testing)
Ideally, you also:
Have hands-on experience with Apache Iceberg, Trino/AWS Athena, or ClickHouse
Are comfortable with data libraries like pandas/polars
Know your way around ETL workflow tools like Apache Airflow
Have solid SQL skills, ideally PostgreSQL
Know Python asyncio
Have an interest in capital markets
What We Do
Our mission is to provide the best ecosystem and platform for traders worldwide to reach their full potential in trading. We are bringing together diverse expertise in technology, research, and game theory to tackle unique opportunities in global capital markets.