Responsibilities
- Design and optimize robust, scalable data infrastructure and real-time stream processing systems to support historical and live pipelines using tools like Python, Airflow, Go, and Apache Beam.
- Develop and maintain observability and remediation tools to monitor and analyze trading performance and risk, ensuring reliability and transparency in operations.
- Lead efforts to integrate new financial assets and markets, clarifying requirements and ensuring the new integrations function seamlessly within existing systems.
- Enhance the resilience, scalability, and performance of accounting and reporting systems to meet evolving business needs.
- Build advanced tooling to unify data from diverse vendors, standardizing symbol mappings to ensure consistency and accuracy across systems.
- Lead complex, company-wide projects by collaborating cross-functionally with research, legal, trading, finance operations, data, and infrastructure teams to deliver comprehensive end-to-end accounting and reporting systems.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources (a minimal sketch of this kind of pipeline follows this list).
- Guide and support the growth of other engineers on the team by mentoring them and sharing your expertise and best practices.
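For illustration only (this is not Voleon's code), the sketch below shows the flavor of the pipeline work described above: a minimal Airflow DAG, written with the TaskFlow API, that pulls a hypothetical vendor file, standardizes symbols, and hands the result to a loading step. The vendor name, file paths, column names, and normalization rule are all assumptions.

```python
# A minimal, illustrative ETL sketch (assumptions only): extract a vendor file,
# normalize symbols to one internal convention, and load the result downstream.
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def vendor_prices_etl():
    @task
    def extract() -> str:
        # In practice this would hit a vendor API or object store;
        # here we just point at a hypothetical local drop location.
        return "/data/vendor_a/prices.csv"

    @task
    def transform(path: str) -> str:
        df = pd.read_csv(path)
        # Standardize vendor-specific tickers, e.g. "BRK/B" and "BRK.B"
        # both map to "BRK-B".
        df["symbol"] = df["symbol"].str.upper().str.replace(r"[./]", "-", regex=True)
        out = "/data/normalized/prices.parquet"
        df.to_parquet(out, index=False)
        return out

    @task
    def load(path: str) -> None:
        # The loading target (warehouse table, feature store, etc.) is left abstract.
        print(f"would load {path} into the downstream store")

    load(transform(extract()))


vendor_prices_etl()
```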
Requirements
- Bachelor’s degree in Computer Science or equivalent professional experience in a related technical field.
- 7+ years of software engineering experience designing and building high-performance, reliable systems.
- Proven expertise in operating and scaling large-scale, mission-critical production systems, with proficiency in programming languages such as Python.
- Strong communication and project management skills, particularly in navigating complex technical domains and cross-functional collaboration.
- Demonstrated ability to mentor engineers and provide leadership in driving technical direction and system architecture.
Preferred Qualifications
- Expertise in building and optimizing data pipelines (e.g., Apache Airflow, Spark, Kafka).
- Experience with profiling and performance optimizations on distributed systems.
- Familiarity with modern Python data science tooling (pandas, polars, dask, duckdb, etc.); a short example of this tooling follows this list.
- Experience with modern data engineering technologies.
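As a small, purely illustrative example of the Python data tooling named above (not a Voleon system), the snippet below runs the same daily-volume aggregation with DuckDB and with Polars over a hypothetical normalized Parquet file; the path and the symbol, ts, and volume columns are assumptions.

```python
# Illustrative use of DuckDB and Polars on a hypothetical normalized prices table.
import duckdb
import polars as pl

# DuckDB: SQL directly over a Parquet file, no server required.
daily_volume = duckdb.sql(
    """
    SELECT symbol, date_trunc('day', ts) AS day, sum(volume) AS volume
    FROM read_parquet('/data/normalized/prices.parquet')
    GROUP BY symbol, day
    ORDER BY symbol, day
    """
).df()

# Polars: the same aggregation expressed as a lazy query plan.
daily_volume_pl = (
    pl.scan_parquet("/data/normalized/prices.parquet")
    .group_by(["symbol", pl.col("ts").dt.truncate("1d").alias("day")])
    .agg(pl.col("volume").sum())
    .sort(["symbol", "day"])
    .collect()
)
```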
What We Do
Founded in 2007 by two machine learning scientists, The Voleon Group is a quantitative hedge fund headquartered in Berkeley, CA. We are committed to solving large-scale financial prediction problems with statistical machine learning.
The Voleon Group combines an academic research culture with an emphasis on scalable architectures to deliver technology at the forefront of investment management. Many of our employees hold doctorates in statistics, computer science, and mathematics, among other quantitative disciplines.
Voleon's CEO holds a Ph.D. in Computer Science from Stanford and previously founded and led a successful technology startup. Our Chief Investment Officer and Head of Research is a member of the Statistics faculty at UC Berkeley, where he earned his Ph.D. Voleon prides itself on cultivating an office environment that fosters creativity, collaboration, and open thinking. We are committed to excellence in all aspects of our research and operations, while maintaining a culture of intellectual curiosity and flexibility.
The Voleon Group is an Equal Opportunity employer. Applicants are considered without regard to race, color, religion, creed, national origin, age, sex, gender, marital status, sexual orientation and identity, genetic information, veteran status, citizenship, or any other factors prohibited by local, state, or federal law.