Senior Data Engineer II - QuantumBlack
Who You'll Work With
You will be part of our global data engineering community. You will work in cross-functional Agile project teams alongside data scientists, machine learning engineers, other data engineers, project managers, and industry experts. You will work hand-in-hand with our clients, from data owners, users, and fellow engineers to C-level executives.
Who You Are
You are a highly collaborative individual who wants to solve problems that drive business value. You have a strong sense of ownership and enjoy hands-on technical work. Our values resonate with yours.
What You'll Do
As a Senior Data Engineer, you will:
- Guide a technical team and work with our clients, from data owners and users to C-level executives, to understand their needs and architect impactful analytics solutions
- Design and build data pipelines for machine learning models that are robust, modular, scalable, deployable, reproducible, and versioned
- Communicate progress and engineering complexity to stakeholders across all levels of seniority
- Own the technology stack: architect and build enterprise data and AI platforms with limited oversight, while ensuring information security standards are always maintained
- Help lead and contribute to R&D projects and internal asset development, and help grow the QuantumBlack Data Engineering community
- Mentor and apprentice junior colleagues, growing their coding skills through collaborative code review
Our tech stack:
While we advocate for using the right tech for the right task, we often leverage the following technologies: Python, PySpark, the PyData stack, SQL, Airflow, Databricks, Snowflake, dbt, Kafka, our own open-source data pipelining framework called Kedro, Dask/RAPIDS, container technologies such as Docker and Kubernetes, cloud solutions such as AWS, GCP, and Azure, and more!
What You'll Benefit From
Real-World Impact: No project is ever the same; we work with top-tier clients across multiple sectors, providing unique learning and development opportunities internationally.
Fusing Tech & Leadership: We work with the latest technologies and methodologies and offer first-class learning programs at all levels.
Multidisciplinary Teamwork: Our teams include data scientists, engineers, project managers, and UX and visual designers who work collaboratively to enhance performance.
Innovative Work Culture: Creativity, insight, and passion come from being balanced. We cultivate a modern work environment through an emphasis on wellness, insightful talks, and training sessions.
Striving for Diversity: With colleagues from over 40 nationalities, we recognize the benefits of working with people from all walks of life.
Continuous Development and Progression: We offer an extensive choice of training sessions tailored to your needs, ranging from workshops to international conferences, as well as a personal mentorship system. We have multiple career paths and geographic locations in which to evolve within the Firm.
Global Community: You'll learn from colleagues around the world by connecting both internally and externally through our various hosted meet-ups.
Visit our Careers site to watch our video and read about our interview processes and benefits.
Qualifications
- Degree in computer science, engineering, mathematics or equivalent work experience
- Hands-on experience architecting and building enterprise data and AI platforms, as well as scalable data pipelines for advanced analytics use cases operating in production
- Strong knowledge of programming paradigms and architectural concepts including object-oriented and functional programming, microservices, OLTP, OLAP, Lakehouse, Data Mesh
- Well-versed in SDLC, applying software engineering best practices to drive enterprise-wide improvement including DevOps, DataOps, and MLOps
- Ability to write clean, maintainable, scalable and robust code in a common language (e.g., Python, Scala, Java)
- Proven experience with distributed computing frameworks (e.g., Spark, Dask), cloud platforms (e.g., AWS, Azure, GCP), containerization, and analytics libraries (e.g., pandas, NumPy, Matplotlib); ability to troubleshoot distributed jobs, diagnose issues, and implement solutions
- Familiarity with orchestration frameworks and tools (e.g., Airflow, Luigi, Azure Data Factory, AWS Step Functions, Databricks Jobs, Oozie)
- Ability to scope projects and define technical workstreams with clear milestones and deliverables
- Experience translating roadmaps to sprint goals while working in an agile and iterative team set-up to adjust to changes in scope and objectives
- Extensive commercial client-facing or senior stakeholder management experience
- Willingness to travel