What you'll do:
- Undertake technical discovery with stakeholders and apply the business context when designing and implementing solutions for data pipelines, models, and applications.
- Develop robust and scalable data pipelines and models hands-on, using Python, SQL, and our evolving data stack (e.g., DBT, Snowflake, and MLOps tooling).
- Model data for optimal consumption by different workloads, e.g., analytics, BI, and ML modelling.
- Contribute to our library of analytical assets, including our feature store, enabling self-service analytics and reducing time spent on data wrangling.
- Collaborate with Data Scientists and Engineers to build and maintain end-to-end machine learning pipelines for training, inference, and monitoring at scale.
- Develop and maintain infrastructure, tooling, and monitoring for data applications and reproducible data science workflows.
- Test and deploy data assets and ML pipeline components to production environments, adhering to software engineering best practices (e.g., version control, CI/CD, testing).
- Identify and expose technical debt and related issues for resolution, contributing to the overall health and maintainability of our data ecosystem.
- Stay current with emerging practices, techniques, and frameworks in applied machine learning, data engineering, and analytics engineering.
What you'll bring:
- A minimum of 2 years' experience in data engineering, analytics engineering, ML engineering, or a software engineering role with a data focus.
- Strong proficiency in Python for data processing, pipeline development, and scripting. Experience with ML libraries (e.g., scikit-learn, pandas, NumPy) is highly desirable.
- Proven experience in developing, deploying, and maintaining data pipelines and data models in production environments.
- Familiarity with software engineering best practices (e.g., Git, CI/CD, testing, code reviews).
- Solid experience with SQL for data querying, transformation, and optimization.
- Experience or a strong interest in machine learning concepts and MLOps (e.g., model deployment, monitoring, versioning) is a significant plus.
- Experience with cloud data warehouses like Snowflake is highly desirable.
- Experience with data pipeline orchestration tools like Airflow, Dagster, or Prefect, and data modelling tools like DBT, is desirable.
What We Do
Xero is small business accounting software that provides a platform on which businesses can build a fully integrated solution. It’s designed to make life better for people in small business, their advisors, and communities around the world. Xero minimises tedious admin by automating routine tasks, delivers valuable insights when needed, and brings together business data, trusted advisors, and powerful apps in one intuitive platform. By alleviating pain points, Xero empowers small business owners to supercharge their business, simplifying the complex and freeing up time from manual admin so they can focus on what really matters to build the business they’ve always envisaged.
Why Work With Us
Xero is a human-centric organisation where you'll have a tangible impact on the success of small businesses and their communities globally. Our team of energised forward-thinkers works to make life better for our customers and each other every day. We're also committed to supporting you with a flexible environment.
Hybrid Workspace
Employees engage in a combination of remote and on-site work.
Join us from home or at one of our beautiful workspaces. Xero has offices in Australia, New Zealand, the United Kingdom, the United States, Canada, Singapore, and South Africa.