Principal Data Platform Engineer

Posted 17 Days Ago
Sydney, New South Wales
In-Office
Senior level
Artificial Intelligence • Machine Learning • Software
Who We Are

Simple Machines is a global, independent technology consultancy operating across Sydney, New Zealand, London, Poland, and San Francisco. We design and build modern data platforms, intelligent systems, and bespoke software at the intersection of Data Engineering, Software Engineering, and AI.

We work with enterprises, scale-ups, and government to turn messy, high-value data into products, platforms, and decisions that actually move the needle.

We don’t do generic. We build things that matter - We engineer data to life™.


The Role

This is a hands-on principal engineering role, not an architecture-only seat and not a support function. You’ll be responsible for technical direction, platform design and architectural decision-making.

You'll design and build greenfield data platforms, real-time pipelines, and data products for clients who are serious about using data properly. You’ll work in small, high-calibre teams and operate close to both the problem and the client.

If you enjoy solving hard data problems, shaping modern architectures (data mesh, data products, contracts), and delivering real outcomes — this is your lane.

What You’ll Be Doing

Lead Platform & Architecture Design

  • Own the end-to-end architecture of modern, cloud-native data platforms
  • Design scalable data ecosystems using data mesh, data products, and data contracts
  • Make high-impact architectural decisions across ingestion, storage, processing, and access layers
  • Ensure platforms are secure, compliant, and production-grade by design

Build Modern Data Platforms

  • Design and deliver cloud-native data platforms using Databricks, Snowflake, AWS, and GCP
  • Apply modern architectural patterns: data mesh, data products, and data contracts
  • Integrate deeply with client systems to enable scalable, consumer-oriented data access

Develop High-Performance Pipelines

  • Build and optimise batch and real-time pipelines
  • Work with streaming and event-driven tech such as Kafka, Flink, Kinesis, Pub/Sub
  • Orchestrate workflows using Airflow, Dataflow, Glue

Work at Scale

  • Process and transform large datasets using Spark and Flink
  • Design systems that perform in production - not just on paper

Own Data Storage & Performance

  • Work across relational, NoSQL, and analytical stores (Postgres, BigQuery, Snowflake, Cassandra, MongoDB)
  • Optimise storage formats and access patterns (Parquet, Delta, ORC, Avro)

Cloud, Security & Governance

  • Implement secure, compliant data solutions with security by design
  • Embed governance without killing developer velocity

Consult and Influence

  • Work directly with clients to understand problems and shape solutions
  • Translate business needs into pragmatic engineering decisions
  • Act as a trusted technical advisor, not just an order taker

Technical Leadership & Quality

  • Set engineering standards, patterns, and best practices across teams
  • Review designs and code, providing clear technical direction and mentorship
  • Raise the bar on data quality, testing, observability, and operational excellence

What We’re Looking For

Core Engineering Strength

  • Strong Python and SQL
  • Deep experience with Spark and modern data platforms (Databricks / Snowflake)
  • Solid grasp of cloud data services (AWS or GCP)

Architecture & Design Judgement

  • Demonstrated ownership of large-scale data platform architectures
  • Strong data modelling skills and architectural decision-making ability
  • Comfortable balancing trade-offs between performance, cost, and complexity

Data Platform Experience

  • Built and operated large-scale data pipelines in production
  • Comfortable with multiple storage technologies and formats

Engineering Discipline

  • Infrastructure-as-code experience (Terraform, Pulumi)
  • CI/CD pipelines using tools like GitHub Actions and Argo CD
  • Data testing and quality frameworks (dbt, Great Expectations, Soda)

Delivery & Consulting Mindset

  • Experience in consulting or professional services environments
  • Strong consulting instincts — able to challenge assumptions and guide clients toward better outcomes
  • Comfortable mentoring senior engineers and influencing technical culture

Why Simple Machines
  • You’ll work on interesting, high-impact problems
  • You’ll build modern platforms, not maintain legacy mess
  • You’ll be surrounded by senior engineers who actually know their craft
  • You’ll have autonomy, influence, and room to grow

If you’re a senior data engineer who wants to build properly, think clearly, and deliver real outcomes - we should talk.

Top Skills

Airflow
Argo CD
Avro
AWS
BigQuery
Cassandra
Databricks
Dataflow
dbt
Delta
Flink
GCP
GitHub Actions
Glue
Great Expectations
Kafka
Kinesis
MongoDB
ORC
Parquet
Postgres
Pub/Sub
Pulumi
Python
Snowflake
Soda
Spark
SQL
Terraform

The Company
HQ: Sydney, NSW
72 Employees
Year Founded: 2008

What We Do

Simple Machines is a global team of creative engineers and expert technologists. We partner with organisations to unleash their data’s potential in new and impactful ways. We design and build data platforms and unique software products. We create and deploy intelligent systems. We engineer data to life.

Our heritage is architecting and engineering highly performant, distributed, data-driven platforms and applications that perform at massive scale. We partner with enterprises, governments, and global technology companies to put their data to work in the real world.

Simple Machines partners with leading technology providers including GCP, AWS, Azure, Databricks, Snowflake, Confluent, and Immuta.

Sydney | London | Christchurch | San Francisco
