Position: Data Engineer (Europe-based / Remote)
Location: Europe (fully remote)
Rate: Up to US$ 80/hr (contractor)
Our client partners with high-growth startups to build operational systems that transform how teams work. They focus on tooling, workflows and data infrastructure, treating operations as a product and designing systems that move data, automate work and scale reliably.
For this role, you’ll be part of the data & engineering effort that underpins those operational systems, contributing to ETL pipelines, data orchestration and TypeScript development.
As a Data Engineer you will:
Design, build and maintain ETL/ELT pipelines to collect, transform and load data from multiple sources into a central data layer (data warehouse, data-lake or equivalent)
Work end-to-end: from data ingestion and schema design, through transformation/cleaning, to enabling downstream analytics, dashboards and workflows
Write robust, well-tested code in TypeScript (and possibly other languages/tools) for data & pipeline orchestration
Partner with cross-functional stakeholders (ops, analytics, engineering) to understand data needs, define requirements and ensure data quality, reliability and availability
Monitor data pipelines, set up alerting, ensure data accuracy, performance and scalability as the systems grow
Collaborate on building the infrastructure that makes data a source of truth, enabling the broader internal tooling and operational systems the company delivers
Contribute to design, architecture and best practices around data workflows, including performance, versioning, observability, documentation and automation
Operate in a remote, high-autonomy context, often with startup clients and varied tooling environments
What we’re looking for:
Proven experience building ETL/ELT pipelines in a production environment (designing ingestion + transformation + loading + monitoring)
Strong proficiency in TypeScript (experience writing backend/data-oriented TS code)
Solid understanding of data engineering concepts: data modelling, schema design, warehousing, data quality, scheduling, monitoring, error handling
Familiarity with modern data tooling/stacks (for example: cloud data warehouses such as Snowflake, BigQuery, Redshift; orchestration tools; ETL frameworks; job scheduling; APIs/webhooks)
Excellent problem solving — comfortable working with messy data, identifying bottlenecks, designing scalable, maintainable solutions
Comfortable working remotely, collaborating asynchronously and autonomously across time zones
Based in Europe (remote role, but you must reside in Europe and work hours compatible with European time zones)
Experience working in a startup or dynamic environment where rapid iteration and high autonomy are expected
Familiarity with internal-tooling or operations automation stacks (e.g., connecting CRMs, Slack, billing systems, data warehouses)
Experience with CI/CD for TypeScript projects, plus strong engineering discipline around tests, code review and observability
Experience with data visualisation, dashboards, or analytics (so data flows serve downstream users)
Experience with orchestration tools (e.g., Airflow, dbt, Prefect, Dagster) or cloud job frameworks
Experience integrating disparate data sources, APIs/webhooks, real-time or near-real-time data movement
What’s on offer:
Flexible remote work from Europe — you choose your location and working hours (within reason for team overlap)
Contract engagement at up to US$ 80/hr
Opportunity to work across high-impact projects with fast-growing startups and help build systems that scale
High autonomy and opportunity for ownership: you’ll be shaping how the data layer supports both internal tooling and external client operations
Exposure to modern tooling, workflows and operational systems built at scale
Please send:
A brief cover note explaining your relevant experience (particularly in building ETL pipelines + TypeScript)
Your CV/resume with role titles, dates, brief summary of projects
One or two examples (or descriptions) of data pipeline work you have done (ingestion, transformations, monitoring), ideally something you’re proud of
Confirmation that you are based in Europe and available for remote contractor work
Your hourly rate expectation (max is US$ 80/hr) and earliest availability
Our client values building inclusive teams across geographies. They believe in craftsmanship, clarity, autonomy and delivering high value.
If you’re passionate about data, engineering and driving operational excellence, they’d love to hear from you.
What We Do
G2i is a hiring community connecting remote developers with world-class engineering teams. Our unique approach combines rigorous technical assessments with a solid commitment to developer health, ensuring companies get skilled developers who are supported, valued, and ready to execute from day one.
Our transparent vetting process includes in-depth, performance-ranked developer profiles, recorded technical interviews, and soft-skills assessments. Whether you're working on a short-term project or burning down a backlog, G2i connects you with a community of pre-vetted developers.
Planning to hire ten or more engineers? We create a Custom Talent Pipeline, allowing for specific customizations in sourcing, assessment criteria, technical interview questions, and integration with your existing HR systems and processes.
G2i partners with clients who support the developer health mission—matching developers with environments that improve their health, support recovery from burnout, and enable professional growth through restful work.
Is your team overworked or understaffed? Contact us today to learn how G2i can help you.
More information about our mission and commitment to developers and clients can be found at https://g2i.co, or follow us on X @g2i_co.