Middle Data Engineer

Hiring Remotely in Warsaw, Warszawa, Mazowieckie
In-Office or Remote
Mid level
Software
The Role
The Data Engineer will design and maintain data pipelines and infrastructure, support analytics and automation, and collaborate on ML/GenAI projects.

Dream Big. Go Beyond. Be Unstoppable.

About Us

Kyriba is a global fintech leader empowering CFOs and finance teams with cloud-based treasury, payments, and risk management solutions. We serve 3,000+ customers worldwide, managing $15 trillion in payments annually and helping businesses optimize liquidity performance across the enterprise.


We're on a mission to become the most sought-after cloud technology company globally. We think big, innovate relentlessly, and challenge the status quo every day. If you are a problem-solver who's ready to push boundaries and achieve more than you thought possible, you'll find an exceptional career within an extraordinary business.

Location: Warsaw
Type of contract: UoP (umowa o pracę)

We are seeking a versatile and innovative Data Engineer to design, build, and maintain scalable data pipelines and infrastructure that support analytics, reporting, Machine Learning (ML), Generative AI (GenAI), Business Intelligence (BI), and automation initiatives. The ideal candidate will have practical experience with Google Cloud, BigQuery, and modern data processing, with a keen interest in enabling advanced analytics and automation across the organization.

Key Responsibilities

Data Engineering

  • Design, implement, and optimize robust ELT/ETL pipelines using Google BigQuery, Cloud Storage, and GCP services (e.g., Dataflow, Pub/Sub, Cloud Composer) to support analytics, ML, BI, and automation use cases.

  • Build and maintain data architectures for structured and unstructured data, ensuring data quality, lineage, and security.

  • Integrate data from multiple sources, including external APIs and on-premise systems, to create a unified, well-modeled data environment.

  • Apply BigQuery best practices including partitioning, clustering, materialized views, and cost/performance optimization.

Machine Learning & GenAI

  • Collaborate with Data Scientists and ML Engineers to deliver datasets and features for model training, validation, and inference.

  • Develop and operationalize ML/GenAI pipelines, automating data preprocessing, feature engineering, model deployment, and monitoring using Vertex AI and/or BigQuery ML.

  • Support the deployment and maintenance of GenAI models and LLMs in production environments, including prompt/feature pipelines and inference orchestration.

  • Stay current on emerging ML and GenAI technologies and best practices across the GCP ecosystem.

Business Intelligence & Reporting

  • Partner with BI Developers and Analysts to provide clean, reliable, governed data sources for reporting and dashboarding in Looker (semantic modeling in LookML).

  • Enable data access and transformation for self-service BI; ensure BI solutions are scalable, secure, and performant.

  • Integrate advanced analytics and ML/GenAI outputs into BI datasets, Looks, and Explores for actionable insights.

Automation

  • Partner with Automation Specialists to design and implement data-driven automated workflows using MuleSoft and/or GCP services (e.g., Cloud Functions, Workflows, Cloud Run).

  • Develop and maintain automation scripts and integrations to streamline data flows, improve operational efficiency, and reduce manual effort.

Governance & Collaboration

  • Implement data governance, security, and compliance best practices across all data assets, leveraging tools such as Dataplex and Data Catalog for lineage and metadata.

  • Document data flows, pipelines, and architectures for technical and business stakeholders.

  • Collaborate across teams (data science, BI, business, IT) to align data engineering efforts with strategic objectives and SLAs.

Requirements

  • Bachelor’s or Master’s degree in Computer Science, Engineering, Information Systems, or related field.

  • Proven experience as a Data Engineer or similar role.

  • Expertise with Google BigQuery and Google Cloud Storage; solid knowledge of GCP data and streaming services (Dataflow/Apache Beam, Pub/Sub, Cloud Composer/Airflow).

  • Strong programming skills in Python and SQL.

  • Experience building reliable data pipelines for analytics, ML, BI, and automation use cases.

  • Familiarity with ML frameworks (scikit-learn, TensorFlow, PyTorch), MLOps on GCP (Vertex AI Pipelines/Model Registry) or BigQuery ML, and GenAI libraries/tooling where applicable.

  • Experience supporting BI/reporting solutions, preferably with Looker and LookML.

  • Hands-on experience with automation/integration platforms such as MuleSoft is a strong plus.

  • Understanding of data governance, security, quality, and compliance on cloud platforms.

  • Excellent communication, collaboration, and problem-solving skills.

Nice to Have

  • Experience deploying and operationalizing GenAI/LLM solutions at scale on GCP (Vertex AI, vector search, embeddings).

  • Experience with API development and integration (Cloud Run/Functions, Apigee).

  • Knowledge of DevOps/CI/CD for data solutions (Cloud Build, Git, Infrastructure as Code such as Terraform).

  • Relevant Google Cloud or Looker certifications (e.g., Professional Data Engineer, LookML Developer).

Our Values Guide Everything We Do

  • Think Big & Constantly Innovate: We have the confidence to think big, embrace change, challenge the status quo, and continuously evolve - incorporating new technologies and driving industry progress.

  • Put our Customers’ Needs First: We are passionate about delivering the highest value for our customers and supporting them with end-to-end care throughout their journey with us.

  • Act with Integrity: Integrity is at the heart of everything we do. We take personal responsibility for our actions, own our decisions, and honour each other’s contributions. We empower each other through honesty, respect, trust and transparency.

  • Work as One Team: We are driven by our common goals and share in each other’s successes and failures, learning and working together as a team where everyone can bring their best selves.

  • Strive for Excellence while Having Fun: We enjoy tackling new challenges together, and revel in continuous improvement as we deliver, with ultimate professionalism, the very best for our customers, while exceeding our own expectations.

Kyriba offers a comprehensive compensation package, including a range of health, welfare and wellbeing benefits designed to support both your professional and personal life.

Kyriba believes that everyone has the ability to make an impact, and we are proud to be an equal opportunity employer committed to providing employment opportunity regardless of sex, race, creed, color, gender, religion, marital status, domestic partner status, age, national origin or ancestry, physical or mental disability, medical condition, sexual orientation, pregnancy, military or veteran status, citizenship, and genetic information.

If you require a reasonable accommodation to complete any part of the application or interview process, or to perform essential job functions, please contact us at [email protected]. Requests will be handled confidentially and in accordance with applicable local laws.


The Company
HQ: San Diego, CA
972 Employees

What We Do

Kyriba treasury software comprises a set of powerful products that span treasury, risk, payments and working capital.
