Applico Capital is the leading venture capital firm focused on the $8 trillion B2B distribution industry. Drawing on our deep understanding of the industry, we are building a tech startup, currently in stealth, to solve the industry's biggest problems in unlocking AI-enabled synergies.
Our mandate is to leverage AI and modern technologies to reimagine the role of the traditional distributor and transform how the entire industry operates.
We are looking for highly technical builders who thrive in entrepreneurial, scrappy, and collaborative environments.
About the Role
We are looking for a Data Engineer to create the infrastructure, automation, and monitoring that make machine learning reliable, repeatable, and scalable. You will enable our AI Scientists and Engineers to move faster, while ensuring compliance, observability, and cost efficiency.
This is a scrappy, hands-on role in a startup-style team where building durable, automated systems is as important as moving quickly. You’ll ensure that ML becomes a dependable part of daily business operations. You will also extend MLOps practices to support agentic AI systems – managing orchestration, monitoring emergent behavior, and ensuring safe, governed use of AI-augmented workflows.
Key Responsibilities
- Design, build, and maintain data ingestion and transformation pipelines using modern open-source and cloud-native tools
- Integrate structured and unstructured data from ERP, CRM, PIM, CMS, and third-party sources
- Develop and manage data models, staging, and warehouse/lakehouse layers
- Implement data quality, validation, and observability frameworks to ensure reliability
- Collaborate with the Head of Data Architecture and Full-Stack Data Engineers to define schema standards and ingestion patterns
- Automate repeatable workflows (e.g., Airbyte, Dagster, Prefect) to reduce manual work and ensure reproducibility
- Support analytics, reporting, and AI use cases through well-designed, versioned data products
- Contribute to infrastructure automation and CI/CD practices for data pipelines
- Leverage AI tools (LLMs, code generation, enrichment APIs) to accelerate development and improve data coverage
Requirements
- 3–6 years of professional experience as a Data Engineer, ETL Developer, or Data Platform Engineer
- Proficiency in Python and SQL for data wrangling, pipeline automation, and transformation
- Hands-on experience with modern open-source data tooling such as dbt, Airbyte, Meltano, Dagster, or Prefect
- Familiarity with cloud data environments (AWS, GCP, or Azure) and infrastructure-as-code principles
- Solid understanding of data modeling, schema design, and relational concepts
- Experience integrating APIs, flat files, and other external data sources
- Working knowledge of data quality and observability tools (Great Expectations, Soda, or similar)
- Exposure to or curiosity about semantic modeling, graph data, and AI enrichment workflows is a plus
- Comfortable in fast-paced, startup-style environments where iteration, learning, and impact come first
What We Offer
- Work on one of the most ambitious AI and data transformations in industrial B2B
- Build with autonomy in a small, expert team backed by a large, stable business
- Learn directly from senior data architects and AI engineers
- Help shape a scalable, open, automation-driven data platform from day one
Tech Stack
- Languages: Python, SQL
- Data Tools: dbt, Airbyte/Meltano, Dagster, Prefect, DuckDB, Delta Lake, Postgres
- Cloud & Infra: AWS or GCP, Terraform, Docker, GitHub Actions
- Data Governance: Great Expectations, OpenLineage, Soda
- APIs & Services: FastAPI, GraphQL
- AI/Automation (Optional): LangChain, LangGraph, OpenAI APIs, n8n
What We Do
We help B2B distribution's largest enterprises use technology to solve big problems and unlock new opportunities. Applico Capital brings together strategic partners and entrepreneurs to accelerate growth and create value at tremendous scale.
Through venture capital, we invest out of the B2B distribution industry's first venture capital fund dedicated to distribution tech.
Through private equity, we are the technology operating partner to large B2B distributors as they harness AI to lead their vertical.