Technical Product Owner

Sorry, this job was removed at 04:11 p.m. (CST) on Thursday, Oct 02, 2025
28 Locations
Remote
Information Technology • Consulting
The Role
Who We Are
Massive Rocket is a high-growth Braze & Snowflake agency that has made significant strides in connecting digital marketing teams with product and engineering units. Founded in 2018, we have experienced swift growth and are now at a crucial juncture, aspiring to reach $100M in revenue. Our focus is on delivering human experiences at scale, leveraging the latest in web, mobile, cloud, data, and AI technologies. We pride ourselves on innovation and the delivery of cutting-edge digital solutions.

Every role at Massive Rocket is entrepreneurial: successful people here think beyond their own role, understand the roles and goals of those around them, and contribute to the success and growth of their team, customers, and partners.

What We Offer
🚀 Fast-moving environment – you will never stop learning and growing
❤️ Supportive and positive work culture with an emphasis on our values
🌍 International presence – work with team members in Europe, the US, and around the globe
🪐 100% remote forever
🧗🏼‍♂️ Career progression paths and opportunities for promotion/advancement
🍕 Organized team events and outings

What we’re looking for
We are looking for a Technical Product Owner (TPO) to drive the vision, strategy, and execution of our Kafka-based Data Engineering platform. You will act as the bridge between business stakeholders and engineering teams, owning the roadmap for our real-time data streaming stack. Your role is to ensure our Kafka pipelines and data engineering initiatives deliver scalable, reliable, and compliant event-driven solutions that enable business outcomes.

This role combines technical depth (Kafka, event-driven architectures, data governance) with product ownership (roadmap definition, prioritization, stakeholder alignment).

Responsibilities

1) Product Ownership
Define and own the product vision and roadmap for Kafka/Data Engineering capabilities. Translate business needs into technical requirements and prioritized backlogs. Ensure clear acceptance criteria and measurable KPIs.

2) Stakeholder Management
Collaborate with Data, Analytics, Marketing, and Product teams to define event models, SLAs, and integration needs. Align stakeholders on priorities, trade-offs, and delivery timelines.

3) Technical Strategy
Shape the architecture and evolution of Kafka-based pipelines (topics, partitions, retention, compaction, Connect/Debezium, Streams/ksqlDB). Partner with engineers to ensure scalable, secure, and cost-efficient solutions.
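To make the trade-offs concrete, here is a minimal sketch of the topic-level decisions this role weighs in on, using the standard Apache Kafka CLI. Topic names, partition counts, and retention values are illustrative assumptions, not a prescribed setup, and the commands assume a reachable broker:

```shell
# Event stream: time-based retention, records deleted after 7 days
kafka-topics.sh --bootstrap-server localhost:9092 --create \
  --topic orders.events --partitions 12 --replication-factor 3 \
  --config retention.ms=604800000 \
  --config cleanup.policy=delete

# Changelog topic: log compaction keeps only the latest value per key
kafka-topics.sh --bootstrap-server localhost:9092 --create \
  --topic customers.profile-changelog --partitions 6 --replication-factor 3 \
  --config cleanup.policy=compact
```

Choices like delete vs. compact cleanup, partition count, and retention window directly shape cost, replayability, and consumer scalability, which is why the TPO partners with engineers on them.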

4) Governance & Compliance
Drive schema governance (Avro/Protobuf), data quality enforcement, and regulatory compliance (GDPR/CCPA, PII handling). Ensure monitoring, observability, and incident management practices are in place.
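In practice, schema governance often comes down to enforcing compatible evolution via a schema registry. As an illustrative sketch (record and field names are hypothetical), an Avro schema that adds an optional field with a default stays backward compatible, so consumers on the new schema can still read records written with the old one:

```json
{
  "type": "record",
  "name": "CustomerEvent",
  "namespace": "com.example.events",
  "fields": [
    {"name": "customer_id", "type": "string"},
    {"name": "event_type", "type": "string"},
    {"name": "email", "type": ["null", "string"], "default": null,
     "doc": "PII: subject to masking/erasure under GDPR/CCPA"}
  ]
}
```

Tagging PII fields in the schema itself, as in the `doc` annotation above, is one way governance and compliance requirements flow down into the data model.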

5) Delivery Management
Own backlog grooming, sprint planning, and delivery tracking. Ensure throughput, latency, and consumer lag targets are met. Manage risks, dependencies, and SLAs.
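Consumer lag, one of the delivery targets above, is simply the gap between the log-end offset and the committed consumer offset per partition. A minimal sketch in Python (the function and example offsets are illustrative; any Kafka client that exposes these two offsets can feed it):

```python
def consumer_lag(end_offsets: dict, committed_offsets: dict) -> dict:
    """Per-partition lag = log-end offset minus committed consumer offset.

    Both arguments map partition number -> offset. A partition with no
    committed offset is treated as fully behind (lag from offset 0).
    """
    return {
        p: max(end_offsets[p] - committed_offsets.get(p, 0), 0)
        for p in end_offsets
    }

# Partition 0 is fully caught up; partition 1 is 150 records behind.
end = {0: 1_000, 1: 2_650}
committed = {0: 1_000, 1: 2_500}
print(consumer_lag(end, committed))  # {0: 0, 1: 150}
```

Tracking this number per consumer group against an agreed threshold is how latency SLAs on a streaming platform usually get operationalized.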

6) Optimization & Innovation
Continuously evaluate and introduce improvements in reliability, cost efficiency, and latency. Assess new tools, frameworks, and best practices in data streaming and event-driven systems.


Required Skills and Qualifications:

• Product Ownership & Agile Delivery
- 4+ years of proven experience as a Product Owner or Technical Product Owner in data engineering or streaming domains.
- Demonstrated ability to own a product vision and roadmap, align it with business goals, and communicate it effectively to technical and non-technical stakeholders.
- Hands-on experience in backlog management, user story writing, prioritization (MoSCoW, WSJF, RICE), and defining acceptance criteria.
- Strong experience with Agile/Scrum/Kanban frameworks, backlog grooming, and sprint ceremonies.

• Stakeholder Engagement & Business Value
- Skilled at gathering and refining requirements from diverse stakeholders (data, analytics, product, marketing).
- Ability to translate business outcomes into technical user stories for Kafka/data engineering teams.
- Experience in balancing trade-offs (cost, reliability, time-to-market, compliance) and negotiating priorities.
- Comfortable defining KPIs, SLAs, and success metrics for platform capabilities.

• Technical Acumen
- Strong understanding of event-driven architectures, Kafka ecosystem (Confluent, Kafka Connect, Schema Registry, Streams/ksqlDB).
- Familiarity with data governance, compliance, and security requirements (GDPR/CCPA, PII handling, encryption, RBAC/ACLs).
- Working knowledge of cloud platforms (AWS/Azure/GCP), containerization (Docker, Kubernetes), and Infrastructure as Code (Terraform/Helm).
- Ability to engage in technical discussions on latency, throughput, consumer lag, schema evolution, and cost optimization.

• Leadership & Communication
- Excellent written and verbal communication skills (English C1 or higher).
- Proven ability to present technical concepts in business language and influence decisions.
- Experience in cross-functional leadership within engineering squads or consulting/client-facing roles.

Preferred Qualifications:
- Experience with platform product ownership (internal developer platforms, streaming data services).
- Familiarity with observability and reliability practices (Prometheus, Grafana, Datadog, incident response).
- Exposure to Customer Data Platforms (e.g., mParticle) or tag management systems (e.g., Tealium).
- Background in data engineering/software development (Python, Kotlin, Spark/Flink) is a strong plus.

If you're ready to launch your career to new heights at a company fuelled by passion and innovation, we want to hear from you!


The Company
London
71 Employees
Year Founded: 2018

What We Do

Global Braze Agency, Massive Rocket, offers full-service solutions for businesses (Data, Engineering & CRM). We help our customers use data to understand their customers and automate communications across channels. We grow Customer Lifetime Value. Our Global Delivery Centres support the most sophisticated customers in the world across the US, EMEA, and APAC regions.

Automate your end-to-end customer experience with our key services:
- Marketing Technology Stack Design
- Customer Engagement (CRM)
- Data Warehouse Management (DW)
- Customer Data Management (CDP)

We extend your team with specialists: Strategy | Planning | Setup | Integration | Execution

As a consultancy, we deliver solutions that don’t just help digital marketing teams achieve their goals but also generate predictable growth. We adopt and implement new-age solutions that fulfil the needs of new-age, customer-centric companies.

We are proud to be technology partners of the leading technology solution providers in the industry:
1. Braze (one of the four certified Level 4 “Orbit” Braze Partners)
2. mParticle (Solutions Partner of the Year)
3. Snowflake
4. Segment

Get in touch to know more.

