Rabbit is a GCP cost-optimization and automation platform that helps enterprises make BigQuery and other GCP workloads truly cost-efficient. We combine deep platform intelligence, automation, and generative AI to turn cost insights into action.
The Challenge: Enterprises are spending millions on BigQuery without truly understanding where the waste is. Traditional tools provide surface-level metrics. We're building the intelligence layer that finds hidden inefficiencies and automatically fixes them.
Role Overview
If you love data engineering for the impact, not just the tech — this is your role.
You'll be at the core of Rabbit's mission: researching and delivering innovative ways to reduce BigQuery costs through deep insights and automated optimization features. You'll build intelligent systems that help enterprises save millions — using data pipelines, GCP APIs, and algorithms you design yourself.
The opportunity: This isn't about maintaining someone else's ETL jobs. You'll research cost optimization strategies, prototype algorithms, and watch your ideas turn into features that customers actually use. Your pipeline optimization could save a customer millions a year. That's the kind of impact we're talking about.
In our startup environment, you're not just implementing specs — you're discovering the problems worth solving and building the solutions. We expect you to bring bold ideas, challenge assumptions, and build data systems that create real business value.
What You'll Do
- Build and maintain data pipelines that power Rabbit's BigQuery cost and usage analytics
- Research, design, and implement automated cost-saving features that deliver measurable value to customers
- Develop deep cost insights and anomaly detection capabilities to help customers understand and control their BigQuery spending
- Collaborate with architects to design robust data models and ETL flows that scale
- Ensure the accuracy, reliability, and performance of Rabbit's data pipelines
- Continuously improve efficiency — both in pipeline performance and in cost outcomes for customers
- Investigate and resolve real-life customer issues, debugging data pipelines and cost anomalies to ensure accurate insights and optimization
- Occasionally interact directly with clients to troubleshoot complex issues, understand their unique use cases, and validate solutions
What You'll Bring
- 5+ years in data engineering or data platform development
- Strong SQL skills with deep BigQuery experience — you should know BigQuery internals, not just basic queries
- Comfortable with application development (Java or Node.js) — you won't be writing backend services daily, but you shouldn't be afraid to jump into the codebase when needed
- Hands-on experience with dbt
- Familiarity with cost optimization, performance tuning, or query execution planning
- Comfortable working in a small, fast-moving team with autonomy and accountability
- Bonus: experience with GCP cost management or automation frameworks
For data engineers who want to build, not just maintain:
- Your algorithms in production, not in notebooks: Stop doing proofs of concept that never ship. Here, your optimization algorithms go straight to production and save customers millions. You'll see the direct ROI of your work
- Research meets reality: You'll spend time researching BigQuery internals, testing optimization theories, and building features based on your discoveries — not just running someone else's playbook
- Ownership from day one: Design your own pipelines, choose your own tools, architect your own solutions. We trust senior engineers to make senior decisions
- Work on cutting-edge problems at scale: BigQuery optimization at petabyte scale, real-time anomaly detection, cost prediction algorithms — these are hard, unsolved problems. Perfect if you're bored with CRUD
- Learn and grow fast: Work alongside data architects and senior engineers who will challenge you. Shipping to Fortune 500 companies will level you up faster than any course ever could
Not for everyone. Perfect for you if: You're driven by impact, not comfort. You'd rather tackle hard problems with smart people than easy problems alone. You want your work to matter.
What We Do
Let’s become partners and move your business to the cloud together! Aliz helps you reach your business goals with Machine Learning and data solutions – powered by Google Cloud. Thanks to our strong partnership with Google, we have access to the most modern technologies and developments in the industry, and those resources, combined with our dedicated, professional team, give your company a competitive edge.

Data has become the most valuable resource – if you use it well. We help you put your company’s data to use, gain valuable insights, predict future trends, and optimise your business to increase ROI. Join those who are already in the cloud, taking full advantage of their automated processes – and make your business even more successful than it is now.

We are committed to delivering quality to our clients. To ensure that your company gets the best possible solution tailored to your business needs, we use cutting-edge technologies, and we’re proud to be the only Premier Google Cloud partner in the CEE region. So far we have helped airlines, retailers, telco companies, and many more high-profile clients make their businesses more successful. We specialise in Machine Learning and Big Data – with our analytics and data warehousing solutions, the sky is the limit.