Fractal River is a consulting firm that helps companies scale faster by building robust technical and operational infrastructure. We work hands-on with our clients to build the systems they need, while transferring knowledge and helping them develop their own internal capabilities.
We work with a diverse range of companies, from fast-growing startups to well-established organizations, that need to build or modernize their data and analytics infrastructure. We use the right tools and the latest technologies, balancing technical rigor with flexibility based on each customer's unique needs. Our work spans cloud infrastructure on the Amazon, Google, and Microsoft platforms and integrates systems across the customer journey, including HubSpot, Salesforce, and Zendesk.
Beyond infrastructure, we help companies establish strong data foundations: ensuring data quality, implementing governance frameworks, defining metrics that align with business logic, creating centralized data models, and establishing sources of truth. This ensures our clients not only have visibility into their data but can also trust the numbers across all reporting layers.
We incorporate and leverage AI at all levels of our company: helping clients create AI-centric products, creating and deploying AI agents along their customer journey, implementing AI-based support systems for internal users, and maximizing the use of AI for development processes.
We are seeking a Software and Data Engineer to join our team.
What you’ll do:
- Help design and implement data pipelines and analytics infrastructure using the latest technologies.
- Create and deploy AI tools and agentic infrastructure to enhance both client and internal capabilities.
- Build data warehouses using tools like dbt, working with platforms such as Snowflake, Redshift, and BigQuery.
- Create beautiful dashboards and reports, and work with customers to build self-service data exploration capabilities.
- Build data pipelines and integrations to connect systems, automate processes, and ensure data flows seamlessly across platforms.
- Leverage APIs from multiple systems to extract and update data, trigger and monitor processes, and generally help tie our customers’ infrastructures into cohesive platforms that power their growth.
- Maintain and oversee cloud infrastructure to ensure it runs with the reliability and performance our customers expect.
- Help create data models, establish data governance frameworks, define metrics, and ensure data quality across reporting layers.
- Develop technical documentation and best practices for our customers.
- Drive improvement initiatives by identifying opportunities for efficiency gains, discovering new best practices, and proposing internal projects that enhance our capabilities and processes.
- Contribute to our evolving DevOps and DataOps practices, helping shape how we work as we continuously improve.
- Coordinate projects, activities, and tasks to ensure objectives and key results are met.
- Identify opportunities, generate innovative solutions, and improve existing product features.
Who we are looking for:
Our ideal candidate has 1-5 years of experience working in fast-paced environments, a high tolerance for ambiguity, and a passion for constant learning. We’re looking for motivated, highly adaptive people who enjoy the challenge of working with brilliant people at multiple innovative companies to put in place the technical infrastructure that powers their growth. You must be able to interact with customers and go beyond the technical work: understanding needs, extracting requirements, and designing and proposing solutions.
Beyond technical skills, we’re looking for someone who is:
- Curious and driven to learn: You seek out knowledge, ask questions, and view ambiguity as an opportunity to explore.
- Proactive and a self-starter: You take initiative, anticipate what’s coming, and actively find ways to get things done.
- Comfortable with Python and SQL fundamentals, but more importantly, you know when and how to leverage AI tools to accelerate your work while maintaining quality and understanding.
- Eager to leverage AI tools (ChatGPT, Claude, Gemini, etc.) to fuel creativity and problem-solving: not to replace your deliverables, but to expand your thinking and improve them. You understand AI’s limitations and know where to draw the line so you always deliver quality work.
- Collaborative and communicative: You work effectively with customers and team members alike.
- Adaptable: You thrive in environments with constant iteration and welcome creative solutions to challenging problems.
- Process-minded: You develop and improve best practices in data, DevOps, and AI-assisted workflows, and naturally spot opportunities for improvement in both client and internal processes.
- Detail-oriented: You pay close attention to detail and care about the quality of your work.
We value learning agility and the ability to understand what you’re building over being an expert in every tool. If you understand the fundamentals and know how to leverage AI to accelerate your learning and implementation, you’ll thrive here.
English Language Requirements:
We require CEFR level C1 (Advanced) English, which means you can:
- Understand a wide range of demanding, longer texts and recognize implicit meaning
- Express yourself fluently and spontaneously without much obvious searching for expressions
- Use language flexibly and effectively for social, academic, and professional purposes
- Produce clear, well-structured, detailed text on complex subjects
Nice-to-haves (not required):
- Familiarity with data warehouses like Snowflake, Redshift, or BigQuery
- Experience with data modeling, data governance, and creating complex visualizations
- Experience with BI tools such as Looker, Sigma, or Power BI
- AWS/GCP/Azure services knowledge
- Experience with development tools such as Terraform, Docker, CircleCI, and dbt
- Certifications (AWS, Google Professional Data Engineer, etc.)
- Knowledge of Google Analytics, Salesforce, HubSpot, and/or Zendesk
- Comfort working in non-hierarchical, diverse work environments
- Bachelor’s degree in computer science or similar field
Besides the benefits you are entitled to by law, we offer:
- Personal development plan with an accelerated career track.
- Access to an extensive reference library of books, case studies, and best practices.
- Unlimited access to AI tools (ChatGPT, Claude, Gemini, etc.).
- Unlimited budget for work-related books.
- Online training (Udemy, nanodegrees, etc.) and English language training.
- Stretch assignments, pair programming, and code reviews with new technologies.
- Yearly performance bonus based on personal performance.
- Yearly success bonus based on company performance.
- Home office setup including fast internet, a large monitor, and your favorite keyboard and mouse. After a few months, we add a gaming chair, espresso maker, standing desk, and speakers (or other items like these).
- Monthly wellness budget covering things such as sodas, tea, snacks, pet care, yoga, or other wellness-related items.
- Company card for wellness budget and company expenses.
- Three floating holidays and unlimited (but reasonable) PTO starting the first year.
- Fun company off-sites!
We are proud to be an equal opportunity employer.
What We Do
Fractal River is a consulting firm that helps growth-stage startups scale faster by:
- Optimizing marketing, sales, customer success, and service delivery operations by defining processes and implementing the right tools.
- Building their analytics, integration, and data engineering capabilities to deliver better visibility, improved metrics, point-of-service data, embedded in-product analytics, data products, and more.
- Acting as a fractional team so companies can develop their internal teams gradually while enjoying the benefits of best practices, expert support, and the latest technologies and architecture.
Our tooling expertise includes HubSpot, Salesforce, Zendesk, ChurnZero, TaskRay, Looker, Tableau, Power BI, dbt, Fivetran, Snowflake, Redshift, BigQuery, AWS/GCP infrastructure, Terraform, Python, Docker, and multiple DevOps orchestration and monitoring tools.