As a member of Flow’s data team, you’ll shape the data architecture and AI-assisted analytics that power decision-making across our multifamily real estate portfolio. We’re looking for an engineer who builds robust data solutions and works directly with the people who use them every day — property managers, leasing directors, and marketing teams.
This isn’t a role where you build pipelines in isolation and hand them off. You’ll be embedded with operations teams, translating their real-world problems into elegant technical solutions — and increasingly, into AI-assisted tools and insights that make those solutions smarter over time. When a property manager says “tours are down this week,” you’ll investigate data quality, understand the operational context, surface actionable insights, and build the analytical intelligence that helps teams act faster and with more confidence. Not just a dashboard. A better way of working.
At Flow, we’re building sophisticated data infrastructure for the modern real estate industry. We leverage cloud-native technologies and AI-assisted development to move fast while maintaining production-grade quality. You’ll use tools like Claude Code and Cursor as force multipliers — which means strong written communication is how you work effectively. Clear prompts, clear documentation, clear stakeholder updates. AI amplifies your impact when you communicate well. When you don’t, it becomes a liability.
We value engineers who think critically, work independently, and own problems end-to-end. You’ll have significant autonomy to scope projects, make architectural decisions, and challenge assumptions when the data doesn’t support the hypothesis.
We believe our team is better together. Every position at Flow has an onsite or in-office requirement.
Responsibilities
- Design, build, and maintain data pipelines and architecture using Snowflake, dbt, and AWS
- Implement scalable ETL processes and data models that power analytics across the organization
- Build AI-assisted analytics and insights that help operations teams make better decisions faster — surfacing patterns, anomalies, and opportunities that traditional dashboards miss
- Tackle complex data challenges like entity resolution, multi-system attribution, and household matching
- Work directly with operations teams across leasing, marketing, and property management to translate daily workflow challenges into data and AI-powered solutions
- Own problems end-to-end — from stakeholder conversations through implementation to deployment and adoption
- Document architectural decisions and explain technical tradeoffs to non-technical stakeholders in clear, jargon-free language
- Use AI coding assistants to accelerate development while maintaining production-quality code
- Write effective prompts for AI tools and review generated solutions critically — knowing when to trust AI output and when to take over
Ideal Background
- A minimum of three years of data engineering experience building scalable data systems in operations-heavy, data-forward organizations
- Experience working in product and operations environments where data directly impacts daily business decisions
- Deep expertise in data modeling with dbt — you understand how to model around the deficiencies and quirks of upstream operational systems
- Comfort building or contributing to AI-assisted analytics — whether that’s integrating LLM-powered insights into existing workflows, building intelligent alerting, or designing systems that surface the right information at the right time
- Strong written and verbal communication skills — comfortable writing clear documentation, effective AI prompts, and explaining technical concepts to non-technical stakeholders
- High proficiency in SQL, Python, and modern data stack technologies including Snowflake, Fivetran, Dagster or Airflow, and dbt
- A solid understanding of high availability, fault tolerance, and efficiency in distributed data systems, and the challenges of achieving them
- The ability to strike a balance between elegant design and pragmatic tradeoffs, all while prioritizing continuous delivery of value to the business
- Experience building data solutions in fast-moving startup environments
- Strong analytical skills with a natural curiosity to understand and solve complex business problems through data — and an interest in how AI changes what’s possible
What We Do
Flow solves the complexity of fund infrastructure. We connect information across investors, providers, investments, and systems to realize our mission of empowering thought. Powered by dynamic tools and human-centric interfaces, our cross-platform SaaS technology networks with and adapts to the myriad ways funds operate, optimizing productivity and increasing the time invested in making decisions. By bringing Flow to the $10 trillion alternative asset market, we increase the speed of decisions, the speed of capital, and the speed of growth that serve as the lifeblood of all market-based economies.