In early September, researchers, data executives, founders and practitioners converged at the Future Data conference to debate what’s next in the world of data, from emerging architecture designs to cutting-edge research into visualization and large-scale data systems.

The conference itself marks a critical time for data, analytics and decision-making. On the one hand, the ability to capture vast amounts of data and make it accessible to anyone feels uncapped. On the other, despite all the data we have at our fingertips, there is still a dearth of data-driven decision-making in organizations.

It’s the modern-day equivalent of A Tale of Two Cities: “It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness.”

Despite this massive acceleration of data availability, many organizations lack the tools and the capacity to make use of it and struggle to use even a fraction of their information to inform daily decisions. The challenge of getting fast, useful answers from the data, for instance, was a consistent complaint heard across the conference. As Ben Horowitz summed up well on the second day: “If it takes a month to get back with an answer, you’re never going to be a data-driven culture.”

But don’t mistake the frustration for futility. By taking clear steps now, teams can accelerate the pace of decision-making, navigate change more confidently and start putting their first-party data to work to inform future product decisions.


Scaling Cloud-Native Data Systems

First, there’s the shift toward a more cloud-native data architecture. Companies are moving away from last decade’s investment in data lakes to more structured, cloud-based data warehouses. While the initial justification for this migration is often the cost and scale of storage, companies ultimately find that the real return on investment (ROI) of a cloud-native stack comes from accelerating access to the data that informs daily decisions.

For emerging companies, this is a huge opportunity to build a sustainable competitive advantage over incumbents. Data pipelines make real-time delivery of updates to the warehouse easier than ever, analytics engineering and transformation tools can reformat data on the fly, and data observability and quality monitoring platforms are emerging to improve the reliability of fragile SaaS and web data.

By getting data faster to the places where decisions are made, small organizations can execute more quickly and confidently. The good news is that most teams already capture the kind of data necessary to drive these decisions, and the talks at Future Data sketched out a blueprint for building a flexible data foundation:

  1. As soon as possible, start piping your SaaS and customer data into a single, cloud-based data warehouse like Snowflake, Redshift, BigQuery or Synapse.
     
  2. Build out analytics-friendly views directly in the warehouse. This may mean breaking some old business intelligence habits, but getting to a granular view of customers, transactions and user sessions is key to better analytics (see the sketch after this list).
     
  3. Go beyond the dashboard and enable your team to ask deeper “why” questions about changing metrics with tools that automate the analysis process within the warehouse.
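
To make the second step concrete, here is a minimal sketch of defining a granular, analytics-friendly view directly in the warehouse. It uses the Snowflake Python connector purely as one example (each of the warehouses named above has an equivalent), and the schema, table and column names are hypothetical placeholders for your own data.

```python
# Sketch: expose a granular, decision-ready view on top of raw warehouse data.
# Assumptions: the Snowflake Python connector is installed, credentials are set
# as environment variables, and raw.orders / raw.customers are placeholder
# tables standing in for your own landed SaaS and customer data.
import os

import snowflake.connector

ORDER_FACTS_VIEW = """
CREATE OR REPLACE VIEW analytics.order_facts AS
SELECT
    o.order_id,
    o.customer_id,
    c.acquisition_channel,
    o.ordered_at,
    o.order_total,
    DATEDIFF('day', c.first_order_at, o.ordered_at) AS days_since_first_order
FROM raw.orders AS o
JOIN raw.customers AS c
    ON o.customer_id = c.customer_id
"""

def main() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        database="ANALYTICS_DB",   # hypothetical database name
        warehouse="ANALYTICS_WH",  # hypothetical virtual warehouse
    )
    try:
        # One row per order, already joined to customer attributes, so analysts
        # (and the automated tools in step 3) never have to reassemble it.
        conn.cursor().execute(ORDER_FACTS_VIEW)
    finally:
        conn.close()

if __name__ == "__main__":
    main()
```

In practice, a transformation tool such as dbt would manage views like this as versioned models, but the idea is the same: land the raw data once, then publish granular, decision-ready views on top of it.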


Automating and Accelerating Data Analysis and Insight Generation

The downside of all this data we’re collecting is that, while we have more context than ever about our businesses, it’s increasingly difficult to put it all to use.

It’s certainly not a new problem. As Herbert Simon said in his 1971 paper “Designing Organizations for an Information-Rich World,” “a wealth of information creates a poverty of attention.” Almost 50 years later, researchers and innovative founders are focused on augmenting our ability to work with this complex data, both efficiently and effectively.

When you’re trying to diagnose why customer acquisition costs are rising, average order values are flat and retention rates are dropping, there are simply too many possible causes to explore by hand. The power of new augmented analytics platforms is that they combine the best of machine learning and statistical testing with large-scale data warehouses to test hypotheses comprehensively and “prioritize the attention” of the analysts working with the data.
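
As a rough, illustrative sketch of that idea (not a reconstruction of any particular platform), the snippet below scans a set of candidate dimensions, compares the current period against the prior one within each segment and ranks the segments most likely to explain a change in average order value. The data is synthetic and the dimension and metric names are hypothetical.

```python
# Sketch: "prioritize the attention" of an analyst by testing every segment of
# every candidate dimension for a significant shift between two periods.
import numpy as np
import pandas as pd
from scipy import stats

# Synthetic stand-in for rows that would normally be pulled from the warehouse.
rng = np.random.default_rng(0)
n = 5_000
orders = pd.DataFrame({
    "period": rng.choice(["prior", "current"], size=n),
    "channel": rng.choice(["paid_search", "organic", "referral"], size=n),
    "region": rng.choice(["NA", "EMEA", "APAC"], size=n),
    "order_value": rng.normal(80, 20, size=n),
})
# Inject a real shift so one segment genuinely changed between periods.
shifted = (orders["period"] == "current") & (orders["channel"] == "paid_search")
orders.loc[shifted, "order_value"] -= 15

results = []
for dim in ["channel", "region"]:
    for segment, grp in orders.groupby(dim):
        prior = grp.loc[grp["period"] == "prior", "order_value"]
        current = grp.loc[grp["period"] == "current", "order_value"]
        _, p_value = stats.ttest_ind(current, prior, equal_var=False)
        results.append({
            "dimension": dim,
            "segment": segment,
            "delta": round(current.mean() - prior.mean(), 2),
            "p_value": p_value,
        })

# Segments with the strongest evidence of change float to the top of the list.
ranked = pd.DataFrame(results).sort_values("p_value")
print(ranked.to_string(index=False))
```

A production version would query the warehouse directly, weigh effect sizes alongside p-values and correct for the many comparisons being made, but the division of labor is the point: machines enumerate and test, people interpret.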

To adapt, the best strategy for a data team is to combine the strengths of technology — speed, processing power and iteration — with the key skills of context, navigating ambiguity and domain expertise that analysts bring to the table. This “human-in-the-loop” model will emerge as the dominant model for rapid, accurate decision-making in leading organizations.


Improving Decision-Making With Data

Finally, scaling data collection and augmenting the analytics process will matter only if they’re in the service of improving how people make decisions. The flip side of this is that, to really unlock informed decision-making, analytics tools need to be more accessible across the business and analysts need to be more integrated into individual departments.

Consequently, I see a future operating model where companies shift away from a centralized, shared-service model for analytics toward one where expert analysts are increasingly embedded directly in business units. This is definitely a pendulum swing back toward earlier, IT-centric models, but the difference now is that cloud-native data platforms are better equipped to support distributed teams and collaborative work.

This is a final way that emerging companies can gain a competitive advantage. By centralizing data but pushing analysis back out to the places where decisions are made, they can accelerate decision-making without losing the ability to monitor the impact of those decisions. This approach brings more people into the process, improves visibility and ultimately captures more feedback on what’s working and what’s not.

With the adoption of a more cloud-native architecture, companies finally have the ability to track every dimension of their business with fine-grained accuracy. But without the ability to accelerate and augment how they put that data to use, they won’t see the ROI in the decision-making process.

Particularly for companies seeking to enter a new market or compete with established players, investing early in scalable data systems, building processes that automate and accelerate insight generation, and distributing the decision-making process can build a more sustainable advantage.
