
The data-and-analytics revolution has the power to reshape how companies organize, run operations, manage talent, and create value. A small number of organizations are already capturing that impact and generating real business returns from their data, but they remain the exception rather than the norm.
One reason is straightforward. CEOs and other senior executives, who are the only people positioned to drive enterprise-wide change, often avoid getting pulled into the technical “weeds.” That hesitation is understandable. Advanced analytics can feel complex and specialized, especially as machine learning becomes more central and data sets grow larger. It is easy for senior leaders to assume this work should be left entirely to technical experts.
That approach is a mistake. Advanced analytics is not merely a technical capability. It is a business capability. Senior leadership must be able to clearly state what analytics is meant to achieve and then translate that purpose into decisions and actions across the organization, not just within an analytics team. The value comes from how the insights change behavior: how priorities are set, how resources are allocated, how teams execute, and how performance is measured.
This article outlines eight critical elements that help leaders build clarity of purpose and the ability to act. Leaders who develop strong intuition in these areas do more than pressure-test analytics efforts or ask better questions. They are better equipped to handle the related executive challenges that determine whether analytics becomes a competitive advantage: grounding ambitious analytics goals in core business principles, deploying the right mix of tools and talent, and using clear metrics to evaluate impact. When leaders do this well, the odds of improving performance through analytics rise materially.
After all, performance is the point. Not pristine data sets. Not interesting patterns. Not “killer” algorithms. Advanced analytics is simply a means to an end: a disciplined way to identify a value-driving answer and then operationalize it across the business. You are far more likely to find that answer when you are clear on two things: the purpose your data is meant to serve and the decisions the business intends to improve with it. Those will look different across companies, industries, and geographies, where analytics maturity varies widely. But regardless of your starting point, analytics only delivers when its insights sit at the center of how you define, manage, and continuously improve performance as competitive dynamics evolve. Without that, you are not putting advanced analytics to work—you are just doing analytics.
“Better performance” means different things to different organizations. For one company it might mean faster cycle times; for another it might mean lower risk, improved customer retention, or higher margins. Because of that, the data that matters is not “all the data.” It is the subset that directly supports your intended use case. Some of those data points will be difficult to access, and not every data point has equal value. The highest-value data is the data that helps you answer the business question tied to your purpose, even if it is imperfect or incomplete.
The right analytics question is determined by your priorities, and clarity is non-negotiable. Broad questions like “How can we reduce costs?” or “How can we increase revenues?” are useful starting points, but the real payoff comes when you translate them into specific, actionable decisions. That might mean asking how to improve the productivity of each team member, how to raise quality outcomes for patients, or how to materially shorten time to market in product development. The goal is to connect the question to a real operating context: the function, the workflow, the decision-maker, and the metric that defines success. In a world of limited time and budget, analytics rarely pays off when it begins with vague curiosity—such as “What patterns do the data show?”—instead of a defined value hypothesis.
A large financial company learned this the hard way. It started by collecting as much data as possible and waiting to see what insights emerged. When the outputs proved mildly interesting but financially insignificant, leadership reset the approach. With strong C-suite sponsorship, the company defined a purpose statement focused on reducing product development time and paired it with a concrete measure tied to customer adoption. That sharper focus helped it deliver successful products for two market segments. Another organization took a similar detour by building a “data lake” first, spending years refining and cleaning data without deciding what the data would be used for. Only later did management begin to clarify the issues that mattered most—but the market does not wait for analytics programs to mature.
Organizations that reverse the sequence—starting with the question before building the data estate—tend to realize value faster, even if only a portion of the data is ready. A prominent automotive company, for example, began with a straightforward objective: improve profits. It quickly identified a high-leverage driver—reducing development time and the costs created by misalignment between design and engineering. Once that focus was clear, the company drew on ten years of R&D history to generate insights that materially improved development timelines and boosted profitability.
Small edges can create outsized results. A classic example comes from the 1896 Olympics, where one runner used a crouched four-point stance at the start of the 100-meter dash. That small technique helped him gain an advantage and win gold, and it soon became the standard because competitors adapted quickly. Business works the same way: improvements that seem incremental can compound into meaningful advantage, but only briefly—because best practices spread fast. The implication for analytics is straightforward. Aim for tangible, specific wins that move a real metric, while building the broader capability to keep finding the next edge as conditions change.
The encouraging reality is that disciplined, fast-moving organizations can still lift performance and regain advantage. There are rarely “easy fixes,” but there are often small points of difference that can be found, amplified, and compounded. In practice, the impact of advanced analytics usually shows up as thousands of incremental improvements rather than a single breakthrough. When a company breaks a process into its smallest components and optimizes each step where it can, the gains can be substantial. When those gains are repeated across multiple processes—and then connected—the effect can become exponential.
Almost any business activity can be decomposed this way. At the high-tech end, GE has embedded sensors in aircraft engines to monitor performance in real time, enabling faster adjustments and sharply reducing maintenance downtime. But the same logic applies in more conventional settings. One consumer packaged goods company wanted to improve margins on a well-known breakfast brand. It mapped the manufacturing process into sequential micro-steps and used analytics to test where value could be unlocked. The biggest opportunity turned out to be surprisingly specific: a slight adjustment to baking temperature improved taste while reducing production cost. The business case was not theoretical; it showed up directly in the P&L.
The payoff grows further when organizations stop treating processes as separate islands. A large steel manufacturer applied analytics across several critical domains—demand planning and forecasting, procurement, and inventory management. Within each area, it identified the few drivers that mattered most and eliminated hidden sources of waste, generating savings in the range of 5 to 10 percent. Those gains multiplied when the company connected the processes and enabled near–real time information flow from one stage to the next. By rationalizing the end-to-end system from demand planning through inventory management, it approached roughly 50 percent savings—amounting to hundreds of millions of dollars—driven by the compounding effect of many small improvements.
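To see how many small gains can approach a number that large, consider a simplified sketch of the arithmetic. The per-stage rates below are hypothetical, not the steel maker's actual figures; the point is that when savings are captured at each step of a connected chain, they compound multiplicatively rather than simply adding up.

```python
# Hypothetical illustration: modest savings at each stage of a connected
# process chain compound rather than simply add.
stage_savings = [0.08, 0.06, 0.09, 0.05, 0.07, 0.08, 0.06, 0.09]  # assumed rates

remaining_cost = 1.0
for s in stage_savings:
    remaining_cost *= (1 - s)   # each stage passes on (1 - s) of the cost it receives

total_savings = 1 - remaining_cost
print(f"End-to-end savings: {total_savings:.0%}")   # about 45% with these assumed rates
```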
It is worth being cautious with the familiar warning “garbage in, garbage out.” It is true that data quality matters, but the phrase can become a reflex that blocks learning. Valuable inputs often exist inside organizations in “non-data” formats—free-text maintenance notes, call logs, emails, slide decks, and other qualitative artifacts. Quantitative teams sometimes dismiss these sources as inconsistent, dated, or too messy to analyze. The result is that potentially decisive signals are ignored simply because they do not look like clean, structured data.
In reality, decision-making in the real world routinely combines hard evidence with softer signals. People evaluate probabilities, interpret context, and adjust based on subtle cues. Even a simple situation—choosing the fastest supermarket queue—typically involves qualitative factors: which cashier seems efficient today, whether a customer is paying cash, whether bagging support is available, whether items need weighing. Those are imperfect inputs, but they are still informative. Past averages help, but they do not guarantee what will happen next.
Hard, historical data also has limits when the environment is changing. One company installed a highly rigorous investment-approval process and refused to fund new products without provable historical evidence supporting projected ROI. The intention was sensible—protect capital and avoid waste. The consequence was harmful: product launches slowed so much that the company repeatedly mistimed the market. Only after management broadened its definition of “usable input” to include softer signals—industry forecasts, expert judgment, and social-media commentary—did it gain a clearer view of current demand and improve launch timing.
None of this implies that all soft information is equally valid. It does mean that incomplete or biased inputs are not automatically worthless. In many contexts, they are essential for bridging gaps between more precise measures and for forming an informed view of emerging conditions. To use soft and hard information responsibly, companies should build a data provenance model that tracks the origin of each input and assigns a reliability score that can be updated over time. This is not just a transparency exercise; it is risk management. Provenance helps leaders stress-test confidence in a go/no-go decision and determine when it is worth investing to strengthen a critical data set.
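A provenance model does not need to be elaborate. The sketch below is a minimal illustration of the idea, with hypothetical sources and scores rather than a prescribed schema: record where each input came from, score how much you trust it, and revise that score as outcomes confirm or contradict it.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DataInput:
    """Minimal provenance record: where an input came from and how much we trust it."""
    name: str
    source: str            # e.g. "ERP extract", "expert estimate", "social-media panel"
    collected_on: date
    reliability: float     # 0.0 (no confidence) to 1.0 (fully verified)
    history: list = field(default_factory=list)

    def update_reliability(self, observed_accuracy: float, weight: float = 0.3) -> None:
        """Nudge the score toward how well the input matched real outcomes (simple smoothing)."""
        self.history.append(self.reliability)
        self.reliability = (1 - weight) * self.reliability + weight * observed_accuracy

# Hypothetical inputs feeding a go/no-go decision.
inputs = [
    DataInput("historical ROI", "finance warehouse", date(2017, 1, 5), reliability=0.9),
    DataInput("demand forecast", "industry analyst report", date(2018, 3, 2), reliability=0.6),
    DataInput("sentiment signal", "social-media sample", date(2018, 6, 1), reliability=0.4),
]

# Surface the inputs most worth strengthening before a major decision.
for item in sorted(inputs, key=lambda x: x.reliability):
    print(f"{item.name:<18} {item.source:<24} reliability={item.reliability:.2f}")
```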
Many insights are found at the boundaries—where different data sets intersect. Organizations often analyze one repository at a time: HR studies employee performance, operations focuses on asset data, finance reconciles the P&L. Each perspective can be useful on its own, but a significant share of untapped value sits between the silos. When data sets are combined, the signal often becomes clearer and the true drivers become easier to see.
One industrial company illustrates the point. Its core business relied on expensive, state-of-the-art machines capable of multiple processes, and it had invested billions in acquiring them at scale. The machines produced high-quality performance data, and the operations analytics team monitored them closely. Even so, repairs were taking longer and costing more than expected, and downtime was hitting the bottom line. Operations could not find a convincing technical explanation from the asset data alone.
The breakthrough came only when the company examined the equipment performance data alongside HR information. The machines were missing scheduled maintenance checks because the responsible personnel were absent at key times. The root cause was not engineering; it was incentives and attendance behavior. Once the organization aligned the relevant incentives, performance improved. The fix was straightforward—but it was visible only when the company connected different sources of information and treated the system as an integrated whole rather than a set of separate parts.
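In data terms, the discovery amounted to joining two tables that had never been looked at together. The sketch below is a deliberately simplified illustration of that kind of cross-silo join, with hypothetical tables and column names rather than the company's actual systems.

```python
import pandas as pd

# Hypothetical extracts from two silos: scheduled maintenance (operations)
# and technician attendance (HR).
maintenance = pd.DataFrame({
    "machine_id": ["M1", "M1", "M2", "M3"],
    "scheduled_date": pd.to_datetime(["2018-03-01", "2018-04-01", "2018-03-15", "2018-03-20"]),
    "technician_id": ["T7", "T7", "T2", "T9"],
    "check_completed": [True, False, False, True],
})
attendance = pd.DataFrame({
    "technician_id": ["T7", "T2", "T9"],
    "date": pd.to_datetime(["2018-04-01", "2018-03-15", "2018-03-20"]),
    "absent": [True, True, False],
})

# The overlap of the two data sets: missed checks on days the assigned technician was absent.
merged = maintenance.merge(
    attendance.rename(columns={"date": "scheduled_date"}),
    on=["technician_id", "scheduled_date"],
    how="left",
)
missed_due_to_absence = merged[(~merged["check_completed"]) & (merged["absent"] == True)]
print(missed_due_to_absence[["machine_id", "scheduled_date", "technician_id"]])
```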
A useful image for the industrial-company example is a Venn diagram: place two data sets side by side and the overlap often reveals the insight. Extend that concept to dozens of data sets and the potential becomes much greater—provided the pursuit of “more data” does not create so much complexity that it prevents anyone from using the analysis. The practical objective is a multifaceted view that stays decision-ready. If teams run analytics in silos, if results fail under real-world constraints, or—worst of all—if the conclusions are sound but sit unused, then the effort has not delivered value.
Analytics needs a purpose and a plan, but it also needs an operating rhythm that assumes conditions will change. A useful analogy is the OODA loop—observe, orient, decide, act—developed by US Air Force colonel John Boyd. The advantage comes from cycling through decisions quickly and accurately, using feedback to refine the next move. High-performing organizations treat analytics the same way: they do not run a one-time analysis and declare victory; they build a continuous loop that repeatedly tests assumptions, updates inputs, and adapts actions as evidence changes.
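In practice, such a loop can start very simply: observe a fresh batch of data, check whether the model's working assumption still holds, and refresh it when reality has drifted too far. The sketch below is an illustrative toy version, with simulated data and an arbitrary drift threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

def fetch_batch(step):
    """Stand-in for a real data feed; demand drifts upward over time."""
    return rng.normal(loc=100 + 3 * step, scale=5, size=50)

baseline_mean = 100.0          # what the current model assumes ("orient")
drift_threshold = 10.0         # how far reality may move before we act

for step in range(6):          # observe -> orient -> decide -> act, repeatedly
    batch = fetch_batch(step)                     # observe
    drift = abs(batch.mean() - baseline_mean)     # orient: compare to the assumption
    if drift > drift_threshold:                   # decide
        baseline_mean = batch.mean()              # act: refresh the assumption / retrain
        print(f"step {step}: drift {drift:.1f} -> model refreshed")
    else:
        print(f"step {step}: within tolerance ({drift:.1f})")
```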
This approach is common in leading digital organizations, but it also works well outside technology. A global pharmaceutical company, for example, monitors data for early indicators that a process is drifting off track, intervenes quickly, and then tightens the loop so that trials progress faster and more reliably. A consumer-electronics manufacturer similarly moved from data collection to action by building interim architecture that supported a few focused “insights factories” for high-priority use cases. The organization generated recommendations early, incorporated feedback in parallel, and used early wins to fund subsequent scaling.
Digitized data and modern algorithms can dramatically accelerate these loops, particularly when models improve as new inputs arrive. Even so, machine learning is not the only tool that matters, and it should not be treated as a replacement for other methods. Strong decisions often come from triangulation: using multiple analytic techniques, comparing outputs, and combining perspectives to stress-test conclusions. In practice, organizations with sophisticated automation still benefit from sanity-checking results with simpler univariate or multivariate analysis. The strongest loops blend people and machines—continuously monitoring data quality, incorporating new signals, and responding intelligently as conditions evolve.
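One straightforward form of that triangulation is to run a simple baseline alongside the sophisticated model and compare the two. The sketch below uses synthetic data and illustrative model choices; the point is the comparison, not the specific algorithms.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Synthetic data standing in for a real business data set.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 4))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

complex_model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
simple_model = LinearRegression().fit(X_train, y_train)

# Triangulate: if the sophisticated model is not clearly beating a simple
# baseline, or disagrees with it wildly, that is a prompt to dig deeper.
for name, model in [("gradient boosting", complex_model), ("linear baseline", simple_model)]:
    print(f"{name:>18}: MAE = {mean_absolute_error(y_test, model.predict(X_test)):.2f}")
```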
Even excellent algorithms do not sell themselves in a leadership meeting. Data scientists can produce rigorous work yet struggle to communicate it in a way that executives can adopt quickly. That gap is predictable: companies hire technical talent for quantitative depth, not presentation skill. But the consequences can be real. One manufacturer built a technically strong model to support R&D options-pricing decisions; the analysis was sound, but the output felt too complex to decision makers, and adoption stalled.
Usability is therefore not cosmetic; it is a core component of impact. A clean, intuitive interface earns attention and builds trust faster than a dense computation that requires translation every time it is used. That is why consumer-grade product design expectations have moved into enterprise software. Leading analytics organizations increasingly embed design capability in their teams so findings are easy to interpret, compelling to engage with, and simple to act on. When insights are presented clearly and interactively, users across the organization are far more likely to adopt them and keep coming back.
Getting analytics used at scale requires more than technical excellence; it requires a team built for end-to-end delivery. Decisions about which analyses to run, which data sources to prioritize, and how to present results all depend on human judgment and cross-functional context. The most effective analytics programs treat this as a team sport.
At a minimum, that team includes data scientists to develop and apply advanced methods; engineers who can build reliable pipelines and production systems; architects who can design scalable platforms; and interface developers and designers who can translate insight into an experience people will actually use. Equally important are “translators”—professionals who can move between technical and business worlds, turning analytic output into operational decisions, management conversations, and measurable outcomes.
Because demand for these capabilities exceeds supply, organizations rarely succeed by simply paying a premium for new hires. More durable approaches combine selective senior hiring, occasional acquisitions or partnerships for targeted capability, and systematic reskilling of internal talent with quantitative foundations. Several financial institutions and industrial companies have built strong programs this way, and a common ingredient is realism about individual limits paired with confidence in what a complementary team can achieve together. Occasionally, a “unicorn” hire appears, but it is typically more reliable to build a collaborative group where the capabilities exist across the team rather than inside a single person.
Over time, the goal is not to centralize all analytics work in a specialist group, but to make analytics part of daily decision-making across functions. The employees who thrive in that environment tend to share recognizable traits: intellectual curiosity, comfort with ambiguity, openness to diverse views, discipline to iterate, and a strong bias toward real-world outcomes. That last point is the anchor. Analytics is not an isolated science project; it exists to produce practicable insights—and to ensure those insights get used.
Culture is what makes adoption possible. From the moment an organization begins its analytics journey, it should be clear that math, data, and even design are not sufficient on their own. The real power comes from adoption. Analytics cannot remain a point solution sitting off to the side; it has to be embedded into operating models, real processes, and day-to-day workflows. Put simply: analytics does not matter until it gets used.
There are plenty of cautionary examples. Organizations have invested heavily in tools that were technically impressive—detailed seismology forecasts, highly accurate flight-system indicators—only to watch frontline teams ignore them. In one case, a company appeared to have done everything “right.” It had a clear mission focused on top-line growth, strong data sources that were thoughtfully weighted and mined, rigorous analytics, and sharp insights on cross-selling. The interface was even elegant: call-center representatives received real-time pop-up prompts triggered by voice recognition, suggesting relevant products based on what customers were saying. The approach was sophisticated and well executed—except the representatives routinely closed the pop-ups. Their incentives rewarded speed and call volume, not thoughtful cross-sell behavior. Adoption failed because the operating model made the “right” action irrational for the people expected to take it.
When incentives and workflows are aligned, however, analytics can unlock exceptional outcomes. One aerospace firm faced a difficult portfolio of R&D decisions for next-generation products amid significant technological, market, and regulatory uncertainty. Some options looked safer based on history, while others carried high upside but were still unproven. At the same time, industry economics were shifting from product-led to service-centric models, making the trade-offs even more complex. The organization needed a decision approach that was both dynamic and reliable.
By focusing first on the right business questions, stress-testing options, and communicating trade-offs through an interactive visual model that was both usable and compelling, the company uncovered a valuable strategic insight: increasing investment along a particular R&D path would keep three technology options open for longer. That flexibility bought time to see which direction the technology would evolve and avoided the worst-case outcome—getting locked into an expensive, wrong choice. One executive described the advantage as the difference between betting on a horse at the start of the race versus paying a premium to place the bet halfway through, when more information is available.
It is not incidental that this success story ends where it began: with senior leadership engagement. In practice, the strongest early indicator of a successful analytics program is not pristine data or even elite technical talent, but sustained commitment from the C-suite. Leaders are the ones who can clarify the business questions that matter, force cross-functional collaboration, align incentives, and insist that insights translate into action. The objective is not to “have analytics.” The objective is to put analytics to work.

ChangEdwardS partners with creative leaders in business and society to tackle complex and important challenges. Our focus is on business strategies that drive transformational change. We aim to empower organizations to grow and to build sustainable competitive advantages that create a positive impact on society. We bring deep, industry-specific functional expertise and a range of perspectives that challenge the status quo.
© ChangEdwardS. All rights reserved