
Ideas Made to Matter


How to build an effective analytics practice: 7 insights from MIT experts

Faculty members in MIT Sloan’s Master of Business Analytics program don’t just teach courses like Hands-On Deep Learning and Econometrics for Managers.

They also have deep experience applying the tools of modern data science, optimization, and machine learning to solve real-world business problems for organizations like Facebook, Salesforce, Booz Allen Hamilton, and the intelligence wing of the Israel Defense Forces.

Here, they share insights about mistakes to avoid, strategies to adopt, and the developments in analytics and data that excite them most.

Let business decisions drive data strategy

 Associate Professor, Operations Management

One of the biggest mistakes companies make about analytics is the disconnect between the technology and real business decisions. Companies tend to collect data for the sake of having data, and to develop analytics for the sake of having analytics, without thinking about how they are going to use the data and analytics capabilities to inform business decisions.

Successful analytics organizations are always decision-driven. They start by asking what business decisions they need data and analytics for, then invest resources to collect the right data and build the right analytics.

It is equally important to have people in the organization who speak both the language of analytics and the language of business, so they can serve as the bridge between the technical teams and business decision-makers. These employees understand the business needs and how analytics can meet those needs; at the same time, they can communicate the technical solution to business decision-makers in an understandable, intuitive way, rather than delivering a "black-box" solution that is rarely adopted.

Use deep learning to get value from unstructured data

Professor of the Practice, Data Science and Applied Machine Learning

I am personally most excited by deep learning. Traditional analytics methods are very effective for structured data, but we weren’t previously able to get value from unstructured data — images, audio, video, natural language, and so on — without a lot of labor-intensive preprocessing.

With deep learning, this limitation is effectively gone. We can now leverage unstructured and structured data together in a single, flexible, and powerful framework and achieve significant gains relative to what we could do earlier. This is possibly the most significant analytics breakthrough that I have witnessed in my professional career.
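The point about a single framework for structured and unstructured data can be sketched in code. The example below is a minimal, hypothetical illustration using scikit-learn: a TF-IDF vectorizer stands in for the text branch, whereas in practice a pretrained deep model (for example, a sentence transformer) would produce the text embeddings. All data and column names are invented for illustration.

```python
# Sketch: one pipeline combining a structured feature with unstructured text.
# TfidfVectorizer is a stand-in for a deep text encoder; the data is toy data.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Hypothetical dataset: spend (structured) and a free-text review (unstructured)
df = pd.DataFrame({
    "spend": [10.0, 250.0, 15.0, 300.0],
    "review": ["bad slow service", "great fast support",
               "slow and bad", "fast, great help"],
    "churned": [1, 0, 1, 0],
})

features = ColumnTransformer([
    ("text", TfidfVectorizer(), "review"),   # unstructured -> feature vector
    ("num", "passthrough", ["spend"]),       # structured feature passed as-is
])
model = Pipeline([("features", features),
                  ("clf", LogisticRegression())])
model.fit(df[["spend", "review"]], df["churned"])
preds = model.predict(df[["spend", "review"]])
```

The design point is that both data types feed one model; swapping the text branch for a deep encoder changes the representation, not the overall architecture.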

Pick use cases that deliver value

Lecturer, Operations Research and Statistics

The strategy for building an analytics practice is simple. First, identify three sources of use cases and start building them. The three sources are:

  • Use cases that support C-level metrics (think revenue, cost, and risk).
  • Business processes that can be supported by self-serve analytics and dashboards.
  • Compliance must-do activities. 

I use these three sources because they will be looked at differently by the ultimate scorekeepers — the finance function.

The second important activity is to staff the bench to meet demand once these use cases start driving value. Companies will often erroneously hire for a variety of roles and think the work is done. Given the fluidity of the post-COVID work environment and often short tenures of scarce analytics talent, companies must not only staff up but establish pipelines of talent.

One way companies do this is to partner with a higher-ed organization like the MIT Sloan Master of Business Analytics program, work with students on capstone projects, hire those students when they graduate, and then ask them to work with a new crop of students. This virtuous cycle ensures a happy, competent, and staffed bench of analytics talent.

Develop a new organizational language based on data-enabled models

Professor, Operations Management

Data and analytics technologies are critical enablers of intelligent workflows, decision processes, and systems. That said, many companies approach this through a technical lens and miss the fact that it is an end-to-end organizational challenge. 

The opportunity to design intelligent decision processes emerges from the ability to sense the organizational environment better than ever. It requires a new organizational language based on data-enabled models. Organizations must deeply understand their existing decision processes and the data they generate, and then develop layers of data-enabled models to allow the design of innovative intelligent decision processes. To be successful, it’s critical that organizations understand and manage required changes in decision rights and workforce role definitions.

Embrace the full analytics pipeline, upstream and downstream

Assistant Professor, Operations Research and Statistics

Most analytics projects in practice are focused on the development of deep learning and artificial intelligence tools. This is the shiny object that any analytics team is trying to build, improve, and deploy, with an emphasis on technical performance indicators — “My accuracy is 87%,” and so forth.

However, these represent only a narrow subset of the full analytics pipeline, which spans data management, descriptive analytics (such as data visualization and pattern recognition), predictive analytics (using machine learning tools, including but not restricted to deep learning), prescriptive analytics (using optimization), and business impact.

Time and time again, analytics projects take shortcuts across that pipeline — upstream and downstream.

At the upstream level, many analytics teams forgo critical steps to ensure the quality of their data, the representativeness of their data, and their own understanding of their data. One remedy for that is systematic exploratory data analysis baked into the analytics pipeline.
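"Systematic exploratory data analysis baked into the analytics pipeline" can be as simple as automated data-quality checks that run before any modeling step. The sketch below is an assumed design, not a method from the article; the threshold and columns are illustrative.

```python
# Sketch: automated upstream data checks (illustrative thresholds and data).
import pandas as pd

def data_quality_report(df, max_missing=0.05):
    # Flag columns whose missing-value rate exceeds the agreed threshold.
    report = {}
    for col in df.columns:
        missing = float(df[col].isna().mean())
        report[col] = {"missing_rate": missing, "ok": missing <= max_missing}
    return report

df = pd.DataFrame({"spend": [10.0, None, 15.0, 20.0],
                   "region": ["N", "S", "S", "N"]})
report = data_quality_report(df)
```

Running such a report on every refresh turns data understanding from an ad hoc activity into a pipeline stage that can block bad data before it reaches a model.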

At the downstream level, analytics teams oftentimes fail to address the challenges associated with complex, large-scale decision-making in complex systems. This is where analytics projects could gain an additional edge by systematically embedding predictive tools into prescriptive analytics pipelines and decision-support systems.
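Embedding a predictive tool into a prescriptive step can be sketched with a classic newsvendor-style example. Everything here is assumed for illustration: the demand distribution stands in for a fitted predictive model, and the prices are invented.

```python
# Sketch: a predictive step (demand uncertainty) feeding a prescriptive step
# (choosing a stocking quantity). All numbers are illustrative assumptions.
import numpy as np

def predict_demand_samples(rng, n=1000):
    # Stand-in for a predictive model's output distribution.
    return rng.normal(loc=100, scale=20, size=n)

def best_order_quantity(samples, price=5.0, cost=3.0):
    # Prescriptive step: grid-search the stocking level that maximizes
    # expected profit over the predicted demand scenarios.
    candidates = np.arange(50, 151)
    profits = [np.mean(price * np.minimum(q, samples) - cost * q)
               for q in candidates]
    return int(candidates[int(np.argmax(profits))])

rng = np.random.default_rng(0)
q = best_order_quantity(predict_demand_samples(rng))
```

The edge comes from the second function: the model's uncertainty flows directly into the decision, rather than stopping at a point forecast.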

Address specific business use cases to improve decision-making

Assistant Professor, Operations Management

With all the hype around machine learning, it is easy to forget that predictions are most useful when they inform decision-making. I’ve seen organizations roll out predictive models that weren’t going to inform actual decisions at all.

But even if a predictive model directly feeds into decision-making, improving predictions doesn't always improve the decisions. Instead, new analytics capabilities are most powerful when they're built to address specific business use cases and improve decision-making.

To ensure successful outcomes, it’s best for companies to measure these capabilities by the quality of the decisions they produce rather than just the accuracy of the predictions feeding into them.
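Scoring models by decision quality rather than accuracy alone can be made concrete with a small sketch. The decision rule, payoffs, and model outputs below are all invented for illustration; the point is only that the comparison metric is realized value, not hit rate.

```python
# Sketch: compare two churn models by the value of the decisions they induce.
# Decision rule (assumed): reach out when predicted churn risk > 0.5.
# Payoffs (assumed): saving a churner is worth 100; each outreach costs 20.
import numpy as np

def decision_value(pred_prob, actual, threshold=0.5,
                   save_value=100.0, outreach_cost=20.0):
    intervene = pred_prob > threshold
    # Each outreach costs; it only pays off if the customer was a churner.
    return float(np.sum(intervene * (actual * save_value - outreach_cost)))

actual = np.array([1, 0, 1, 0, 1])
model_a = np.array([0.6, 0.4, 0.7, 0.3, 0.2])  # misses one churner
model_b = np.array([0.9, 0.6, 0.8, 0.1, 0.7])  # catches all, one false alarm

va = decision_value(model_a, actual)
vb = decision_value(model_b, actual)
```

Here the model with a false alarm still produces more value than the more "cautious" one, which a pure accuracy comparison would not reveal.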

Establish a centralized system for randomized experiments

Associate Professor, Marketing

Firms building an analytics process must have consistent definitions and practices. The foundation for trustworthy analytics is consensus on how basic metrics are defined and how common analyses are conducted.

This is one more indirect benefit of setting up a centralized system for randomized experiments (A/B tests and beyond): it forces teams to decide which metrics will appear when analyzing a given test, which in turn means agreeing on exactly how each metric is defined, whether number of days active, time spent on site, or ad revenue per user.

These benefits are on top of the more direct benefits of making it easier to run experiments and making their results standardized and trustworthy.
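One way to picture the "consensus on metric definitions" idea is a small metric registry that every experiment analysis reads from. This is an assumed design sketch, not a description of any real system; the event schema and data are invented.

```python
# Sketch: a centralized metric registry (assumed design). Every A/B analysis
# pulls metric definitions from one place, so "days active" means the same
# thing to every team. Events and users below are toy data.
import numpy as np

METRICS = {
    # One agreed definition per metric name.
    "days_active": lambda events: len({e["day"] for e in events}),
    "time_on_site": lambda events: sum(e["seconds"] for e in events),
}

def metric_per_user(users, name):
    fn = METRICS[name]
    return np.array([fn(events) for events in users], dtype=float)

def ab_difference(control, treatment, name):
    # Standardized readout: treatment-minus-control difference in means.
    return float(metric_per_user(treatment, name).mean()
                 - metric_per_user(control, name).mean())

control = [[{"day": 1, "seconds": 30}],
           [{"day": 1, "seconds": 10}, {"day": 2, "seconds": 20}]]
treatment = [[{"day": 1, "seconds": 40}, {"day": 3, "seconds": 5}],
             [{"day": 2, "seconds": 60}, {"day": 4, "seconds": 15}]]

lift = ab_difference(control, treatment, "days_active")
```

Because both arms use the same registry entry, any disagreement about what "days active" means surfaces once, in the registry, instead of in every team's analysis code.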

