19 August 2018

How higher-education institutions can transform themselves using advanced analytics

By Marc Krawitz, Jonathan Law, and Sacha Litman

Many college and university leaders remain unsure of how to incorporate analytics into their operations. What really works?

Leaders in most higher-education institutions generally understand that advanced analytics can significantly transform the way they work by enabling new ways to engage current and prospective students, increase enrollment, improve retention and completion rates, and even boost faculty productivity and research. However, many remain unsure of how to incorporate analytics into their operations and achieve the intended outcomes. What really works? Is it a commitment to new talent, technologies, or operating models? Or all of the above?


To answer these questions, we interviewed more than a dozen senior leaders at colleges and universities known for their transformations through analytics. We also conducted in-depth, on-campus visits at the University of Maryland University College (UMUC), a public institution serving primarily working adults through distance learning, and Northeastern University, a private nonprofit institution in Boston, to understand how their transformations unfolded.1 We combined insights from these interviews and site visits with those gleaned from our work on more than 100 higher-education engagements across North America over the past five years, and we tapped McKinsey’s wide-ranging expertise in analytics-enabled transformations in both the public and private sectors.

Our conversations and engagements revealed several potential pitfalls that organizations may face when building their analytics capabilities—as well as several practical steps education leaders can take to avoid these traps.

Common pitfalls

Transformation through advanced analytics can be difficult for any organization; in higher education, the challenges are compounded by sector-specific factors related to governance and talent. Leaders in higher education cannot simply pay lip service to the power of analytics; they must first address some or all of the most common obstacles.

Being overly focused on external compliance. Many higher-education institutions’ data analytics teams focus most of their efforts on generating reports to satisfy operational, regulatory, or statutory compliance. The primary goal of these teams is to churn out university statistics that accrediting bodies and other third parties can use to assess each institution’s performance. Any requests outside the bounds of these activities are considered emergencies rather than standard, necessary assignments. Analytics teams in this scenario have very limited time to support strategic, data-driven decision making.

Isolating the analytics program in an existing department. In our experience, analytics teams in higher-education institutions usually report to the head of an existing function or department—typically the institutional research team or the enrollment-management group. As a result, the analytics function becomes associated with the agenda of that department rather than serving as a central resource for the whole institution, and it has little to no contact with executive leadership. Under this common scenario, the impact of analytics remains limited, and analytics insights are not embedded into the day-to-day decision making of the institution as a whole.

Failing to establish a culture of data sharing and hygiene. In many higher-education institutions, there is little incentive (and much reluctance) to share data. As a result, most higher-education institutions lack good data hygiene—that is, established rules for who can access various forms of data, as well as formal policies for how they can share those data across departments. For example, analytics groups in various university functions may use their own data sets to determine retention rates for different student segments—and when they get together, they often disagree on which set of numbers is right.

Compounding this challenge, many higher-education institutions struggle to link the myriad legacy data systems teams use in different functions or working groups. Even with the help of a software platform vendor, installing systems, training staff, and winning buy-in for these technical changes can take two to three years before institutions see tangible outcomes from their analytics programs. In the meantime, institutions struggle to instill a culture and processes built around the possibilities of data-driven decision making.
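To make the data-hygiene point concrete, consider what a shared metric definition might look like in practice. When every analytics group computes retention from the same canonical table with the same agreed-upon logic, the debate over whose numbers are right largely disappears. The sketch below is illustrative only; the table and column names are hypothetical.

```python
# A minimal sketch of a single, agreed-upon retention-rate definition.
# Table and column names are hypothetical, for illustration only.
import pandas as pd

# One canonical enrollment table that every analytics group queries.
enrollment = pd.DataFrame({
    "student_id":         [1, 2, 3, 4, 5, 6],
    "segment":            ["transfer", "first_time", "first_time",
                           "transfer", "first_time", "first_time"],
    "entered_fall":       [2016, 2016, 2016, 2016, 2016, 2016],
    "returned_next_fall": [True, True, False, True, True, False],
})

def retention_rate(df: pd.DataFrame, segment: str) -> float:
    """One shared definition: share of a cohort returning the following fall."""
    cohort = df[df["segment"] == segment]
    return round(cohort["returned_next_fall"].mean(), 2)

print("first_time:", retention_rate(enrollment, "first_time"))
print("transfer:  ", retention_rate(enrollment, "transfer"))
```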

Lacking the appropriate talent. Budgets and other constraints can make it difficult for higher-education institutions to meet market rates for analytics talent. Colleges and universities could potentially benefit from sourcing analytics talent among their graduate students and faculty, but it can be a struggle to attract and retain them. Furthermore, to successfully pursue transformation through analytics, higher-education institutions need leaders who are fluent not only in management but also in data analytics, and who can solve problems in both areas.

Deploying best practices

These challenges can seem overwhelming, but transformation through analytics is possible when senior leaders in higher-education institutions endeavor to change both operations and mind-sets.

Leaders point to five action steps to foster success:

Articulate an analytics mandate that goes beyond compliance. Senior leaders in higher education must signal that analytics is a strategic priority. Indeed, to realize the potential of analytics, the function cannot be considered solely as a cost center for compliance. Instead, this team must be seen as a source of innovation and an economic engine for the institution. As such, leaders must articulate the team’s broader mandate. According to the leaders we interviewed, the transformation narrative must focus on how analytics can help the institution facilitate the student journey from applicant to alumnus while providing unparalleled learning, research, and teaching opportunities, as well as foster a strong, financially sustainable institution.

Establish a central analytics team with direct reporting lines to executive leaders. To mitigate the downsides of analytics teams couched in existing departments or decentralized across several functions, higher-education leaders must explicitly allocate the requisite financial and human resources to establish a central department or function to oversee and manage the use of analytics across the institution. This team can be charged with managing a central, integrated platform for collecting, analyzing, and modeling data sets and producing insights quickly.

For example, UMUC has a designated “data czar” to help define standards for how information is captured, managed, shared, and stored online. When conflicts arise, the data czar weighs in and helps de-escalate problems. Having a central point of contact has improved the consistency and quality of the university’s data: there is now a central source of truth, and all analysts have access to the data. Most important, the university now has a data evangelist who can help cultivate an insights-driven culture at the institution.

In another example, leaders at Northeastern created an analytics center of excellence structured as a “virtual” entity. The center stands apart from any single department and is governed by a series of rotating chairs to ensure the analytics team is aware of, and pays equal attention to, priorities from across the university.

In addition to enjoying autonomous status outside a subfunction or single department, the analytics team should report to the most-senior leaders in the institution—in some cases, the provost. When given a more substantial opportunity to influence decisions, analytics leaders gain a greater understanding of the issues facing the university and how they affect the institution’s overall strategy. Leaders can more easily identify the data sets that might provide relevant insights to university officials—not just in one area, but across the entire organization—and they can get a jump-start on identifying possible solutions.

Analysts at Northeastern, for instance, were able to quantify the impact of service-learning programs on student retention, graduation, and other factors, thereby providing support for key decisions about these programs.

Win analytics buy-in from the front line and create a culture of data-driven decision making. To overcome the cultural resistance to data sharing, the analytics team must take the lead on engendering meaningful communications about analytics across the institution. To this end, it helps to have members of the centralized analytics function interact formally and frequently with different departments across the university. A hub-and-spoke model can be particularly effective: analysts sit alongside staffers in the operating units to facilitate sharing and directly aid their decision making. These analysts can serve as translators, helping working groups understand how to apply analytics to tackle specific problems, while also taking advantage of data sets provided by other departments. The university leaders we spoke with noted that their analysts may rotate into different functional areas to learn more about the university’s departments and to ensure that the department leaders have a link back to the analytics function.

Of course, having standardized, unified systems for processing all university data can help enable robust analysis. However, universities seeking to create a culture of data-driven decision making need not wait two years until a new data platform is up and running. Instead, analysts can define use cases—that is, places where data already exist and where analysis can be conducted relatively quickly to yield meaningful insights. Teams can then share success stories and evangelize the impact of shared data analytics, thereby prompting others to take up their own analytics-driven initiatives.

The analysts from UMUC’s decision-support unit sometimes push relevant data and analyses to the relevant departments to kick-start reflection and action, rather than waiting for the departments to request the information. However, the central unit avoids producing canned reports; analysts tend to be successful only when they engage departments in an honest and objective exploration of the data without preexisting biases.

Strengthen in-house analytical capabilities. The skills gap is an obvious impediment to colleges’ and universities’ attempts to transform operations through advanced analytics—thus, it is perfectly acceptable to contract out work in the short term. However, while supplementing a skills gap with external expertise may help accelerate transformations, it can never fully replace the need for in-house capacity; the effort to push change across the institution must be owned and led internally.

To do so, institutions will need to change their approaches to talent acquisition and development. They may need to look beyond the usual sources to find professionals who understand core analytics technologies (cloud computing, data science, machine learning, and statistics, for instance) as well as design thinking and operations. Institutions may also need to appeal to new hires by offering competitive financial compensation and emphasizing the opportunity to work autonomously on intellectually challenging projects that will make an impact on generations of students and contribute to an overarching mission.

Do not let great be the enemy of good. It takes time to launch a successful analytics program. At the outset, institutions may lack certain types of data, and not every assessment will yield insightful results—but that is no reason to pull back on experimentation. Colleges and universities can instead deploy a test-and-learn approach: identify areas with clear problems and good data, conduct analyses, launch necessary changes, collect feedback, and iterate as needed. These cases can help demonstrate the impact of analytics to other parts of the organization and generate greater interest and buy-in.

It is easy to forget that analytics is a beginning, not an end. Analytics is a critical enabler that helps colleges and universities solve tough problems—but leaders in higher-education institutions must devote just as much energy to acting on the insights from the data as they do to enabling analysis of the data. Implementation requires significant changes in culture, policy, and processes. When outcomes improve because a university successfully implemented change—even in a limited environment—the rest of the institution takes notice. This can strengthen the institutional will to push further and start tackling other areas of the organization that need improvement.

Some higher-education institutions have already overcome these implementation challenges and are realizing significant impact from their use of analytics. Northeastern University, for example, is using a predictive model to determine which applicants are most likely to be the best fit for the school if admitted. Its analytics team relies on a range of data to make forecasts, including students’ high school backgrounds, previous postsecondary enrollments, campus visit activity, and email response rates. According to the analytics team, an examination of the open rate for emails was particularly insightful as it was more predictive of whether students actually enrolled at Northeastern than what the students said or whether they visited campus.
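As a rough illustration of the kind of model the Northeastern team describes, the sketch below fits a simple enrollment-propensity model on synthetic applicant records. The feature names, the toy data, and the choice of logistic regression are assumptions made for illustration; they do not represent the university's actual pipeline.

```python
# A hypothetical enrollment-propensity sketch: predict whether an admitted
# applicant will enroll from admissions and engagement signals. All data,
# column names, and the model choice are illustrative assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression

applicants = pd.DataFrame({
    "hs_gpa":           [3.9, 3.4, 3.7, 2.9, 3.8, 3.2, 3.6, 3.1],
    "prior_enrollment": [0,   1,   0,   1,   0,   0,   1,   0],    # prior postsecondary enrollment
    "campus_visit":     [1,   0,   1,   0,   0,   1,   0,   0],    # visited campus before deciding
    "email_open_rate":  [0.8, 0.2, 0.9, 0.1, 0.7, 0.4, 0.3, 0.2],  # share of outreach emails opened
    "enrolled":         [1,   0,   1,   0,   1,   0,   0,   0],    # did the admitted student enroll?
})

features = ["hs_gpa", "prior_enrollment", "campus_visit", "email_open_rate"]
model = LogisticRegression(max_iter=1000).fit(applicants[features], applicants["enrolled"])

# On real data you would validate on a held-out admissions cycle; here we simply
# inspect the fitted coefficients to see which signals carry the most weight.
print(dict(zip(features, model.coef_[0].round(2))))
```

Coefficient signs and magnitudes give only a first read on which signals matter; a real team would pressure-test any such finding, including the predictive value of email open rates, on a held-out cohort before acting on it.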

Meanwhile, the university also looked at National Student Clearinghouse data, which tracks where applicants land at the end of the enrollment process, and learned that the institutions it had considered core competitors were not. Instead, competition was coming from sources it had not even considered. It also learned that half of its enrollees were coming from schools that the institution’s admissions office did not visit. The team’s overall analysis prompted Northeastern to introduce a number of changes to appeal to those individuals most likely to enroll once admitted, including offering combined majors. The leadership team also shifted some spending from little-used programs to bolster programs and features that were more likely to attract targeted students. Due in part to these changes, Northeastern improved its U.S. News & World Report ranking among national universities from 115 in 2006 to 40 in 2017.

In another example, in 2013 UMUC was trying to pinpoint the source of a decline in enrollment. It was investing significant dollars in advertising and was generating a healthy number of leads—however, conversion rates were low. Data analysts at the institution assessed the university’s returns on investment for various marketing efforts and discovered a bottleneck—UMUC’s call centers were overused and underresourced. The university invested in new call-center capabilities and within a year realized a 20 percent increase in new student enrollment while spending 20 percent less on advertising.
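A simple funnel analysis of the sort described above can surface this kind of bottleneck. In the sketch below, spend, leads, call-center contacts, and enrollments are compared by channel; all figures and channel names are invented for illustration and are not UMUC's data.

```python
# A hypothetical marketing-funnel diagnosis: compare conversion rates and cost
# per enrollee by channel to locate the weakest step. All figures are invented.
import pandas as pd

funnel = pd.DataFrame({
    "channel":   ["search_ads", "social", "radio", "referral"],
    "spend":     [400_000, 250_000, 150_000, 50_000],  # advertising dollars
    "leads":     [12_000,   9_000,   3_000,  2_500],   # inquiries generated
    "contacted": [5_500,    3_800,   1_400,  2_100],   # leads reached by the call center
    "enrolled":  [700,      450,     160,    420],     # new students enrolled
})

funnel["lead_to_contact"]   = (funnel["contacted"] / funnel["leads"]).round(2)
funnel["contact_to_enroll"] = (funnel["enrolled"] / funnel["contacted"]).round(2)
funnel["cost_per_enrollee"] = (funnel["spend"] / funnel["enrolled"]).round(0)

print(funnel[["channel", "lead_to_contact", "contact_to_enroll", "cost_per_enrollee"]])
```

In this toy data, the high-spend channels lose more than half of their leads before anyone is contacted, while the smaller referral channel reaches most of its leads; when every high-volume channel stalls at the same step, the constraint is more likely downstream capacity (such as an overstretched call center) than weak lead generation.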

The benefits we discussed barely scratch the surface; the next wave of advanced analytics will, among other things, enable bespoke, personalized student experiences, with teaching catered to students’ individual learning styles and competency levels. To realize the great promise of analytics in the years to come, senior leaders must focus on more than just making incremental improvements in business processes or transactions. Our conversations with leaders in higher education point to the need for colleges and universities to establish a strong analytics function as well as a culture of data-driven decision making and a focus on delivering measurable outcomes. In doing so, institutions can create significant value for students—and sustainable operations for themselves.
About the author(s)
Marc Krawitz is an associate partner in McKinsey’s New Jersey office. Jonathan Law is a partner in the New York office and leads the Higher-Education Practice. Sacha Litman is an associate partner in the Washington, DC, office and leads public and social sector analytics.

The authors would like to thank business and technology leaders at the University of Maryland University College and Northeastern University for their contributions to this article.