Use NPS Scores to Improve Programs

IMD relies on Net Promoter Scores to determine who supports the school and how programs can be improved.

The customer used to be king. Today, the customer is god, with the power to determine if a company lives or dies. Because online platforms allow consumers to communicate their opinions of products and services to the whole world, and potential buyers rely on candid testimonials from existing consumers, customers can make or break a company. That’s true for retail giants like Apple and Amazon, but it’s equally true for universities and the business schools within them. Any organization that wants to survive must listen to the voices of its consumers.

How can business schools determine what their students and other stakeholders admire about their programs and what they’d like to see improved? At IMD, we’ve been relying on the Net Promoter Score (NPS) since 2009. NPS is a management tool that measures customer loyalty by asking a single question: “How likely is it that you would recommend us to a friend or colleague?” Respondents answer on a 0-to-10 scale: those who answer 9 or 10 are promoters, those who answer 6 or below are detractors, and those in between are passives. The final score, which can range from -100 to 100, is calculated by subtracting the percentage of detractors from the percentage of promoters.
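
For readers who want to see the arithmetic, here is a minimal sketch in Python, assuming responses are recorded on the standard 0-to-10 scale; the example ratings are invented.

```python
def nps(ratings):
    """Compute a Net Promoter Score from 0-10 ratings.

    Promoters answer 9-10, detractors 0-6, passives 7-8.
    The score is the percentage of promoters minus the
    percentage of detractors, so it ranges from -100 to 100.
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Example: 5 promoters, 3 passives, and 2 detractors out of 10
# responses gives 50% - 20%, an NPS of 30.
print(nps([10, 9, 9, 10, 9, 8, 7, 8, 5, 3]))  # 30.0
```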

To determine our NPS, we survey our participants multiple times after they’ve taken our programs. Our program directors analyze the comments to gain insights into customer satisfaction and use key takeaways to spark program improvements. We follow four commandments that might be useful for other schools looking to build relationships with stakeholders:

1. Ask the ultimate question. When we ask students and other “customers” of the school how likely they are to recommend our programs, we learn a great deal about what’s working and what we could do better to strengthen the impact of our programs. We find that NPS results usually correlate with program evaluations. The difference is that NPS captures the overall experience and long-term impact, while evaluations measure satisfaction with the individual sessions and details of a program.
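
One quick way to check that relationship on historical data is to correlate per-program NPS with mean evaluation scores. The sketch below uses Python’s standard library; the paired figures are invented for illustration.

```python
from statistics import correlation  # Python 3.10+

# Invented per-program figures: NPS alongside mean evaluation
# scores on a 1-5 scale. In practice these would come from the
# school's survey database.
nps_scores = [42, 55, 61, 38, 70]
eval_means = [4.1, 4.4, 4.6, 3.9, 4.8]

# A Pearson correlation close to 1 means NPS and evaluations
# move together, as we typically observe.
print(correlation(nps_scores, eval_means))
```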

2. Ask it again. We don’t just invite customers to evaluate our programs upon completion; we follow up six months later to solicit feedback about how effective our programs are and how they help participants transfer knowledge to the workplace. Finally, one year after participants have completed programs, we reconnect with them to see if their educational experience is still having an impact on their professional lives. Our goal is to make sure that our programs continue to deliver real results that allow clients to put their learning to use once they are no longer with us.

The lesson here is to use more than one measurement. The more we learn about how well our programs are working, and the more often we learn it, the better we can gauge what we’re doing well and where we can improve.
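
To make the cadence in point 2 concrete, here is a sketch of how a survey system might schedule the three waves. The class and field names are hypothetical, and the six-month offset is approximated as 182 days.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Survey waves: at completion, roughly six months out, and one
# year out, mirroring the follow-up cadence described above.
WAVE_OFFSETS = [timedelta(days=0), timedelta(days=182), timedelta(days=365)]

@dataclass
class Participant:
    email: str
    program_end: date

def survey_dates(participant: Participant) -> list[date]:
    """Dates on which to send this participant the NPS survey."""
    return [participant.program_end + offset for offset in WAVE_OFFSETS]

# Example: a participant who finished a program on 30 June 2024.
alum = Participant("alum@example.com", date(2024, 6, 30))
print(survey_dates(alum))
```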

3. Act on the findings—and fast. It’s great to know what clients think about every aspect of our programs, but once we know what they think, we have to get to work to improve our products and services.

Are many former students giving the same negative critique about a course? If so, we have the professor immediately fix the problems that can be easily addressed and begin working on the longer-term improvements that might take time to implement. Is another program receiving consistently high marks? Then we make sure we continue to offer it and do what we can to enhance it.
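
A simple way to spot a recurring critique is to count how often the same theme appears in comments about each course. The sketch below relies on a hand-rolled keyword list and an arbitrary threshold; both are illustrative, as is the sample data, and real comment analysis would be more sophisticated than keyword matching.

```python
from collections import Counter

THEMES = ("pace", "workload", "cases")  # illustrative themes
THRESHOLD = 2                            # illustrative cutoff

def recurring_critiques(feedback, themes=THEMES, threshold=THRESHOLD):
    """Return (course, theme) pairs mentioned at least `threshold` times."""
    counts = Counter(
        (course, theme)
        for course, comment in feedback
        for theme in themes
        if theme in comment.lower()
    )
    return [pair for pair, n in counts.items() if n >= threshold]

sample = [
    ("Finance I", "The pace was far too fast"),
    ("Finance I", "Pace left no time for questions"),
    ("Strategy", "Great cases and discussion"),
]
print(recurring_critiques(sample))  # [('Finance I', 'pace')]
```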

4. Respond to all participants. As I mentioned, the survey allows us to differentiate between promoters, passives, and detractors, and we respond to them all. For instance, we know it’s critical to keep our promoters loyal, so we engage with them however we can. We meet them, listen to what they have to say, and give them special treatment—which might consist of wishing them happy birthday, meeting them whenever they’re back on campus, or offering them free seats at popular events. But we also don’t ignore those who have had less enjoyable experiences. We try to find out what went wrong and see how we can rectify the situation.
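
Responding to all three segments can be made systematic. Below is a sketch that buckets a 0-to-10 rating using the standard NPS cutoffs and looks up a follow-up action; the actions are examples drawn from the practices above, not a literal IMD workflow.

```python
def segment(rating: int) -> str:
    """Bucket a 0-10 rating using the standard NPS cutoffs."""
    if rating >= 9:
        return "promoter"
    if rating >= 7:
        return "passive"
    return "detractor"

# Example follow-ups per segment, echoing the practices above.
FOLLOW_UP = {
    "promoter": "keep engaged: event invitations, personal notes",
    "passive": "thank and keep informed about new programs",
    "detractor": "reach out to learn what went wrong and rectify it",
}

for rating in (10, 8, 4):
    print(rating, segment(rating), "->", FOLLOW_UP[segment(rating)])
```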

To sum up, we know that it’s essential to always stay in touch with our students and other stakeholders. Our brand isn’t what we want it to be; it’s what they think it is. We must ask the ultimate question, because our students have the ultimate power to determine our school’s success.

For more information about Net Promoter Score, visit www.netpromoter.com/know.

Dominique Turpin is president of IMD in Lausanne, Switzerland.