Making an Impact

Business school administrators know it’s not enough for their faculty to produce good research; they also must share it with key constituents. But how can they measure the impact of scholarly contributions? An AACSB study looks for the answer.

As a Jesuit institution, Saint Joseph’s University in Philadelphia, Pennsylvania, considers it essential to blend ethical ideals into its educational offerings. When the Haub School of Business codified its strategic mission five years ago, faculty identified key goals they would try to meet in their research and teaching, including upholding Jesuit ideals and serving key industries.

So the Haub School was in good shape to participate in an exploratory study launched in 2008 by AACSB International that asked schools to evaluate whether their scholarly contributions truly support their stated missions. Haub faculty were asked to self-assess how well their research aligned with five specific criteria drawn from that mission statement so the school could judge how well it was reaching its target markets. The answer came as a pleasant surprise, says Stephen Porth, associate dean for academic affairs: Eighty-eight percent of 545 intellectual contributions met at least one criterion.

Schools must be clear-eyed about their research missions, certain of their markets, and honest about where they could improve.

Haub is among ten schools participating in the AACSB study, which is specifically looking at whether it’s feasible to implement new accreditation standards that would require schools to “demonstrate the impact of faculty intellectual contributions on targeted audiences.” Assessing such an impact means schools first must identify the audiences they wish to serve, then devise metrics that will help them determine if they’ve been successful. In other words, they must be clear-eyed about their research missions, certain of their markets, and honest about where they could improve.

The ten participating schools are deep in the process of accreditation maintenance. So far, three schools—Saint Joseph’s, California State University in Northridge, and the University of Mannheim in Germany—have completed their reaccreditation efforts and submitted their research assessments to AACSB. The Queensland University of Technology in Brisbane, Australia, is still a year out from finalizing reaccreditation, but is well into its study examining the impact of its research.

The other schools in the study include Binghamton University, State University of New York; College of William and Mary in Williamsburg, Virginia; Loyola University Chicago in Illinois; University of Alberta in Edmonton; University of Evansville in Indiana; and University of Minnesota in Minneapolis and St. Paul.

All of them will share with other AACSB members how they approached the exercise, how they devised evaluation metrics, and what changes they plan to make based on the information they uncovered. Some will make presentations at AACSB conferences; others will participate in webinars; and all of them will contribute to AACSB’s Resource Center.

It remains to be seen whether AACSB will decide to recommend any changes to the accreditation standards or their accompanying guidelines for interpretation. But four of the schools participating in the study have already found the exercise to be useful—and revealing.

The Jesuit Mission

At Saint Joseph’s, faculty have a pretty clear-cut idea of which markets they’re trying to serve. “We have an industry-focused program with emphases on food marketing, pharmaceutical marketing, and insurance and risk management,” says Joseph DiAngelo, dean of the Haub School. “We’re also a Jesuit school, so we’re heavily focused on Jesuit ideals, including ethics.”

To determine the alignment of faculty research with the school’s mission, Porth and DiAngelo worked closely with finance professor Jean Heck. He developed a database and spreadsheet for each faculty member to identify which journal articles and books they’d written within a specific period of time. Then the faculty members evaluated how their research contributed to five key parts of the school’s mission statement, which include:

  • Advance the body of knowledge in the field.
  • Earn recognition as a leading Jesuit school of business.
  • Meet the Jesuit ideals of ethics, social justice, and responsibility.
  • Contribute to the practice of management and teaching.
  • Meet the needs of key industries and strategic niches.

The fact that almost 90 percent of the research met one of these objectives “made us feel good that we were achieving what we had intended in the five goals of our mission,” says DiAngelo. Porth and DiAngelo also were pleased that a significant amount of faculty research was focused on ethics. DiAngelo adds, “In fact, that became a major focal point of the accreditation review for the peer review team. We subsequently put together a monograph of all the work in ethics. We never would have thought to do that if we hadn’t looked at our research in this way, so this was an unexpected byproduct of the exercise.”

The assessment study also exposed gaps in the research portfolio. Says DiAngelo, “We learned we need to do a little more pedagogical research. We could also probably do more basic research, but since we’re not a doctoral-granting institution, we tend to focus more on applied work.”

Whether or not assessing intellectual contributions becomes part of the accreditation standards, DiAngelo believes doing so is a useful exercise. “It helped us evaluate our mission, consider how it’s carried out, and look at how it impacts what we do,” he says. “We’ve been asking ourselves for years, ‘What does it mean to be a Jesuit institution? How is it different from a secular one?’ I think this study helped us see that the mission isn’t just words on a paper. It really has affected some of the ways people at our college conduct research and teach.”

The Regional University

At Cal State Northridge, the assessment project was approached from a “reverse engineering standpoint,” says Judy Hennessey, professor of marketing and associate dean at CSUN’s College of Business and Economics. The school already had a good understanding of its regional mandate and its two target audiences—the community at large and businesses within the community. So Hennessey and her team focused on answering two questions: How does our research impact business in our community and forward the school’s mission outside the classroom? What kind of quantitative and qualitative assessment will capture this impact?

AACSB is emphasizing the need for broad faculty involvement in determining how the school can assess achievement of its research objectives, but that’s hard to get in wide-open forums, Hennessey says. So she and her team first asked department chairs to get faculty input on those two questions so preliminary debates could be held within “cozier” environments.

What tended to bubble up at the beginning, she says, were familiar discussions about the need for publishing top research in high-quality journals—which was not the focus of this particular exercise. “We had to say, ‘Yes, that’s important,’ and then set it aside,” says Hennessey. “We know that we can’t go out into the community as experts if we aren’t valid as academics, so we let the faculty bring that forward as a given. Then we could talk about how our research impacts our other, non-academic audiences.”

Faculty ultimately identified several indicators that demonstrate what impact their research is having on their region. One indicator is the work being done by their various research centers, which turn out economic analyses and forecasts for the community, support small businesses and startups, and offer management and organizational development assistance. Measuring the impact of the centers was a little harder.

“It’s easy to count how many small businesses we served and how many people attended our economic summit,” says Hennessey. “A better measure would be how much money we saved those businesses, but those numbers are harder to get.”

Another indicator that faculty identified is the way the school uses media in its outreach efforts. “We’ve found it valuable to put out expert commentaries or do radio shows that accompany our economic summits and regional forecasts,” says Hennessey. In fact, the school participates in a weekly radio broadcast on finance and the economy that appeals to executives and laypeople interested in money issues. Until recently the school also published a weekly column in a regional newspaper.

Again, measuring the value of these efforts is the tricky part. “We can count up the times and minutes of exposure, but that doesn’t really capture impact,” Hennessey says. “We’re still struggling with that. On the other hand, sometimes just describing the things you do helps you understand that they matter.”

The next goal, Hennessey says, is to identify areas that need improvement—and then make changes. For instance, she says, the school wants to strive harder to create job placement and internship programs for community members. It also wants to fill gaps in its community outreach programs. “We saw a lot of places where we had expertise that wasn’t known or being utilized,” she says.

In addition, Hennessey and her team are trying to incorporate what they’ve learned into their ongoing strategic planning process. “We have the beginnings of a statement that will be an extension of our mission, and it drives home what we mean when we say we want to make an ‘impact on the community,’” she says. “Next year, we’ll make an active attempt to see whether we’ve followed these new strategic directions for filling the gaps we exposed in this study.”

All in all, she believes the exercise will help the school clarify its mission and establish benchmarks for how well it is meeting that mission—something any institution needs to do. “When we teach students to develop a business plan, we tell them to first size up where they are and evaluate their strengths, weaknesses, opportunities, and threats. Then they need to progress in a systematic way,” says Hennessey. “I think we’re just using what we teach.”

The German System

For the Business School of the University of Mannheim, assessing the impact of research meant first figuring out how to motivate faculty to participate, since German schools are not led by deans who can insist on compliance. Therefore, says Dirk Simons, BSUM’s associate dean for research, the school set up working groups from all departments to discuss the impact of their research and posted results online so everyone could comment on discussions.

The next step was to identify the school’s primary target markets, which turned out to be the international academic research community, the corporate sphere, the public sector—and junior faculty working on their own career trajectories. In Germany, Simons explains, assistant professors have six years to win full professorships; after that they must leave the university system. If they’re assistant professors at universities with active research groups, they increase their chances in the job market. “We consider them part of our audience, because we need to manage good research projects so they can develop their skills,” says Simons.

After identifying its audiences, the school listed indicators that would help it measure the effect of faculty research. Professors have an impact, it was determined, when they organize conferences, run corporate workshops, or serve on governmental advisory boards.

Next, Simons and his team analyzed the fact that the German university system is driven primarily by reputation. For instance, if a department chair is highly respected, his area attracts more funding, which creates a better research environment, which attracts better assistant professors, who do more research and go on to better jobs. At the same time, professors can only enter salary negotiations with their home universities if they have offers from other universities, and they only attract these offers by building good reputations.

“So we asked, ‘What is creating reputation?’ and we came up with the waterfall model,” says Simons. “A professor’s reputation is first bid up in the research community, when he goes to conferences, publishes in journals, and presents to colleagues. A corporation looking for an advisor will contact someone who has a good reputation in the academic field. Students want to work for professors with good corporate reputations, because that increases their career opportunities.”

Everyone at Mannheim agreed up front to recognize several forms of worthwhile research over and above publication in a few peer-reviewed journals. “Because we’re part of the research community, we have to earn a national and international reputation, but because we’re a public university, we have to serve the public and corporate worlds as well,” says Simons. “This was a valuable realization.”

The exercise also made it clear that, while it was important to produce research that targeted each market, no one professor could do it all. So now faculty expect that a certain percentage of them will focus on academic research, another percent on the corporate market, and another percentage on public institutions.

Simons found the exercise useful for several other reasons. “Business schools are under pressure to answer the question of whether they’re doing any valuable research at all,” he says. “Through this study, we were able to document that we’re spending a lot of effort and doing a lot of good for our particular region of the European Union.”

While BSUM performed its initial evaluation of research output as part of the one-time exploratory study, the faculty have agreed to continue with the exercise every two years. “This way we’ll know if we’ve achieved our targets and we can talk about how to adjust our portfolio if we haven’t,” says Simons. “We want to try to live the process.”

Impact in Australia

Administrators at the Queensland University of Technology find that AACSB’s exploratory study dovetails perfectly with their own goals of improving the quality and impact of faculty research. The Faculty of Business instituted those goals recently as part of a response to a new national directive, Excellence in Research for Australia (ERA), which emphasizes quality over quantity of published research.

Publication in respected journals is only one way to make a mark; another is to disseminate research results through the popular press.

The first part of QUT’s strategy revolves around encouraging faculty to publish in top journals as identified by ERA and the Australian Business Deans Council. Professors who publish in the best journals—ranked as A*, A, or B—receive workload credits and other incentives. That’s required a shift in mindset, since only 57 percent of QUT’s faculty research appeared in those top journals between 2006 and 2008. The business school set a target of having 80 percent of faculty research published in highly ranked journals by 2012, a goal that had an immediate effect, says Peter Little, executive dean of the Faculty of Business. In 2009, 70 percent of the school’s refereed articles were in those top publications.

At the same time, QUT is storing information about faculty publication records, adds Little. “So now we can have a comprehensive picture at any one time of our faculty’s total output and the quality of their output.” The school also considers quality of output when it recruits and hires new faculty.

The new study might provide a way to quantify the impact of research—and show everyone how much it matters.

In addition to focusing on publication quality, the school is looking at ways to measure impact, especially on business. While some of QUT’s projects are funded by Australia’s national competitive research grant body, many receive money from corporations and individual businesspeople. Others are launched through cooperative research centers that bring together government, industry, and academia for extensive projects. “We think one way to measure the impact of research is to track how involved industry is in knowledge production,” says Rachel Parker, assistant dean of research.

For instance, the school has undertaken an ambitious corporate education program aimed at executives managing large, complex projects. Australia is one of eight countries investing in the joint strike fighter being developed by Lockheed Martin—which, at a cost of about $400 billion, could be “the largest project on Earth,” says Little. The procurement arm of the Australian government asked QUT to develop a program that would improve the management skills of federal project leaders and major defense contractors working on such projects. The government also is funding a longitudinal study that will determine if such programs actually result in significant savings and better project management. The program is based on a similar one QUT has run for the Shell Oil Company with three other international universities.

As a third element in their new research strategy, QUT is closely monitoring how much faculty contribute to public policy, either through research programs or the university’s research centers. Recently, the school’s Australian Centre for Philanthropy and Nonprofit Studies helped create a model for financial reporting for nonprofit organizations—a model that has been adopted throughout Australia and is being considered by two Canadian governments.

The school expects industry-funded research to meet the same high standards—and be published in the same top journals—as research funded by national competitive grants. That’s in large part because of the nature of the projects themselves. “For instance, our Australian Centre for Entrepreneurship Research is conducting the largest Australian study ever undertaken on entrepreneurial startups, which attracted funding from major banks and financial services companies,” says Little. “We know the results of that will make it into high-quality journals.”

But QUT administrators have decided that publication in respected journals is only one way to make a mark; another is to disseminate research results through the popular press. Thus, they’re encouraging faculty to use the school’s corporate communications facilities to publicize their work in consumer media outlets as well. One piece of recent research that received worldwide coverage was a behavioral economics study examining human behavior under extreme conditions, as demonstrated by which people ran for lifeboats on the Titanic and Lusitania and which ones stayed behind.

“Not only do such news articles help us share important ideas, they’re very good for the reputation of the school,” says Little.

The Next Step

Whether or not an impact-of-research component becomes part of new accreditation standards, representatives of these schools feel they benefited from participating in the study. For one thing, the process helped all the schools clarify their research missions—and even see themselves in a different light.

Says CSUN’s Hennessey, “I come from the marketing area, so I like the term ‘positioning.’ This exercise let our school explore its position from a lot of different perspectives.”

Mannheim benefited by participating in the exploratory exercise, Simons believes, but the school also brought value to the study by supplying a viewpoint that was non-American—and non-native-English-speaking.

“Different institutional settings employ different terminology and entertain different ideas,” Simons says. “We thought that, if we participated in the pilot study, we could help explore how research works in different institutional settings.”

As vice chair of the accreditation quality committee, which writes the standards, DiAngelo of Saint Joseph’s believes an impact-of-research study could be very helpful for a school’s accreditation team trying to show peer reviewers how well the school is meeting its own mission.

“It’s an excellent way to document that your mission is actually a live document, which can be the toughest part for a team,” he says. “So many missions are so generic that you can take out one school’s name and insert another one. But an exercise like this helps the team see that you live the document.”

Faculty research is a central component of management education, generating new knowledge that changes business strategy in the real world and provides case studies in the classroom. Management educators have always known that, but they haven’t always been able to prove it. AACSB’s new study, say those who participated, might provide a way to quantify the impact of research—and show everyone, including the researchers themselves, how much it matters.