The Dark Side of Alternative Metrics

With new metrics come new abuses, from scholars trying to game the system to editors setting up predatory journals. But the industry is working on answers.  

Once the academic community began to assign value to nontraditional metrics such as the number of times a paper is viewed, downloaded, and shared, it was inevitable that people would try to game the system. To push up their counts, some scholars are doing everything from repeatedly downloading their own papers to paying others to download them. Weeding this activity out of the system is “a cat-and-mouse game,” says Gregg Gordon of the Social Science Research Network.

But sharing platforms like SSRN are getting better at detecting fraudulent behavior. Gordon describes an event about 14 years ago that shaped how SSRN addresses such behavior on its platform today. One day, he recalls, one of his colleagues posted a co-authored paper to the network in an important but relatively small area of law. When it drew 1,600 downloads within 24 hours, Gordon knew something wasn’t right. It turned out that his colleague’s co-author had promised students in his class free doughnuts if they drove up the download count. That, in turn, led one enterprising student to post the request on a hacker site. And then the hackers got to work.

Rather than block the attack, Gordon and SSRN’s IT staff decided to learn how to protect their site by watching the best hackers in the world try to compromise it. “We sat with a pizza and a bottle of red wine watching those hackers do every bad thing they could think of to our site, and we watched them talk to each other on their website in real time,” says Gordon. “Once their interest started to wane, we shut off access. That night, we got about US$50,000 worth of cybersecurity consulting for free.”

Since then, SSRN has learned even more about detecting fraudulent patterns of behavior. Some people, says Gordon, have set up systems to download their own papers hundreds, and sometimes thousands, of times in succession. For that reason, the system sends up an alert when it detects multiple downloads of a paper from the same account, IP address, or university server. Of the approximately 25 million to 35 million downloads SSRN tracks each year, fewer than half can be verified.
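
SSRN has not disclosed its actual detection rules, but the pattern Gordon describes lends itself to a simple illustration. The following is a minimal Python sketch of that kind of heuristic; the event format, field names, and threshold are assumptions invented for this example, not SSRN’s real system.

```python
from collections import Counter

# Illustrative assumption: each download event is a (paper_id, source) pair,
# where source is an account ID, IP address, or university server.
DAILY_THRESHOLD = 50  # hypothetical cutoff; SSRN's real rules are not public


def flag_suspicious(events):
    """Flag (paper, source) pairs whose one-day download count looks implausible."""
    counts = Counter(events)
    return {pair: n for pair, n in counts.items() if n >= DAILY_THRESHOLD}


# Example: one account downloading the same paper 200 times in a day is flagged,
# while a handful of downloads from an ordinary IP address is not.
events = [("ssrn-12345", "account-777")] * 200 + [("ssrn-12345", "1.2.3.4")] * 3
print(flag_suspicious(events))  # {('ssrn-12345', 'account-777'): 200}
```

A production system would layer many such signals rather than rely on a single count, which is why the flagged activity itself becomes valuable data.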

Those fraudulent downloads create “one heck of a dataset,” says Gordon. “We can see interesting patterns, which alert us if there’s something we need to look at more closely.”

Another negative consequence of the rise in digital academic publishing is a concomitant increase in the number of so-called predatory journals, which often have little or no peer review and charge authors steep fees to publish. Many schools have long relied on the infamous Beall’s List of predatory journals, started in 2008 by Jeffrey Beall, who until recently was a librarian at the University of Colorado Denver. Beall wanted to help academics avoid falling victim to predatory open-access publishers. But he shut down the list in 2017, possibly due to a lawsuit from a publishing group protesting its inclusion on the list, leaving a void in the market. Its content has since been reposted online, but it is no longer updated.

In June 2017, journal directory Cabell’s International introduced a resource intended to serve a purpose similar to that of Beall’s List. As of this writing, Cabell’s Blacklist includes 8,726 journals, across all fields, that have been found to use unscrupulous practices. The Blacklist is the counterpart to Cabell’s Whitelist of preferred journals. Both are subscription-based services.

With the advent of online digital publishing, the cost of starting journals has dropped dramatically, says Lacey Earle, the company’s vice president of business development. “As traditional print publishing evolves even further into the digital realm and as advances are made in digital storage and distribution technologies, the costs of starting a journal are no longer greater than the potential profits,” Earle says. “Just like anyone can set up an online store that doesn’t actually fulfill orders or exists solely to lift credit card information, anyone can set up a journal. The only difference is that there are already many systems in place to prevent consumer fraud.” That’s not the case for predatory publishers.

Cabell’s developed its Blacklist evaluation process over two years with input from journal publishers, editors, and researchers. Today, a team of journal quality auditors, many of whom have years of experience as journal editors, evaluates journals from all disciplines. The auditors compare each journal’s practices against 68 behavioral indicators to determine whether its behavior warrants addition to the Blacklist.

Publications are added to the list when they fail in one or more areas of scholarly integrity. “Each behavior that we look for is weighted according to its severity,” Earle explains. “Behaviors that are directly indicative of deception, such as falsely claiming a journal has an Impact Factor, are weighted more heavily than behaviors that tend to correlate with deception.” Journals whose practices do not rise to the level of the Whitelist or fall to the level of the Blacklist remain unlisted altogether.
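
Cabell’s has not published its scoring formula, but Earle’s description of severity weighting can be illustrated with a short sketch. The indicator names, weights, and threshold below are invented for illustration and do not reflect Cabell’s actual 68 indicators.

```python
# Hypothetical weights: behaviors directly indicative of deception are weighted
# more heavily than behaviors that merely correlate with deception.
INDICATOR_WEIGHTS = {
    "falsely_claims_impact_factor": 10,  # direct deception, per Earle's example
    "hidden_or_surprise_author_fees": 7,
    "no_discernible_peer_review": 7,
    "unverifiable_editorial_board": 4,   # correlates with deception
    "name_mimics_established_journal": 4,
}
BLACKLIST_THRESHOLD = 10  # hypothetical cutoff for addition to the list


def score_journal(observed_indicators):
    """Sum the weights of the behavioral indicators observed for a journal."""
    return sum(INDICATOR_WEIGHTS.get(i, 0) for i in observed_indicators)


# Example: a single direct-deception behavior is enough to cross the threshold,
# while one weakly correlated behavior alone is not.
print(score_journal(["falsely_claims_impact_factor"]) >= BLACKLIST_THRESHOLD)   # True
print(score_journal(["unverifiable_editorial_board"]) >= BLACKLIST_THRESHOLD)  # False
```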

While the Blacklist is one resource, Earle encourages schools to use a variety of tools to determine their preferred publication outlets. These include sources such as Impact Factors, Altmetric, and Cabell’s own Classification Index and Difficulty of Acceptance ranking.

But even as online platforms work to weed cheaters out of the system, a school’s primary defense against unscrupulous practices is to educate its faculty, says Wilfred Mijnhardt of the Rotterdam School of Management. Business schools can design workshops, webpages, and other training materials that make faculty aware of preferred journals for publication and that train them early on to spot red flags: journals that solicit submissions from faculty, charge authors fees to publish, or have names deceptively similar to those of more established publications.

In addition, faculty should be kept apprised of new but promising journals, which RSM calls “aspirant primary journals.” These might lack a long track record, or they might be spin-offs of established and highly regarded publications. Two such examples are the Strategic Entrepreneurship Journal and the Global Strategy Journal, both satellites of the Strategic Management Journal. These young publications have adopted the same policies as their parent journal, so both are worthy of faculty’s attention, says Mijnhardt.

“Faculty should view these journals like investment funds for their careers—if they invest in the right ones, it will yield greater reputation and recognition down the road,” he adds. “They can use a portfolio of journals for their work as a way to spread the risk.”

Most important, says Mijnhardt, business schools must train professors not only to maintain their integrity as they promote their research, but also to give predatory journals a wide berth. “Predatory journals are only a danger to faculty if a school doesn’t have a clear publishing and integrity strategy or offer professional research support to help faculty find the best possible journals for their scholarship. If a school has such a strategy in place, its people will not be easy targets.”

This article originally appeared in BizEd's September/October 2018 issue. Please send questions, comments, or letters to the editor to bized.editors@aacsb.edu.

Related Reading: Read more about the six forces compelling business academics to embrace social impact in their scholarship in “Can B-Schools Rethink Research?”