Doctoral student Jorge Mejia, assistant professor Shawn Mankad, and associate professor Anandasivam Gopal have analyzed 130,000 postings on the popular user review site Yelp to discover whether a computer-assisted text analysis of these reviews could predict a restaurant’s closure more accurately than user ratings alone.
The researchers focused on reviews of 2,600 restaurants that had operated in the Washington, D.C., metropolitan area, 454 of which closed between 2005 and 2014. In the reviews, they identified linguistic patterns indicating the type of restaurant and the quality of the user’s experience, then matched restaurants of similar type to make review comparisons more meaningful.
For instance, restaurants whose reviews included words like “good,” “friend,” “great,” “nice” and “neighborhood” had a high rate of survival. The researchers also sorted words and phrases into five categories: overall quality, speed, responsiveness, food quality, and atmosphere. Reviews loaded with “high-weighted” words and phrases—such as “wonderful ambience” and “attentive wait staff”—accurately predicted restaurant quality.
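The paper’s title points to latent semantic analysis as the underlying technique. A minimal sketch of that idea, assuming a common implementation (TF-IDF weighting followed by truncated SVD) and a handful of invented toy reviews rather than the study’s actual data:

```python
# Latent semantic analysis (LSA) sketch: a TF-IDF term-document matrix
# reduced by truncated SVD. The reviews below are invented examples,
# not data from the study.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

reviews = [
    "great food and a friendly neighborhood vibe",
    "wonderful ambience and attentive wait staff",
    "slow service, waited an hour for cold food",
    "nice spot, good drinks, great with friends",
]

# Weight each word by how distinctive it is across the reviews.
tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(reviews)

# Project each review onto two latent semantic dimensions, so that
# reviews using related vocabulary land near each other.
svd = TruncatedSVD(n_components=2, random_state=0)
topics = svd.fit_transform(X)
print(topics.shape)  # (4, 2): one 2-D semantic vector per review
```

In a setup like this, the latent dimensions, rather than individual star ratings, become the features from which an outcome such as closure could be modeled.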
The study also found that high-dollar restaurants were more likely to close than less expensive ones, and that speed of service played a larger role in restaurant closures than atmosphere or food quality.
“Using our approach, a restaurant could monitor its performance in the dimensions that matter the most to customers,” the researchers write. Says Mankad, “We are surrounded by all of this free, unstructured data. We should be using that data.”
“More Than Just Words: Using Latent Semantic Analysis in Online Reviews to Explain Restaurant Closures” is a working paper based on Mejia’s dissertation.