How online reviews can be distorted by the time and place in which they are written.
The rise of online retailing has been accompanied by a growing wealth of consumption information in the form of peer-to-peer reviews. A 2008 survey found that online shoppers rely heavily on user reviews (65% indicated they used such sources ‘always’ or ‘most of the time’; Freedman, 2008), and it is not difficult to see their appeal. User-generated reviews seem to promise impartiality and credibility, particularly on aggregate sites such as Amazon. However, these sources may not reflect quality as well as consumers expect. For instance, a recent study comparing Consumer Reports scores (unbiased expert product testing) with Amazon user reviews found only a very weak relationship between the two, suggesting that peer-to-peer content may carry little information about objective product quality (De Langhe, Fernbach, & Lichtenstein, 2015). This discrepancy was partially attributed to insufficient sample sizes, the influence of price, and brand-image biases. And in the October issue of the Journal of Consumer Psychology, researchers have revealed yet another way in which user-generated reviews can be distorted by irrelevant factors (Huang, Burtch, Hong, & Polman, 2016).
The team explored how the time and location at which a restaurant evaluation is posted on TripAdvisor, a popular travel website, can affect the positivity of the review. In particular, they looked at how far away from the restaurant the consumers were at the time of writing, and how long it had been since they dined there. The researchers were motivated by findings that when people think of events in broad, abstract terms (versus concrete, detail-oriented terms), they tend to evaluate the experience more positively. Psychological distance, which can be increased by time (i.e. how far in the future or past an event is) and by space (how far away an event, person, or object is physically), can induce this broader mindset. The team therefore predicted that when reviewers posted their restaurant review from further away, or longer after the experience, their review would be more positive.
To test this, the researchers gathered 166,215 restaurant reviews from TripAdvisor, coded how far away each reviewer was and how long it had been since they dined, and included a host of control variables (measures to ensure that any effects are not driven by something such as the time of year, or the device the user submitted the review with). Consistent with their predictions, the authors found that not only did spatial and temporal distance each positively affect reviews, but there was an additional combined effect: when the reviewer posted both long after dining and from a distant location, the review was more positive still. The theoretical reasoning was further supported by follow-up analyses: reviews written from distant locations or long after the experience tended, on average, to use more abstract language (i.e. more adjectives, fewer verbs).
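The analysis described above amounts to regressing review positivity on spatial distance, temporal distance, and their interaction (alongside controls). The paper's actual specification and data are not reproduced here; the following is a minimal sketch on synthetic data, with made-up coefficients chosen only to illustrate how an interaction term captures the "combined effect" of the two distances.

```python
import numpy as np

# Illustrative only: simulate reviews whose positivity rises with spatial
# distance, temporal distance, and an extra boost when both are high.
# All variable names and effect sizes are assumptions, not from the paper.
rng = np.random.default_rng(0)
n = 5000
spatial = rng.uniform(0, 1, n)    # standardized distance from the restaurant
temporal = rng.uniform(0, 1, n)   # standardized time since dining
noise = rng.normal(0, 0.1, n)

rating = 3.0 + 0.4 * spatial + 0.3 * temporal + 0.2 * spatial * temporal + noise

# Ordinary least squares with an interaction term: the design matrix holds
# an intercept, both main effects, and their product.
X = np.column_stack([np.ones(n), spatial, temporal, spatial * temporal])
coef, *_ = np.linalg.lstsq(X, rating, rcond=None)
print(coef)  # recovers values close to [3.0, 0.4, 0.3, 0.2]
```

A positive coefficient on the product term is what "combined effect" means statistically: the boost from temporal distance is larger when spatial distance is also large. The real study additionally includes the control variables mentioned above, which would enter as extra columns of `X`.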
Findings such as these highlight that while user-generated reviews offer many benefits, they are not immune to basic psychological influences. The practical implications are twofold. Firstly, consumers seeking advice from such sources should be aware that reviews can be skewed by subtle psychological factors, and should weigh reviews written long after the original consumption experience accordingly. Secondly, organizations interested in cultivating favourable user-generated reviews may be advised to delay sending review-request emails, directly increasing temporal distance and likely increasing the odds that the user is far from the original location. Other measures might include systematically targeting tourists and other distant customers for reviews, thereby ensuring greater psychological distance (tourists not only live further away, but may not have a chance to post their review until they have returned from vacation).
De Langhe, B., Fernbach, P. M., & Lichtenstein, D. R. (2015). Navigating by the stars: Investigating the actual and perceived validity of online user ratings. Journal of Consumer Research, ucv047.
Freedman, L. (2008). Merchant and customer perspectives on customer reviews and user-generated content. The E-tailing Group. Accessed November 26, 2012: http://www.e-tailing.com/content/wp-content/uploads/2008/12/2008_WhitePaper_0204_4FINAL-powerreviews.pdf
Huang, N., Burtch, G., Hong, Y., & Polman, E. (2016). Effects of multiple psychological distances on construal and consumer evaluation: A field study of online reviews. Journal of Consumer Psychology.