
Planning to Hit The January Sales? Beware Of AI Product Reviews

Are we being tricked by overly positive and potentially misleading AI reviews when shopping online? Photo by Heidi Fin on Unsplash
  • Customers struggle to tell the difference between AI-written and genuine product reviews, according to new research.
  • AI reviews appear to be more positive, but have far less substance and lack the human touch (at the moment…).
  • AI also has a tendency to unhelpfully go haywire and make things up, spreading falsehoods about products.

In an era where AI is the talk of the town – as well as of boardrooms and investor meetings – and is increasingly permeating our lives (it’s already busy making customer service worse, with the trilling of useless chatbots), a new study raises interesting questions about whether consumers can distinguish between product reviews written by humans and those generated by AI.

New research conducted by emlyon business school and Toulouse School of Management suggests that we are becoming increasingly unable to distinguish between AI and human-generated content.  

While not exactly works of literature, customer reviews are something people rely on every day when purchasing products and services, particularly during busy shopping periods like Christmas, Boxing Day and the January sales. AI has the potential to throw out an unlimited number of fake or biased reviews.

And here’s the thing: reviews are only helpful when they are genuine. Honest, detailed feedback lets others know exactly what they’re getting into. When reviews are real and transparent, they lead to more thoughtful decisions and happy customers. If AI can spit out real-sounding reviews, we are heading for a world of even more disinformation.

AI vs Humanity 

Yingting Wen, Professor of Marketing at emlyon business school, and Sandra Laporte, Full Professor at Toulouse School of Management, ran an investigation into how generative AI tools, such as ChatGPT, compare with human marketers in crafting product reviews.  

The study was conducted through three separate experiments, focusing on sensory products, which rely heavily on descriptive and emotive language to influence purchasing decisions. 

They found that while AI-produced reviews tended to be overwhelmingly positive, they lacked the authenticity and nuanced tone characteristic of human writing. AI-generated reviews were also riddled with inaccuracies and outright fabrications.

To complement the findings from the text analysis tool, the researchers conducted a second study in which humans evaluated both AI- and human-created reviews. The results corroborated the earlier findings: human reviewers perceived AI-generated content as less genuine and less detailed, despite its often positive and persuasive tone. 

The third experiment shifted focus to social media, where the researchers analysed branded and unbranded posts created by human marketers and ChatGPT. Human raters assessed the sentiment, engagement potential, and overall effectiveness of these posts. The findings underscored AI’s capability to produce engaging and emotionally appealing content, but AI still fell short in crafting anything close to the rich and varied narratives that human marketers excel at. 

Even so, “Human raters struggle to notice the differences between human and AI reviews, rating half of the selected reviews from AI higher in embodied cognition and usefulness,” say the researchers, especially when the reviews were generated by ChatGPT 4 (rather than version 3.5). 

“As generative AI technologies advance, understanding their capability to emulate human-like experiences in marketing communication becomes crucial,” they add.

The Current Weaknesses of AI  

AI-generated content was far more prone to positivity than human writing.

“Generative AI tools like ChatGPT are increasingly being used to automate tasks such as crafting social media posts and responding to customer comments, which can boost engagement and purchase intent,” Professor Wen notes. However, she also emphasized that AI still lacks the nuanced understanding and authentic voice that human creators bring to the table. 

While generative AI can be a valuable tool for creating emotionally engaging and persuasive content at scale, human input remains crucial for ensuring authenticity, accuracy, and a deeper connection with consumers. 

But as AI tools become more sophisticated, their ability to mimic human creativity and communication will continue to improve, potentially narrowing the gap in quality and authenticity. 

As Professor Wen and Professor Laporte’s research demonstrates, AI has already proven its worth in churning out large quantities of quality-light content for social media posting. Its limitations, though – particularly in detail, authenticity and accuracy – highlight the need for human editing of whatever it produces.

Businesses can leverage AI for its strengths in scalability and persuasion while relying on human marketers for authenticity and strategic oversight. However, this raises ethical questions – shouldn’t reviews be written by humans, not machines?

While AI models are becoming increasingly adept at mimicking human-like narratives, the ethical questions this poses remain front and centre. AI doesn’t seem to be coming close to answering those, yet. Maybe it should stick to solving equations – or becoming bamboozled when I attempt to book a flight.

For consumers, the challenge of distinguishing between AI- and human-generated content highlights the need for greater transparency and accountability in digital communication. 

As technology evolves, the line between human and machine-crafted narratives will undoubtedly continue to blur. Right now, perhaps we should extend the dictum “I shouldn’t lie when reviewing a product” to include “I shouldn’t use AI to generate fake reviews”.

By Thomas Willis

