It isn’t manly, but I had a good giggle last week when I stumbled across an unused line from an early draft of my 1-STAR REVIEW: ACCEPTANCE SPEECH:
STRUGGLING CRITIC: “Hard for Me to Classify…Sorry!”
BKHEWETT: “No apology necessary. Amazon pre-classifies its inventory, so you really shouldn’t strain yourself.”
When I first wrote this, I worried my gut response was too harsh and softened it up a bit before posting. No need to pile on, I thought. Plenty of things to mock here without getting personal.
Turns out, I needn’t have worried.
Apparently, there is an entire ecosystem centered on fake reviews: companies that pay for them, companies (or individuals) that write them, and companies that hunt those fake reviews down and discredit them. This isn’t unique to Amazon. Yelp and Walmart are two other entities that cope with this problem daily. (I won’t go into the boring details, but PBS and Time will.)
In addition to the human-written fake reviews, some entities create software programs to do the dirty work for them, scanning Amazon’s free content and leaving poorly written comments on select items in order to build an internet presence.
Beyond being irritating and setting a bad example for our impressionable youth, bot reviews (and other fake reviews) cause real harm. Consumers use reviews to make purchasing decisions, and fake reviews lead to misinformed decision-making. Retailers (and authors) also develop new products based on feedback from reviews, so fraudulent feedback can lead companies (and authors) to invest resources unwisely in products that consumers don’t really want.
For fun and profit, here are some things to consider before using a review to make a purchasing decision or to develop content:
- On Amazon, does the review have a “verified purchase” tag? If it doesn’t, it might just mean the reviewer wasn’t signed in when they wrote it, but the absence of this tag can also indicate duplicity. For fun, do an Amazon store search for “Uranium Ore” and scan through some of the reviews. There’s a short one about four turtles and a rat that I found particularly interesting.
- Does the review sound human? Are the fancy words used correctly? If not, there’s a good chance it was written by a bot. (Illegal drug use only accounts for a small percentage of poorly written reviews.)
- Do grammar and punctuation meet an acceptable standard? I generally accept fourth grade as the standard of excellence for reviews, and all of my fourth graders (2004-2006) could punctuate better than my one-star reviewer. If a review doesn’t meet that standard, it might be a bot.
- Is the review really short, or really vague? Brevity and ambiguity are a bot’s best camouflage, especially combined with the previous indicators. A bot review will often use language that could apply to any number of books or products: “I’m so glad I got this.”
- Did your mom write it? If so, trust it. (Unless it exhibits the previously mentioned characteristics. . .)
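For the nerds in the audience, the checklist above can even be sketched as a tiny scoring function. To be clear, this is just my own toy illustration, not anything Amazon actually runs; the vague-phrase list and the ten-word cutoff are made-up assumptions:

```python
# A toy "is this review a bot?" screen based on the checklist above.
# The phrase list and thresholds are invented for illustration only.
VAGUE_PHRASES = ("so glad i got this", "great product", "works as expected")

def suspicion_score(review_text: str, verified_purchase: bool) -> int:
    """Count red flags; the more flags, the more bot-like the review."""
    score = 0
    if not verified_purchase:
        score += 1  # no "verified purchase" tag
    if len(review_text.split()) < 10:
        score += 1  # brevity is a bot's best camouflage
    if any(phrase in review_text.lower() for phrase in VAGUE_PHRASES):
        score += 1  # generic wording that fits any product
    return score

print(suspicion_score("I'm so glad I got this.", verified_purchase=False))  # prints 3
```

Run it on my one-star reviewer’s handiwork and watch the flags pile up. (Mom reviews are exempt, of course.)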
So that laughably poor one-star review probably wasn’t legit, and I went to all that trouble to tease it. . . . On a lighter note, my 1-star (and 2-star) reviews are down 100% for RINGS, probably because I never offered it for free. I guess bots can’t afford the pricier fantasy and science-fiction titles.
But that brings me to my next point: What makes a good review?
To be continued. . .