As an indie book publicist, I've worked with every book review service I could find. GetMyBookReviewed.com is my answer: helping indie authors navigate the review industry.

This week I ran across what looked like an obviously AI-written book review on IndieReader. To confirm my suspicion, I ran it through an AI detector, and I was right. That led me to dig further.

So on the IndieReader.com main review page HERE, I took the 12 reviews and ran them through GPTZero.me. It's a great AI detector and I use it often. Any review that comes in for one of my authors gets checked for AI and plagiarism.

So out of the 12 reviews on the main page, 4 were AI generated: three from a single reviewer and one from another. Counting the review that first caught my eye, that's 5 AI reviews out of 13, or roughly 40% of the new reviews. That isn't a good average.

What we did:

Thirteen reviews, five AIs.

Off Course On Purpose GPTZero result

This isn't a one-off bad reviewer: three different reviewers (if they're real people) all turned in AI reviews, and IndieReader didn't check. It just published them as is. That's a two-fold problem: lazy reviewers, and a review site that's just as lazy.

And worse, four of the five AI reviews are tagged IR Approved. Just what does IR Approved mean at this point?

The Problem:

Indie authors are paying for real professional reviews; they're spending hard-earned money to get their books reviewed, and IndieReader is taking shortcuts. Authors use these reviews for marketing, Amazon editorial reviews, agent and publisher queries, and quotes on their covers, websites, and marketing materials. If a review is fake and easy to detect, it has zero editorial credibility and reflects badly on the author.

The trust issue:

When you go to a site like IndieReader, you trust them. You trust that they are gatekeeping: real reviewers, real reviews. That the reviewers will actually read your book, consider its qualities, good and bad, and provide a real, honest, human review. Not just upload it to ChatGPT and hope for the best. At no point in the purchase process (and I've purchased more than one review there for clients) does IndieReader disclose that reviewers (or the site itself) might use AI, so potential author clients can't make that choice before buying a review.

This also means IndieReader has no real gatekeeping: potentially no human editing, and definitely no AI or plagiarism detection. Authors have no way to verify what they're paying for until they receive the final review and run it through an AI detector themselves.

How To Check Your Own Reviews

I use GPTZero (https://gptzero.me). It does both AI detection and plagiarism checks. There's a free tier if you just want to run one or two quick checks; for professionals, it's worth the subscription.

Copy your new review, paste it into the box, and get an AI probability score. It isn't 100% accurate, but it's pretty damn close. I've tested it with unknown text, text I know is human-written (my own), and text I know is AI. It's really damn accurate.

What the scores mean:

  • 0-30%: Likely human-written
  • 30-50%: Mixed, worth scrutiny
  • 50%+: Significant AI presence
  • 85%+: Probably AI-generated
  • 100%: Definitely AI
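If you're batch-checking a stack of reviews, the rubric above is easy to script. This is just the article's buckets translated to code; the function name and the exact behavior at the boundary values are my own choices, not part of any detector's API:

```python
def classify_ai_score(pct: float) -> str:
    """Map an AI-probability score (0-100) to the rubric above."""
    if pct >= 100:
        return "Definitely AI"
    if pct >= 85:
        return "Probably AI-generated"
    if pct >= 50:
        return "Significant AI presence"
    if pct > 30:
        return "Mixed, worth scrutiny"
    return "Likely human-written"
```

Anything that lands in the top two buckets is worth sending back to the review service.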

What to look for manually:

I spotted AI usage in OFF COURSE ON PURPOSE because I saw "narrative" used more than once. Much like the easily spotted em dash, words like "narrative" aren't used all that often by humans; we say "story," "book," or "journey." "Narrative" is a formal term that rarely shows up in casual writing, and almost never twice in one book review. Other tells: generic phrases like "From the outset..." or "positions his narrative as...", no specific scene references from the middle or end of the book, and a templated recommendation paragraph. If the review could apply to any book in your genre, with no specific references to your story, characters, or setting, the book was probably skimmed quickly and uploaded to an AI.
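The manual tells above can be roughed out as a first-pass filter before you bother with a full detector. This is only an illustrative heuristic: the phrase list and thresholds below are my own, drawn from the tells in this post, and a real review could easily pass or fail it by accident.

```python
# Illustrative heuristic only: count the "tell" phrases from the post.
# Phrase list and thresholds are examples, not a real detector.
TELL_PHRASES = ["narrative", "from the outset", "positions his narrative"]

def count_tells(review: str) -> dict:
    """Count each tell phrase (case-insensitive) in a review."""
    text = review.lower()
    return {phrase: text.count(phrase) for phrase in TELL_PHRASES}

def looks_suspicious(review: str) -> bool:
    counts = count_tells(review)
    # "narrative" twice in one short review was the giveaway in OFF COURSE ON PURPOSE
    return counts["narrative"] >= 2 or counts["from the outset"] >= 1
```

A hit here doesn't prove anything; it just tells you which reviews to paste into GPTZero first.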

What to do if your review fails an AI check

First, contact the review service immediately. Document the review in the AI detector with a screenshot and send it to them. Request a rewrite or a refund, and check any rewrite for AI as well. Asking for a discount, or a full refund plus a replacement review, isn't too extreme. Ask about their AI policy (if they have one; most don't yet) and pull the AI review from everywhere you used it before Google or Amazon archives it. Then move to a service with a public anti-AI policy. It isn't common yet, but some services are starting to include one; City Book Review recently added a non-AI policy to its terms of service.

Why This Matters

I shouldn't need to point this out, but apparently I do. AI is being used as a shortcut by lazy humans, whether the reviewer or the review outlet, and human judgement matters in reviewing and evaluating books written by humans. (Coming soon: AI reviews of AI-written books.) Indie authors, trying to promote their books on limited funds, are the most vulnerable. Often they don't even realize (at least right now) that AI can be used this way, or how to check. That's what this post and this website are here to do: flag the obvious offenders using AI to cut corners, reduce costs, and add even more AI slop to a market already fighting it at every level.

Violations like this undermine trust in every player in this industry: not just paid reviews, but all reviews. If you use Pubby for Amazon reviews, do you check whether they're AI? What will Pubby do if you point it out? The paid and trade market for review services needs not only some form of industry standards but an enforcement mechanism stronger than public shaming (I see you, IndieReader) or a mention in Writer Beware.

Indie authors (and even some traditionally published authors), publicists, and marketing specialists need to realize: you're paying for the credibility of not only the reviewer but the platform. AI reviews have zero credibility. If someone discovers your editorial reviews on Amazon are AI-generated, that damages your author brand, even if you didn't know. You didn't know, but finding out is just a tab away, and every author should be checking. Hold your review service to this standard: any review they send you should come in under 50% AI-detected (at that level, you may just have a very formal review or a somewhat lazy reviewer, but not a fully AI review).

Why reviewers might use AI

OK, let's be fair: reviewing books isn't a high-paying job. Most services pay the reviewer with nothing more than a free copy of the book. Paid review services do pay their reviewers, but still not all that much. It's a publishing-credit-and-free-book business. Sad but true.

Overcommitting can be a problem. Reviewing for multiple services and dependent on the income, reviewers can be tempted to do a quick AI review, maybe rewrite it a bit, slap on 5 stars to keep the author happy and coming back, and hope for the best. But it's up to the review outlet to catch that. Every review, paid or not, should be run past an AI detector. That's the lowest bar an author should expect from a paid review: that it isn't AI, and that the reviewer read the whole book. A low bar, and for the money review sites charge, the actual minimum.

How many reviewers are living off paid reviews? I don't know, but it's a fair question for review sites. Are your power reviewers doing too many reviews in too short a period to actually read each book? Are they just doing the minimum for payment? Does each review site have an AI policy that every reviewer must understand and agree to? Do their reviewers understand the ethical issues and the risk each author takes in using their service?

Why services don't catch AI usage

At this point in the technology, I don't understand why everyone isn't aware of this issue, PARTICULARLY paid book review services. Detection tools exist. You can automate your review pipeline and catch obvious AI-generated reviews on submission, not after publication. Yes, I'm picking on IndieReader, but for such a major player in paid review services to be this negligent speaks very poorly of its systems.

Any review service you choose, from IndieReader to Kirkus, should have a public statement about AI reviews. If they don't, ask before you buy. Just because they have a reputation in the industry doesn't mean their reviewers won't be lazy, greedy, or disinterested. It isn't your job to catch them. It's their job to protect you.

Services that depend on paid reviews for the majority of their income are a risk no matter what. They have a financial incentive to hand out 4- and 5-star reviews as fast as they can. The 12 reviews I checked on IndieReader were posted over just 4 days. Extrapolating, that's 3 reviews a day, with 40% of them AI generated. Over a year, that's roughly 1,100 reviews (or $325,000), 438 of them AI. I only did a small sample, but from just 4 days of reviews, almost every other one was AI. That's a systemic problem at the publisher level.
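The back-of-envelope math above is easy to reproduce. Note that the per-review price is only implied by the $325,000 figure in this post; it is not a confirmed rate from any service:

```python
# Extrapolation from the 4-day sample described in the post.
sample_reviews, sample_days = 12, 4
per_day = sample_reviews / sample_days      # 3 reviews per day
per_year = round(per_day * 365)             # 1,095, call it ~1,100
ai_per_year = round(per_year * 0.40)        # 438 reviews at the 40% AI rate
implied_price = 325_000 / per_year          # ~$297 per review (implied, not quoted)
```

Even granting that four days is a small sample, those are the numbers a sustained 40% AI rate would produce.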

At a minimum, every review outlet should be able to assure authors and publishers that AI and plagiarism detection is in place for all reviews. Not some, not just for new reviewers: every review should be checked before publication. That's the minimum we should expect. Reviewers who fail either test should be banned from the outlet, immediately and without exception. If they get fired often enough, they'll effectively be banned from the industry.

The assumption that reviewers (or the review outlet) are honest and upright no longer holds. AI is too easy and too close at hand for everyone. As much as I hate the origin of the reference, Reagan was right: "Trust but verify." Review outlets focused on pumping out as many paid reviews as they can need to be looked at more carefully. Look at the reviews on a site before you buy. Are most of the reviewed books ones you've never heard of? Are they obviously self-published? Then that site has a financial incentive both to give you a vanity 4- or 5-star review so you'll come back and to turn a blind eye to the reviewers producing those reviews. Volume over quality. Volume over verification. Volume over accountability.

Too many book review services have none of those: no quality control, no verification, no accountability. They're optimizing for volume, not outcomes. As long as reviews ship and authors pay, there's no incentive to check, verify, or take responsibility when it goes wrong.

Self-respect in business means standing by your product. These services just cash the check and ignore the fallout.

What Good Review Services Do

They implement AI and plagiarism detection at the moment a reviewer submits a review. This also assumes the review service isn't using AI itself and skipping the human reviewer entirely. That might be obvious to everyone else, but financial incentives to cut costs and increase revenue cross all ethical lines.

New reviewers should have their initial reviews checked before they're ever onboarded, and every review for their first 90 days should be checked. Granted, I think every review should run through the same set of checks: new reviewer or old, every review, every time.

The quality of the reviews needs to matter more than the volume. Paid review services that depend on naive authors not realizing they're vanity operations built to separate them from their money need to be called out. They aren't businesses; they're predators.

The bar is low: no AI reviews, no plagiarism, no vanity reviews. If it's a bad book, call it out. Tell the author what they need to improve. Maybe they won't come back, but at least you did your job.

Bottom line (tl;dr): every author or publicist needs to check the reviews they get, paid or otherwise, and if one comes back higher than 50% AI, call them out. Tell the company directly that they screwed up, ask for your money back, and make it public. A human-written book deserves a human-written review. No exceptions.

Don't trust me? Check this article out in an AI detector. ZeroGPT called it 97% human written. I guess the 3% is just my impeccable grammar.

Evidence / Screenshots

Murder to Movies GPTZero result
Off Course On Purpose GPTZero result
Carnival Chaos GPTZero result
Chasing Nirvana GPTZero result
Alternate Sources GPTZero result
Mountain Home GPTZero result
IndieReader review sample page