In our most recent "how can we help you?" thread, a reader asks:

I've received a referee report from a journal that looks very much like it was written using AI. (Five short paragraphs structured like an undergrad essay, vague and general, slightly contradictory—it says my arguments are compelling but there's room for rebuttal.) Should I say or do anything about it?

Good question. Irrespective of how this case turns out, I suspect this is going to be an increasing issue, and it's not clear to me what authors or journals can or should do.

What do readers think?


6 responses to “Reporting a potentially AI-written referee report?”

  1. Matthew

    At the very least, I’d feed the report through one of the better-regarded AI screeners (Originality.AI is a good one, or GPTZero, etc.). If it comes back as very likely AI, I’d probably screenshot that verdict and send it to the journal editor—not in a “reconsider my essay” way, but rather in a “you might want to be aware” sort of email.

  2. Young SLAC Prof

    Funny: I just submitted a report on a paper that I very strongly believe was written by AI.
    I’m with Marcus though: obviously this is bad, especially if AI is reading the paper AND ‘coming up with’ criticisms. However, I don’t see what one could do besides express concern to the editor.

  3. The Church Lady

    I think this is a very tricky issue. It is pretty serious to accuse someone of sending in a referee report generated by AI. So your evidence for such a claim should be VERY STRONG as well. That is just a warning. Marcus asks: what should authors do? What should journals do? … we also need to ask: what should referees do? That is the easy one – they should not use AI and just read the paper and write the report. I have done this for over 200 papers … it does not take that long.

  4. Michel

    Yes, you should say something (while indicating that you accept the verdict). AI is not our peer, and we should be pushing back against its use for refereeing, even with human oversight.

  5. everwhat

    You should never agree to review a paper if you are even slightly disposed to have AI write your report for you. And if you are lazy and dishonest enough to do this, you should resign from your position in academia.

  6. Slexter

    The line between AI-written and person-written is now so blurred that I’m not sure it even makes sense to have a problem with it.
    Sure, the report could have been written entirely by AI. But what if it was written entirely by a person, who then used something like Grammarly to help make it more accessible?
    What if, given the fact that the reviewer is likely overworked like all the rest of us, they wrote out the main points themselves and tasked AI with putting them into a coherent sentence or paragraph structure, thereby saving themselves time?
    Or what if it was written in collaboration with an AI program, such as ChatGPT’s Canvas?
    As far as I am concerned, even if it was written by AI, the reviewer only did something wrong if they just auto-generated a review—but it is at this point impossible to tell whether that is what happened. Overall, the only problem here could be that the reviewer did not disclose that they used AI to help write the review.
    I think the most likely explanation here is that the reviewer collaborated with AI to simplify their work. If that is the case, I have to admit, I struggle to understand the problem. When we teach, we (usually) don’t write the books ourselves. Many professors’ lectures are little more than rearticulations of what another philosopher said or wrote. Many professors use pre-written assignments taken from other professors, or assignments that have been bought from companies like Pearson. We offload work and borrow ideas all the time—using books written by others, giving unoriginal lectures that amount to explaining a book, or reading straight from the book. We very often don’t do our own grading either: we usually have a TA do the complicated grading and a program do the simple grading.
    Why would offloading some of the work of reviewing onto something else be problematic? Are we also going to get upset that they wrote it down using a computer, instead of memorizing it and reciting it to us in person?
