21 Peer Reviewers Who Failed So Hard They Almost Won

People in glass houses shouldn't write mean peer review comments. (Via Shit My Reviewers Say.)

1. This reviewer who is pretty verbose themselves.

2. This reviewer who might have skipped a few maths classes.

“There is a lot of terminology flung around such as ‘false negatives’, ‘false positive’ and ‘median, first quartile, third quartile’.”

3. These reviewers who are shameless in their self promotion.

“Cite newer, relevant references, especially those published by X 2012, and X 2008. Best wishes, Dr. X, Associate Editor.”


Reviewer: "Please cite these two papers (by me)". That's it. That was the whole review. @YourPaperSucks

5. This reviewer who let their beliefs interfere with their work.

6. This reviewer who just does not have time for that.

7. This reviewer who can count.

“Presented paper has 13 pages and 26 adequate references. The paper seems to be very interesting.”

8. This reviewer who definitely knows better because they said so.

“I understand Wikipedia is not the best source of information, however…based on the information from Wikipedia, your hypothesis breaks down.”

9. This reviewer who has "two" many misprints of their own.

10. And these other reviewers who shouldn't be throwing stones from their glass houses.


“The orgnization and writing of the paper need to improve. There are some grammar errors need to correct.”


“This study, an original work and repetition of earlier feature of these studies is similar. Writing language is quite a lot of errors.”

13. This reviewer who thinks science is done already and we should all pack up and leave.

14. This reviewer who should probably conduct a review of different types of review, tbh.

“What is a ‘systematic’ review? I have never heard of an unsystematic review.”

15. Reviewers 1 and 2, who will never agree with each other.

16. (Unless one of them misreads something.)

17. This reviewer whose grasp of English isn't as great as they think.

18. This reviewer who needs to check the definition of "average".

"The reported mean of 7.7 is misleading because it appears that close to half of your participants are scoring below that mean" #impossible!

19. This reviewer who came so close to a good putdown, then missed by several orders of magnitude.

20. This reviewer who clearly knows something we don't.

21. And this reviewer who is a little too honest for their own good.
