Elisabeth Bik is getting mad. She has spent the better part of a decade finding examples of scientific fraud, and it seems to be easy pickings.
Although this was eight years ago, I distinctly recall how angry it made me. This was cheating, pure and simple. By editing an image to produce a desired result, a scientist can manufacture proof for a favored hypothesis, or create a signal out of noise. Scientists must rely on and build on one another’s work. Cheating is a transgression against everything that science should be. If scientific papers contain errors or — much worse — fraudulent data and fabricated imagery, other researchers are likely to waste time and grant money chasing theories based on made-up results….
But were those duplicated images just an isolated case? With little clue about how big this would get, I began searching for suspicious figures in biomedical journals…. By day I went to my job in a lab at Stanford University, but I was soon spending every evening and most weekends looking for suspicious images. In 2016, I published an analysis of 20,621 peer-reviewed papers, discovering problematic images in no fewer than one in 25. Half of these appeared to have been manipulated deliberately — rotated, flipped, stretched or otherwise photoshopped. With a sense of unease about how much bad science might be in journals, I quit my full-time job in 2019 so that I could devote myself to finding and reporting more cases of scientific fraud.
Using my pattern-matching eyes and lots of caffeine, I have analyzed more than 100,000 papers since 2014 and found apparent image duplication in 4,800 and similar evidence of error, cheating or other ethical problems in an additional 1,700. I’ve reported 2,500 of these to their journals’ editors and — after learning the hard way that journals often do not respond to these cases — posted many of those papers along with 3,500 more to PubPeer, a website where scientific literature is discussed in public….
Unfortunately, many scientific journals and academic institutions are slow to respond to evidence of image manipulation — if they take action at all. So far, my work has resulted in 956 corrections and 923 retractions, but a majority of the papers I have reported to the journals remain unaddressed.
I’ve seen some of the fraud reports, and it amazes me how stupid the scientists committing these fakes must be. It’s as if they don’t know that JPEG compression artifacts exist and can be an obvious fingerprint when chunks of an image are duplicated; they don’t realize that you can reveal the cheating just by tweaking a LUT (lookup table) and watching the duplicated edges light up. The only reason to do it is to adjust your data so it looks the way you expected it to look, which is a direct offense against the most basic scientific principle: you’re supposed to use science to avoid fooling yourself, not to make it easier to fool others.
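To make that concrete, here is a minimal sketch of the two tricks mentioned above — stretching the dark end of the intensity range (effectively a steep LUT) so compression texture becomes visible, and a crude scan for exactly repeated blocks. This is not Bik's actual workflow, just an illustration; the filename "figure.png" and the block size are assumptions, and real forensic tools use far more sophisticated matching.

```python
# Minimal sketch, not a production forensics tool.
# Assumes a figure saved as "figure.png" (hypothetical filename).
# Requires: pip install pillow numpy
import numpy as np
from PIL import Image

img = np.asarray(Image.open("figure.png").convert("L"), dtype=np.float32)

# 1. LUT trick: multiply-and-clip is equivalent to a steep linear lookup table.
#    Near-black backgrounds that were copy-pasted show identical compression
#    "texture" once stretched, which the eye picks up immediately.
lut_stretched = np.clip(img * 8.0, 0, 255).astype(np.uint8)
Image.fromarray(lut_stretched).save("figure_stretched.png")

# 2. Crude duplicate-block scan: record each 16x16 block and flag any exact repeat.
#    Exact matches are enough to catch naive copy-paste within one image.
B = 16
h, w = img.shape
seen, duplicates = {}, []
for y in range(0, h - B + 1, B):
    for x in range(0, w - B + 1, B):
        blk = img[y:y + B, x:x + B]
        if blk.std() < 2:          # skip flat background blocks
            continue
        key = blk.tobytes()
        if key in seen:
            duplicates.append(((x, y), seen[key]))
        else:
            seen[key] = (x, y)

print(f"{len(duplicates)} exact 16x16 duplicate blocks found")
```

Even something this naive catches the laziest edits; the point is how low the bar is for getting caught.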
This behavior ought to be harshly punished. If image fakery became an issue when one of my peers came up for tenure or promotion, I’d reject them without hesitation. It’s not even a question: this behavior is a deep violation of scientific and ethical principles, and it makes all of their work untrustworthy.
This is also a problem with the for-profit journal publication system. Those scientists paid money for those pages; how can the journals possibly enforce honesty? The bad actors wouldn’t pay them for articles anymore!
But guess what happens when Elisabeth Bik takes a principled stand?
Most of my fellow detectives remain anonymous, operating under pseudonyms such as Smut Clyde or Cheshire. Criticizing other scientists’ work is often not well received, and concerns about negative career consequences can prevent scientists from speaking out. Image problems I have reported under my full name have resulted in hateful messages, angry videos on social media sites and two lawsuit threats….
Things could be about to get even worse. Artificial intelligence might help detect duplicated data in research, but it can also be used to generate fake data. It is easy nowadays to produce fabricated photos or videos of events that never happened, and A.I.-generated images might have already started to poison the scientific literature. As A.I. technology develops, it will become significantly harder to distinguish fake from real.
Science needs to get serious about research fraud.
How about instantly firing people who do this? Our tenure contracts generally have a moral turpitude clause, you know. This counts.