July 29, 2014

Journals Find Fakery in Many Images Submitted to Support Research

Kristin Roovers was a postdoctoral fellow at the University of Pennsylvania with a bright career ahead of her—a trusted member of a research laboratory at the medical school studying the role of cell growth in diabetes.

But when an editor of The Journal of Clinical Investigation did a spot-check of one of her images for an article in 2005, Ms. Roovers's research proved a little too perfect.

The image had dark bands on it, supposedly showing different proteins in different conditions. "As we looked at it, we realized the person had cut and pasted the exact same bands" over and over again, says Ushma S. Neill, the journal's executive editor. In some cases a copied part of the image had been flipped or reversed to make it look like a new finding. "The closer we took a look, the more we were convinced that the data had been fabricated or manipulated in order to support the conclusions."
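That kind of reuse is detectable precisely because cutting and pasting leaves identical pixels behind. As a rough illustration, and not the journal's actual procedure, the short Python sketch below flags blocks of a grayscale figure that reappear elsewhere, either as-is or mirrored; the file name, block size, and thresholds are arbitrary stand-ins.

```python
# A minimal sketch (not the journal's tool) for flagging exact duplicate
# regions in a gel image, such as a band copied and pasted, or pasted
# after being flipped. Assumes an 8-bit grayscale image; the block size
# and stride are illustration values only.
import numpy as np
from PIL import Image

def find_duplicate_blocks(path, block=16, stride=8):
    img = np.asarray(Image.open(path).convert("L"))
    seen = {}      # block bytes -> first (row, col) where it appeared
    matches = []   # (first_location, second_location, variant)
    h, w = img.shape
    for r in range(0, h - block + 1, stride):
        for c in range(0, w - block + 1, stride):
            patch = img[r:r + block, c:c + block]
            if patch.std() < 2:   # skip flat background regions
                continue
            # Check the patch itself plus horizontal/vertical flips,
            # since copied bands are sometimes mirrored to look new.
            for name, variant in (("same", patch),
                                  ("h-flip", patch[:, ::-1]),
                                  ("v-flip", patch[::-1, :])):
                key = variant.tobytes()
                if key in seen and seen[key] != (r, c):
                    matches.append((seen[key], (r, c), name))
            seen.setdefault(patch.tobytes(), (r, c))
    return matches

# Any reported pair is a prompt for a human to look closer,
# not proof of misconduct:
# for loc1, loc2, how in find_duplicate_blocks("figure1.png"):
#     print(f"block at {loc1} reappears at {loc2} ({how})")
```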

As computer programs make images easier than ever to manipulate, editors at a growing number of scientific publications are turning into image detectives, examining figures to test their authenticity.

And the level of tampering they find is alarming. "The magnitude of the fraud is phenomenal," says Hany Farid, a computer-science professor at Dartmouth College who has been working with journal editors to help them detect image manipulation. Doctored images are troubling because they can mislead scientists and even derail a search for the causes and cures of disease.

Ten to 20 of the articles accepted by The Journal of Clinical Investigation each year show some evidence of tampering, and about five to 10 of those papers warrant a thorough investigation, says Ms. Neill. (The journal publishes about 300 to 350 articles per year.)

In the case of Ms. Roovers, editors notified the federal Office of Research Integrity, which polices government-financed science projects. The office concluded that the images had been improperly manipulated, as had images the researcher had produced for papers published in three other journals. That finding led two of those journals to retract papers that Ms. Roovers had co-authored, papers that had been cited by other researchers dozens of times.

The episode damaged a career—Ms. Roovers resigned from the lab and is ineligible for U.S. government grants for five years—and delayed progress in an important line of scientific inquiry.

Experts say that many young researchers may not even realize that tampering with their images is inappropriate. After all, people now commonly alter digital snapshots to take red out of eyes, so why not clean up a protein image in Photoshop to make it clearer?

"This is one of the dirty little secrets—that everybody massages the data like this," says Mr. Farid. Yet changing some pixels for the sake of "clarity" can actually change an image's scientific meaning.

The Office of Research Integrity says that 44 percent of its cases in 2005-6 involved accusations of image fraud, compared with about 6 percent a decade earlier.

New tools, such as software developed by Mr. Farid, are helping journal editors detect manipulated images. But some researchers are concerned about this level of scrutiny, arguing that it could lead to false accusations and unnecessarily delay research.

Easy to Alter

The alterations made by Ms. Roovers at the University of Pennsylvania were "very easy" to do, says Richard K. Assoian, a professor of pharmacology at Penn who worked with the young researcher and served as her mentor while she was a doctoral student at the University of Miami. "It's basic Photoshopping," he says.

Ms. Roovers admitted that she used the software, though she says she was not the only one in the lab to do so.

"I certainly did something wrong, but I don't think I was alone in the whole thing," she says, adding that it was not her intent to deceive. "It was trying to present it even better."

Morris J. Birnbaum, director of the laboratory and a professor of diabetes and metabolic disease at Penn, says he never thought to look for such tampering, partly because he was trained back when images from scientific instruments were captured on film rather than digitally. "It's pretty hard to doctor film," he says.

Now Mr. Birnbaum checks every image in the lab carefully. "It doesn't take that long, it's just a question of doing it," he says.

He says it was a difficult way to learn a lesson.

Ms. Roovers might have gotten away with the image enhancements if the paper had been accepted by a different journal.

"Only a few journals are doing full image screening," says Mike Rossner, executive director of Rockefeller University Press. Mr. Rossner became a leading crusader for such checks after he accidentally stumbled upon manipulated images in an article submitted to The Journal of Cell Biology six years ago, when he was the publication's managing editor.

He worked with researchers to develop guidelines for the journal outlining proper treatment of images, and several other journals have since adopted them. Some enhancements are actually allowed—such as adjusting the contrast of an entire figure to make it clearer. But adjusting one part of an image is not permitted, because that changes the meaning of the data.
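To make that distinction concrete, here is a toy Python illustration of the line the guidelines draw, assuming a hypothetical grayscale blot image; neither operation comes from the journal's own tooling, and the coordinates are made up.

```python
# A toy illustration of the guideline's distinction, assuming a
# grayscale figure saved as "blot.png" (the file name is invented).
from PIL import Image, ImageEnhance

fig = Image.open("blot.png").convert("L")

# Generally allowed: a contrast change applied uniformly to the entire
# figure, which preserves the relative intensities of the bands.
whole_figure = ImageEnhance.Contrast(fig).enhance(1.3)

# Not allowed: brightening only one region. Relative intensities between
# bands carry the scientific meaning, so a local edit changes the data.
region = fig.crop((40, 10, 80, 50))                    # one band, say
brightened = region.point(lambda v: min(255, v + 40))  # local-only edit
tampered = fig.copy()
tampered.paste(brightened, (40, 10))
```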

He says all papers accepted by The Journal of Cell Biology now go through an image check by production editors that adds about 30 minutes to the process. If anything seems amiss, the authors are asked to send an original copy of the data—without any enhancements.

So far the journal's editors have identified 250 papers with questionable figures. Out of those, 25 were rejected because the editors determined the alterations affected the data's interpretation.

The level of manipulation detected by Mr. Rossner helped persuade several other journal editors to confront the issue.

At Nature Publishing Group, which produces some of the world's leading science journals, image guidelines were developed in 2006, and last year the company's research journals began checking two randomly selected papers in each issue for image tampering, says Linda J. Miller, U.S. executive editor of Nature and the Nature Publishing Group's research journals.

So far no article has been rejected as a result of the checking, she says.

Ms. Miller and other editors say that in most cases of image tampering, scientists intend to beautify their figures rather than lie about their findings. In one case, an author notified the journal that a scientist working in his lab had gone too far in trying to make figures look clean. The journal determined that the conclusions were sound, but "they wound up having to print a huge correction, and this was quite embarrassing for the authors," she says.

Ms. Miller wrote an editorial for Nature stressing that scientists should present their images without alterations, rather than thinking polished images will help them get published. Many images are of gels, which are used to detect proteins or other molecules in a sample, and they are often blurry.

No matter, says Ms. Miller. "We like dirt—not all gels run perfectly," she says. "Beautification is not necessary. If your data is solid, it shines through."

Automated Detection

Mr. Farid, of Dartmouth, has developed software tools that can automatically check for image tampering. The software looks for patterns in the digital code underlying an image. When files are opened and altered in Photoshop, for instance, codes are added that Mr. Farid's software can detect. Likewise, when scientists copy and paste parts of images in any software program, their actions leave a digital mark.

"No matter how good you are at it, there's always going to be some trace left behind," he says.

Some journal editors have hired Mr. Farid to analyze difficult cases. "In the last year, I've worked on four," he says, though he would not say which journals he worked with.

Cadmus Professional Communications, which provides publishing services to several scientific journals, has also developed software to automatically check the integrity of scientific images.

The Journal of Biological Chemistry, which uses Cadmus for its printing, sends 20 to 30 papers a year through this system, at a charge of $30 per paper, says Nancy J. Rodnan, director of publications at the American Society for Biochemistry and Molecular Biology, which publishes the journal. She says the journal cannot afford to send every paper through (without passing the cost on to authors), so its editors send only those that they suspect, usually because some figures look like they have gel patterns that have been reused. Last year about six of the checked papers led to more serious investigations, and a couple of those were eventually found to have been altered inappropriately.

Some editors and researchers worry, however, that automated tools might not be as accurate as human inspectors and that the software could flag false positives.

Mr. Birnbaum, of the University of Pennsylvania, says a friend of his was wrongly accused of image tampering, though he would not say who. "He was pretty upset about it," says Mr. Birnbaum, even though, he says, the researcher was able to prove the image was not fraudulent. "My guess is it's going to happen more and more."

Reporting Suspected Fraud

As more journals search for image manipulation, they also need to develop clear procedures for how to report suspicious cases, says John Krueger, an investigator at the U.S. research-integrity office. For instance, simply contacting the authors of a suspect paper, rather than someone else at the authors' universities, could leave the door open for authors to submit the paper to another, less careful journal instead.

"That's happened," admits Mr. Rossner, of Rockefeller University Press. "We had a paper where we revoked the acceptance, and it was subsequently published in another journal with the same problems that we detected." The authors and the journal were overseas, he says, so "we had a difficult time trying to find out how to go about reporting the case."

"It was definitely frustrating," he adds. "My obligation as a journal editor is to protect the published record in any way we can."

Mr. Krueger, the federal investigator, says some journals are reluctant to investigate suspicious images in cases that involve prominent research.

"They're sort of in a conflict with themselves for wanting to accept the hot stuff," he says. "Sometimes they accept less-than-ideal material."

One new check on science images, though, is the blogosphere. As more papers are published in open-access journals, an informal group of watchdogs has emerged online.

"There's a lot of folks who in their idle moments just take a good look at some figures randomly," says John E. Dahlberg, director of the division of investigative oversight at the Office of Research Integrity. "We get allegations almost weekly involving people picking up problems with figures in grant applications or papers."

Such online watchdogs were among those who first identified problems with images and other data in a cloning paper published in Science by Woo Suk Hwang, a South Korean researcher. The research was eventually found to be fraudulent, and the journal retracted the paper.

Katrina L. Kelner, deputy editor of life sciences at Science, argues that the level of fabrication in the Hwang paper was so pervasive that it would not have been detected even if the journal had used the latest image-checking tools.

Since that instance, however, the journal has started spot-checking images in every paper before publication.

Ms. Roovers, the former researcher who tampered with images, has landed a new job as a postdoctoral researcher at the Ottawa Health Research Institute. The Canadian institution is outside the jurisdiction of the U.S. Office of Research Integrity that ruled against her.

She is still publishing—she is listed as a co-author on a new paper published in April by PLoS ONE. But she will never adjust images again, she says: "It's not worth it."
