
Alzheimer’s research is plagued by misconduct and fraud, undermining progress toward understanding and treating the disease, say investigators who work to expose errors and misleading practices.

The book is yet to be closed, for instance, on whether a 2006 paper by Sylvain Lesne positing amyloid as a major cause of Alzheimer’s was based on fraudulent data. Suspicions about the paper were first raised in late 2021 by Matthew Schrag, MD, PhD, an assistant professor of neurology at Vanderbilt University Medical Center, Nashville, Tennessee.

Dr. Schrag also queried the work of a City University of New York (CUNY) researcher who proposed PTI-125 (now simufilam) as a potential anti-amyloid therapy for Alzheimer’s disease. Even though CUNY recently found “egregious” and potentially deliberate misconduct by that researcher, Cassava Sciences is continuing phase 3 trials of simufilam.

Now questions are being raised about work from the lab of Berislav V. Zlokovic, PhD, a prominent neuroscientist at the University of Southern California (USC), Los Angeles, California, and also about studies conducted under the aegis of Domenico Pratico, MD, the director of the Alzheimer’s Center at Temple University in Philadelphia, Pennsylvania.

Alzheimer’s has been a notoriously hard puzzle to solve. Despite decades of research, there are still no effective therapies, and theories about potential causes remain disparate.

Dr. Schrag said he wouldn’t “attribute all the ills in this field” to misconduct, but it “is absolutely a part of the equation.” Some of the papers flagged for integrity issues “have been hugely influential,” he said in an interview. “Some of the labs that we’re talking about have really shaped how we’ve thought about this disease,” he said. “It’s hard to un-ring the bell.”

The fallout from fraud has a wide impact. Taxpayer dollars are wasted both in producing the fraudulent work and in failed attempts to replicate it. Grad students — the workhorses of labs — waste time trying to repeat studies or may be bullied or intimidated into misconduct, said Elisabeth Bik, PhD, a former Stanford microbiologist who is now a full-time fraud investigator.

And there’s potential harm to patients. “There’s a lot of false hope being given to these people and their families,” Dr. Bik said in an interview.
 

Alzheimer’s Tempts With Big Rewards

There are big rewards for those who publish important papers on Alzheimer’s: more grants, publication in higher-impact journals, larger labs, and, potentially, personal enrichment from commercialization of therapies.

“I can see that people are driven to cut corners or even to make up results, or even anything in between, to reach that goal,” said Dr. Bik.

It’s unclear whether misconduct and fraud are on the rise or just being detected more frequently.

“It’s very hard to say,” said Mike Rossner, PhD, president of Image Data Integrity. Institutions hire Dr. Rossner to help ferret out research integrity issues. He told this news organization that it’s likely detection is on the rise, given the increasing number of sleuths like Dr. Bik.

In 2002, Dr. Rossner began to screen all images submitted to the Journal of Cell Biology in response to the new phenomenon of digital images and the advent of Photoshop. “Very early on, we started to see problems in digital images that we would not have seen on a glossy printout,” Dr. Rossner said of his time as managing editor of the journal.

From 2002 to 2014, at least 25% of papers contained an image that violated the journal’s guidelines, which prohibited, for example, removing spots or other blemishes (so-called “beautification”) in Photoshop; such violations did not necessarily indicate fraud. The journal withdrew acceptance of 1% of papers because of image manipulations that affected the interpretation of the data.

Dr. Bik noted that even if the percentage of fraudulent papers in Alzheimer’s research is no higher than elsewhere, “it would still be in absolute numbers a lot of papers that could be fraudulent,” given that the field is well funded, with federal agencies alone providing $3.7 billion a year.

 

 

Images Key to Spotting Issues

Investigative sleuths often use the online forum PubPeer to initially raise questions about papers. The format gives the original authors a chance to comment on or defend their work. While some critiques are about data, most home in on alleged duplications or manipulations of images, primarily Western blots.

The images are key, because “the images are the data,” said Dr. Rossner. “The words are the author’s interpretation of what they see in the images,” he said.

It’s also easier to spot a problem in an image. The raw data or an investigator’s notebooks aren’t needed, and artificial intelligence-driven software programs such as Proofig and ImageTwin help investigators spot duplicated images or cases in which an image may have been flipped or otherwise manipulated to make results look better.

Science recently announced that it would be using Proofig to screen images in all papers submitted to its six journals.

Using a screening tool is better than nothing, said Dr. Rossner, who still relies on visual inspection, employing contrast and other features in Photoshop to spot inconsistencies or duplications. But “none of those companies have disclosed how effective they are relative to visual screening, and that to me is very problematic,” he said.

“The tools are not going to catch everything,” said Dr. Bik.

Dr. Schrag agreed. “One of the things that we’re worried about is that a lot of the journals will simply adopt these tools as a screener and assume that that’s going to de-risk their publication portfolio,” he said, noting the high rate of misses.

Artificial Intelligence a Growing Concern

Artificial intelligence (AI) may also increase the amount of fraud and make it harder to ferret out, the investigators said.

“I’m very worried about AI,” said Dr. Bik. Although AI-generated images and content may be rudimentary today, “next year it’s going to be much better,” she said. Going forward, it may be hard to distinguish between a real dataset and one that has been generated by AI, she said.

“The more closely AI can mimic authentic content, the more difficult it will be for publications to detect intentionally fraudulent submissions,” wrote Dror Kolodkin-Gal, PhD, the founder of Proofig, in an article for the Council of Science Editors.

Dr. Kolodkin-Gal said that AI may be especially prone to misuse by paper mills. Those operations submit fake or shoddy manuscripts to journals on behalf of researchers seeking publication, who pay the mills a fee. The Committee on Publication Ethics reported in 2022 that 2%-46% of papers submitted to journals may be from paper mills.

While it’s unclear whether AI is having any impact now, Dr. Rossner said, “I think I can be pretty confident in saying it is going to be a growing problem” as the tools become more sophisticated.

He sees parallels with the rise of Photoshop and cites data from the US Department of Health and Human Services’ Office of Research Integrity (ORI) showing that in 1990, when Photoshop was still new, 2% of cases referred to the ORI involved image manipulation. By 2007, 70% did.

 

 

Journals, Institutions Need to Step Up More

Fraud may continue apace in part because investigations drag on for years and, in many cases, bring no consequences for the perpetrators, the investigators said. And, they say, journals and institutions haven’t devoted enough resources to preventing or investigating misconduct.

“A lot of editors did not even want to investigate because they just didn’t want to believe that there could be fraud in science,” said Dr. Bik of her experiences. “I hope that by now most journals at least should have realized that some proportion of the manuscripts that get sent to their journals is going to be fraud,” she said.

“The bulk of the journals seem like they don’t want to be bothered by this,” agreed Dr. Schrag, adding that “some have gone to great lengths to try to discourage people from bringing forward complaints.”

A big issue is that journals “don’t answer to any higher authority,” said Dr. Schrag. He believes that journals that repeatedly refuse to address integrity issues should be barred from publishing research produced with funds from the National Institutes of Health.

All the investigators said institutions and journals should hire forensic investigators. Relying on unpaid peer reviewers or editors to root out fraud is unrealistic, they said.

“You want to have specialized people with experience and be paid to do that as a full-time job,” said Dr. Bik, who is funded by speaking engagements and receives about $2300 a month through donations to her Patreon account.

Once a potential integrity issue is flagged, there is “an incredible conflict of interest in how these investigations are run,” said Dr. Schrag. “Institutions are asked to investigate their own faculty; they’re asked to investigate themselves.” That “creates the disincentive to move expeditiously,” said Dr. Schrag.

With the passage of time, people who have committed fraud can throw out notebooks, delete data from servers, or even Photoshop original photos so they match the manipulated ones that were submitted, Dr. Bik said.

Institutions could show they are serious about fraud by offering a “central, systematic universal screening of all image data going out of their institutions before submission to a journal,” said Dr. Rossner. But he knows only of a handful that do so. “I think research integrity offices have historically been very reactive, and they need to pivot and become proactive,” said Dr. Rossner.

Dr. Schrag wants to see stronger values within the research enterprise. “You have to build a culture where it’s absolutely anathema at a core level to violate these standards of research integrity,” he said. “We have this notion that we can push the process along faster and get to a grant and get to a paper and get to some short-term goal,” he said. “But the long-term goal in most of these cases is to cure a disease or to understand some biological mysteries. There’s no shortcut to getting there,” said Dr. Schrag.

There have been some high-profile consequences for research integrity failures, such as the 2023 resignation of Stanford University President Marc Tessier-Lavigne in the wake of findings that members of his lab — but not Tessier-Lavigne — engaged in data manipulation.

The process is often opaque, with investigations done in secrecy. “Consequences are not usually revealed, either,” said Dr. Rossner.

Dr. Schrag acknowledges it’s a tough balancing act for institutions to root out bad actors while also ensuring there’s no harm to those who may simply have operated in error.

“But it doesn’t serve anyone’s interest, including the people who are accused, in dragging these things out for 5, 6, 8, or 10 years,” he said.

 

 

Lesne and Cassava: The Long and Winding Road

The investigations into the Lesne papers and the work underpinning Cassava Sciences’ therapy point to the difficulty of policing integrity and the potential fallout.

Lesne’s signature paper, published in Nature in 2006, has been cited some 2300 times and is the fourth most-accessed article of 81,612 articles of a similar age across all journals tracked by Altmetric.

Dr. Schrag, Dr. Bik, and others wrote to multiple journals asking them to investigate some 25 papers related to simufilam, including a 2012 Journal of Clinical Investigation article by Hoau-Yan Wang, PhD, the CUNY scientist whose work on simufilam has been questioned.

JCI Editor Elizabeth McNally pushed back, stating in a 2022 editorial that the whistleblowers had potential conflicts of interest and could be assisting short sellers seeking to profit by depressing Cassava’s stock price. Indeed, Dr. Schrag had initially been hired by a law firm representing short sellers. Dr. McNally said that JCI would start requiring disclosures by whistleblowers.

Dr. Bik urged CUNY to investigate Dr. Wang in 2021 but was rebuffed. Then, in November 2023, a copy of CUNY’s final report on the Wang inquiry was leaked to Science. The university reported that Dr. Wang did not provide any original data or notebooks and that it found “long-standing and egregious misconduct in data management and record keeping by Dr. Wang,” wrote Dr. Bik in a blog post summarizing the investigation.

As of late 2023, 42 papers by Dr. Wang have earned PubPeer posts, seven have been retracted, and five have been marked with an Expression of Concern, wrote Dr. Bik.

Some have called for Cassava to stop its phase 3 studies of simufilam, but the company is proceeding, announcing in November 2023 that it had completed enrollment.

Misconduct Queries Underway at USC and Temple

Meanwhile, Dr. Schrag and Dr. Bik continue sleuthing. They are among a small group of whistleblowers who have filed a complaint with NIH about irregularities in the Zlokovic lab at USC. They allege that images were manipulated in dozens of papers, including some that inform the development of a stroke drug in phase 2 trials.

The inquiry goes well beyond stroke, said Dr. Schrag. Dr. Zlokovic “is one of the most influential scientists on Alzheimer’s in the country,” he said. The USC scientist is a leader in blood-brain barrier research.

USC is investigating “at some level,” he said. In a statement to this news organization, USC said that it “takes any allegations about research integrity very seriously.” The statement added, “Consistent with federal regulations and USC policies, this review must be kept confidential. As a result, we are unable to provide any further information.”

Mu Yang, PhD, assistant professor of neurobiology at Columbia University Medical Center in New York City, is also working on the Zlokovic investigation.

She calls herself an “accidental sleuth” who fell into the hobby after a graduate student asked her to help replicate a study of mice with an Alzheimer’s-like phenotype in the Morris water maze test by Domenico Pratico, MD, a researcher at Temple University in Philadelphia, Pennsylvania. Dr. Yang, who runs the “behavior core” at Columbia — teaching and advising on how to run assays and collect and report data — could see right away that the Pratico data were “too perfect.”

She enlisted maze inventor Richard Morris to join her in a letter of concern to the journals that published Dr. Pratico’s work, all under the aegis of Springer Nature.

The publisher’s integrity team has since retracted four Pratico papers: three because of image abnormalities pointed out by Dr. Bik, who worked with Dr. Yang, and one because of “self-plagiarism.”

“The official retraction notes didn’t mention anything about data abnormality being a concern,” said Dr. Yang, who says that questionable data are harder to prove than an image duplication or manipulation. And the papers remain available, although dozens of Pratico papers have been flagged on PubPeer.

To Dr. Yang, images are the canary in the coal mine. “People don’t just fake Western blots but then give real behavior data, or give you fake behavior data but give you the most authentic Western blots,” she said.

Dr. Pratico has now sued a graduate student who was a coauthor on the papers, according to the Philadelphia Inquirer.

The ORI has requested that Temple University conduct an investigation, Dr. Yang said.

In a statement to this news organization, Temple said it “does not comment on internal investigations or personnel issues,” but that “allegations of research misconduct are reviewed and investigated centrally through Temple’s Office of the Vice President for Research in accordance with university policy and applicable federal regulations.”

A version of this article appeared on Medscape.com.
