The paper trail of plagiarism turned up in a university faculty member’s Ph.D. dissertation, then in a job application and, eventually, in a proposal for taxpayer-funded research sent to the National Science Foundation.
Yet like several other researchers caught misappropriating material in government-funded foundation projects or proposals, the faculty member managed to avoid the embarrassment of public exposure.
The National Science Foundation’s (NSF) office of inspector general, which closed the case last year, is withholding the researcher’s name from public scrutiny, citing privacy interests.
In another recent case, a researcher plagiarized in at least three funding proposals and, once caught, claimed as an excuse the fact that “his non-native command of English made paraphrasing difficult,” case records show. His name, too, remains private.
Unlike the Office of Research Integrity (ORI), another federal agency that investigates scientific misconduct, the NSF inspector general withholds the identities of many of the researchers it catches engaging in misconduct, according to records obtained through the Freedom of Information Act.
The policy reflects a stark but little-known divide within the scientific community and the federal government over whether, and when, to name names in the wake of misconduct investigations.
“My position is that these are public agencies and public funding is involved, so there should be disclosure,” said Mark S. Frankel, director of the Scientific Freedom, Responsibility and Law Program at the American Association for the Advancement of Science.
He said researchers found to have committed misconduct could do so again if the misdeeds go unnoticed by future employers. In their new jobs, these same researchers someday could be placed in positions of trust, such as supervising the work of graduate students, he said.
But others say the disparity between the two agencies’ policies might be a reflection of the sorts of cases they typically investigate. The NSF inspector general often uncovers plagiarism, while many of the ORI integrity cases involve fabrication or falsification of data, analysts say.
“It may have to do with the types of findings,” said Debra Parrish, a lawyer who has published scholarly articles on research misconduct. “Most NSF findings are premised on plagiarism. … [P]erhaps plagiarism, although not desirable, is not as worthy of public hanging.”
The NSF inspector general declined to comment on its disclosure policy, but noted that identifying the researchers in some cases would constitute an unwarranted invasion of personal privacy, according to written responses to open records requests by The Washington Times. The inspector general’s office provided hundreds of pages of records to The Times, though the names of researchers were redacted in most cases.
There are exceptions to the policy, however.
The inspector general does identify researchers who engage in misconduct if they are successfully prosecuted criminally or civilly, or if they are currently barred from receiving federal contracts, and even then only by releasing the names in response to a Freedom of Information Act request. Other federal inspector general offices follow similar guidelines.
But some serious misconduct investigations don’t result in debarments or court actions, records show. In those cases, the identities remain a secret.
The overall cost of research misconduct is unclear, but both ORI and NSF have noted recently that there’s far more than money at stake.
In a recent report to Congress, the NSF inspector general cited “a significant rise in the number of substantive allegations of misconduct associated with NSF proposals and awards.”
“Research misconduct damages the scientific enterprise, is a misuse of public funds and undermines the trust of citizens in science and government,” the NSF report stated.
John Dahlberg, director of ORI’s division of investigative oversight, wrote in a recent newsletter, “It would be impossible to estimate how many laboratories attempt to reproduce falsified and fabricated results and how much such efforts cost scientists in time and resources.”
Case memos of misconduct investigations conducted by the NSF inspector general’s office show a host of excuses from researchers caught engaging in misconduct.
One researcher who plagiarized on a funding proposal cited, among other excuses, "technical problems with my personal laptop." He also said he hadn't paid enough attention, adding that "I myself had the flu" and "my son has severe allergies," documents show. He later resigned from his unnamed university.
An assistant professor found to have plagiarized on three proposals for NSF funds expressed “shock” at first that his proposals had come under scrutiny, records show. Later, he claimed he had copied only “essentially some definitions … or some facts.”
In 2008, the inspector general made a misconduct finding against another unnamed university faculty member accused of taking information written by one of his graduate students and submitting it in an NSF proposal, in which he was listed as the sole principal investigator, records show.
When accused of stealing his student’s work, the faculty member told officials that he and the graduate student had “entered into an oral agreement that the application would be submitted to NSF with myself as the principal investigator” — an account the graduate student denied, records show.
The university faculty member resigned and the school terminated the roughly $400,000 NSF award.
Last summer, the inspector general closed its investigation of yet another unnamed university faculty member found to have plagiarized text in an NSF proposal for funding. The investigation concluded that the researcher “exhibited a pattern of plagiarism” that was “intentional, knowing or reckless,” records show.
“The subject’s plagiarism is a serious misrepresentation to NSF of his understanding and expertise, and creates a distorted competitive status for potential receipt of NSF funds,” investigators wrote in a case report.
Officials also noted that they were not swayed by the researcher’s excuse that he did not know he had to put other people’s words in quotation marks.
The inspector general ultimately recommended a punishment that included a letter of reprimand, a mandated ethics training course and a requirement that his employer submit assurances to the inspector general’s office for three years.
The universities where these and other researchers worked knew of and investigated the misconduct, and the researchers often later resigned, records show.
Yet while sanctions sometimes called for the notification of employers, future employers appear to have little way of discovering the misdeeds once that notification requirement expires.
Unless the punishment included debarment from federal funding, a successful criminal prosecution or a civil action, the researchers manage to avoid a public outing.
Scientists caught committing misconduct by ORI, an arm of the Department of Health and Human Services, aren’t so lucky. The ORI regularly publishes on its Web site the names of researchers it nabs in misconduct cases. The site currently shows 40 names.
“We made the decision to name names, and I think it was the right one,” said Alan Price, former associate director for investigative oversight at the ORI.
Last year, for instance, the ORI published findings against former University of Alabama at Birmingham scientists Juan R. Contreras and Judith M. Thomas. They were sanctioned after federal and university investigators uncovered falsified data in millions of dollars of animal research studies funded by the National Institutes of Health.
In another case last year, the ORI sanctioned Luk Van Parijs, a former associate professor at the Massachusetts Institute of Technology. In 2005, the university terminated Mr. Van Parijs after an investigation found he fabricated and falsified data.
Still, ORI officials say their public listing of misconduct findings is limited to the current year and the two previous years, since administrative sanctions are typically imposed for three years.
Sheldon Krimsky, an ethics specialist at Tufts University, said disclosure isn’t always appropriate. He said it’s appropriate when the scientific misconduct involves falsifying or fabricating data, which he called “a real public hazard.” But he said plagiarism often isn’t as damaging.
“I think the most serious forms of misconduct should be treated in the harshest way and open to the public,” he said, citing fabrication and falsification of data as examples.
But in cases of less-serious, one-time misconduct, he said: “Their lives should not be destroyed because they failed to give correct attribution for a source.”