The central idea of my web publication "Statistics
of Scientific Fraud" -- a direct and objective estimation of the
rate of misconduct in modern (bio)science, or at least a demonstration of
the feasibility of finding such an estimate -- was recently stolen by the Office of
Research Integrity (ORI). In November 2000, ORI held a conference to announce
a grants program boosting research on exactly this same theme -- i.e. obtaining
estimates for exactly this same unknown. Predictably, without a single
reference to my work, despite my rather numerous (always unpleasant)
contacts with this organization concerning it.
In more detail: in 1995-2000 I had several contacts with ORI concerning
various technical details of my work. Finally, in March 2000 I received a
strange pair of letters (one on paper and one by e-mail), both signed by Alan
R. Price, which apparently had to be interpreted as meaning that ORI has no
further interest in my work.
Yet soon afterwards, in November 2000, ORI announced a grants program
with a primary objective PRECISELY coinciding
with the theme of my work, while obviously pretending that my work does
not exist or that they know nothing about it.
It is necessary to define more exactly what I mean by the PRECISE coincidence
between my work and the theme of the ORI conference. It is simple. The
report of the ORI conference written by Dr. Steneck describes two sharply
divided "schools" of estimating the rate of scientific misconduct.
The first school presents ridiculously low estimates obtained by dividing the
cases of scientific misconduct confirmed by the federal government by the total
number of researchers. Such estimates may be taken seriously only if one assumes
that every scientist committing fraud is caught by a competent organisation
with 100% probability. Obviously, this is not true, and
this probability is a perfectly unknown value. Therefore, actually,
these low estimates are just a meaningless rhetorical
trick of expressing one unknown through another, equally unknown, value.
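To make this objection concrete, here is a minimal sketch (my own illustration, not part of the original argument) of how the first school's figure scales with the unknown detection probability: the naive estimate is cases divided by researchers, and if only a fraction p of fraudsters is ever caught, the implied true rate is that figure divided by p. The numbers 200 federal cases and 2 million researchers come from the Science article quoted below; the p values are purely illustrative.

```python
# The "first school's" estimate: confirmed cases / total researchers.
# This silently assumes every fraudster is detected (p = 1).
confirmed_cases = 200
researchers = 2_000_000

observed_rate = confirmed_cases / researchers  # 1 per 10,000 researchers

# If only a fraction p of fraudsters is detected, the implied true rate
# is observed_rate / p. The values of p below are illustrative guesses.
for p in (1.0, 0.1, 0.01, 0.001):
    implied_true_rate = observed_rate / p
    print(f"detection probability {p:6.3f} -> implied fraud rate {implied_true_rate:.3%}")
```

Since p is perfectly unknown, any value between roughly 0.01% and 10% (or more) is compatible with the same 200 confirmed cases, which is exactly why the low estimate carries no information by itself.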
The second "school", yielding estimates of 10% or more, is based mostly
on work performed on special sets of data where it really seems possible
to identify nearly 100% of fraudsters. I like best the approach (by
Prof. Bloomberg of the University of Virginia) of estimating the amount of
students' work copied from Web resources, and the work on checking citation
errors (described in the background report, p. 8).
Unfortunately, students are not researchers, and distorting citations
is too trifling a sin, usually insufficient for talking about fraud. So
these approaches are listed in the ORI report among "other research practices"
of only indirect relevance to estimating research misconduct.
So, both "schools" have serious trouble with the estimates they propose.
And ORI announces the objective of obtaining some BETTER estimates. Alas,
that is exactly what I have proposed in my work. I have already found the
estimates which ORI is now looking for -- that is, estimates lacking
the drawbacks described above.
I denote this behaviour as plagiarism, but the word does not fit well.
It is a rather unusual sort of plagiarism; I am not sure whether it may be
called plagiarism at all. At least, I don't think I can do anything more
than just express my displeasure at such behaviour by
complaining about plagiarism.
I hope this case looks more promising from another point of view. ORI
now collects public money for solving a problem which has already been solved
by me. Of course, I do not have any sort of monopoly on this field of research.
I can well believe that ORI may wish to have different estimates, more compatible
with its bureaucratic interests. It may also wish these estimates to be
obtained by different people and to be based on different methods. All
this does not constitute anything more than just "political censorship".
Perhaps it is unethical, but it is certainly not punishable.
Yet what I mean is that concealing information about
my work is a serious financial infringement. It is like collecting money
for digging the Panama Canal while concealing the fact that it was already built
by somebody else. It's FRAUD!
So, expecting response rates around 10%, I think I should now write
a dozen whistleblowing letters to a dozen inspectors in HHS and above...
Suggestions about more appropriate directions would be greatly appreciated.
Below is my posting to the SCIFRAUD mailing list of 29 December 2000 devoted
to this trouble. It also quotes the Science article overviewing the ORI
conference. More information about the ORI conference may be found at their
website -- some items are interesting.
PS I have received a rather meaningless response from N. Steneck, the author
of the background report for the ORI conference. He said that he disregarded my
work because it was not published on paper. This does not change anything
in my accusations.
> Subject: Ungracious Pastors
> Date: Fri, 29 Dec 2000 17:36:02 +0300
> From: Dmitriy Yuryev <firstname.lastname@example.org>
> To: SCIFRAUD
> Apparently, I should respond to the posting with the Science article
> on the ORI conference ("The size of the problem" of 19 Dec.).
> There is, certainly, a suspicious coincidence between the theme of the
> Science article and my work "Statistics of Sci. Fraud"
> (see www.orc.ru/~yur77/statfr.htm).
> Of course, it's a great honour for me that the problem for which
> I derived some estimates 3 years ago has now attracted
> such close attention from the US academic bureaucracy. Yet, predictably,
> I am slightly disappointed by the absence of any reference to my
> pioneering contribution to this field.
> It becomes apparent from the content of the Science article that my work
> still provides the only available direct statistical estimation of the
> frequency of instances of scientific misconduct. It is also obvious that
> ORI pretends that my work either does not exist or that they know nothing
> about it. It is not true: I wrote letters concerning this work both to
> N. Steneck and to C. Pascal (no responses). A strange response came from
> A. Price, director of the division of research investigations, after the
> "ORI fraud laundry" thread in this mailing list (March 2000). He sent
> (almost?) simultaneously two letters -- an e-mail demonstrating some
> interest in my work, which I even interpreted as a hesitant preliminary
> invitation to send an abstract to the conference; and a letter on paper
> which definitely said that ORI has no further interest in my work. Both
> letters were shortly quoted in my postings under the same "ORI fraud
> laundry" subject.
> I am not going to talk now about the grief or surprise that I am
> experiencing from the fact that my ideas were stolen again by an
> organization created to fight plagiarism. Actually, I am not surprised
> at all. By the way, I remember that 400 years ago fair Ophelia described
> a similar case of some ungracious pastor who recks not his own rede.
> I am more interested in finding out how this behaviour of ORI should
> be qualified and, perhaps, what it actually means. I would greatly
> appreciate comments as, understandably, I am not too great an expert
> in English usage nor in definitions of fraud.
> In my view, in this case I could not do anything more than just mumble
> about some abstract "dishonesty", but for the fact that ORI is now
> collecting public money for solving this problem. That transfers the
> whole story from the field of scientific ethics to the field of financial
> infringements. Again, it's not an area where I am competent, yet I believe
> that intentional misrepresentation or withholding of information about
> the status of the problem to which the requested money would be applied
> should be a very serious sin. It's like collecting money for digging the
> Panama Canal while concealing the information that it is already built
> (or half-built, or 1%-built). Please correct me if I am wrong, because
> I can't believe my eyes -- it's FRAUD!!!???
> I am 90% sure that the disregard of my work by ORI is explained by "racial"
> considerations. Yet it seems more interesting to note two other
> conflicts between my work and misconduct research as it was presented at
> the ORI conference. This may also be helpful for understanding what sort
> of research and what sort of conclusions ORI requires.
> 1. Apparently, the range of acceptable estimates of the fabrication rate
> for N. Steneck is somewhere between 0.001% and 1%. Therefore, my estimate
> of "at least 5-10%" seems to be illegal, falling too far beyond that
> range. What an unjust thing! The more so, since my estimates are
> perfectly compatible with those received on students.
> 2. Regarding studies of students' tricks, I remember that I have
> seen a more interesting work. If memory serves me, it was a note
> published little more than a year ago: it described some software able to
> directly identify the Web resource from which papers written by students
> had been plagiarized. Some professor told his students that their works
> would be inspected; nevertheless, the result was that over 30% of the
> papers were found to be plagiarized. (Please, does anybody have a
> reference to this work?)
> Surprisingly, this study is also disregarded by the ORI conference.
> The methodology of this work resembles to some extent that of mine --
> something like "catching fraudsters for statistics". Obviously it differs
> from the least "invasive" approach of the innocent surveys presented in
> the Science article. There is a chance that such a practice also seems
> illegal to ORI, which may think that fraudsters should be caught for
> execution purposes only. Perhaps it just suggests an easier view of
> scientific fraud, challenging the notion of it as an extremely rare and
> very serious crime.
> Dmitriy K. Yuryev
> Al Higgins wrote:
> > The Size of the Problem
> > A couple of weeks ago, ORI held a conference at which
> > "needed research" and "supportable research" were apparently
> > discussed. According to Science, some 70 participants were
> > involved in discussions of the prevalence of fraud in science.
> > Guesstimates vary on the size of the problem.
> > Here is the Science article.
> > \Marshall, Eliot. "How Prevalent Is Fraud? That's a
> > Million-Dollar Question," Science 290 (1 December 2000),
> > pp. 1662-1663.\
> > Charles Turner still doesn't know whether his experience was
> > like finding a rare bad apple in the barrel. But he is sure that
> > there was something rotten in the survey data going into his
> > federally funded study of sexual behavior. And he knows
> > that it has taken him 2 years to pluck out the spoiled fruit and
> > piece together a clean report.
> > Turner, a social scientist at City University of New
> > York/Queens College, offered his cautionary story last month
> > at a conference called by a key federal watchdog agency to
> > announce a $1 million grants program to investigate the
> > prevalence of fraud, data fabrication, plagiarism, and other
> > questionable practices in science. The 8-year-old Office of
> > Research Integrity (ORI), a small unit within the Department
> > of Health and Human Services, hopes to support studies
> > aimed at gauging the frequency of misconduct and how to
> > raise ethical standards.
> > Turner's story was the most dramatic of a series of case
> > studies presented at the ORI conference. In 1997, he
> > explained, the National Institutes of Health funded his
> > proposal to ask 1800 Baltimore residents about their sexual
> > behavior. The project, an epidemiological look at AIDS and
> > other sexually transmitted diseases such as gonorrhea and
> > chlamydia, was managed by the Research Triangle Institute
> > (RTI) of Research Triangle Park, North Carolina. Eleven
> > months into the study, Turner, who has an appointment at
> > RTI, got a call from a data collection manager who was
> > troubled by the apparent overproductivity of one interviewer.
> > A closer look revealed that the worker was faking results; the
> > address of one interview site, for example, turned out to be an
> > abandoned house. The worker was dismissed, and other
> > interviewers came under suspicion.
> > After "a horrible 6 months" pulling apart the entire
> > study, Turner and his colleagues discovered an "epidemic of
> > falsification" that they linked to a cessation of random quality
> > checks. As the schedule slipped, says Turner, some staffers
> > may have felt pressure to hurry up. Despite a "significant"
> > loss of money and time, the investigators plucked out data
> > from tainted sources, sorted the remains, and pieced together
> > a final report that has been submitted for publication.
> > Turner says the exercise taught him several hard
> > lessons, the most important being to "validate the work
> > yourself." Scientists should start analyzing survey data as
> > soon as it is submitted, he says, with a sharp eye for
> > anomalies. Turner says he doesn't know if other projects have
> > faced similar problems, because most journal articles don't
> > discuss the issue. And the incident never became public, he
> > says, because no one was ever publicly accused of
> > wrongdoing and the institute chose to avoid the risk of
> > litigation.
> > How often does misconduct like this occur? There
> > appears to be no consensus on the answer, although science
> > historian Nicholas Steneck of the University of Michigan,
> > Ann Arbor, co-chair of the conference, has drawn up a range
> > of estimates. At the low end is an estimate of 1 fraud per
> > 100,000 scientists per year. That's based on 200 official
> > federal cases that fit a narrow definition that counts only
> > fraud, data fabrication, and plagiarism, out of a community
> > of 2 million active researchers.
> > At the same time, Steneck notes that 1 in 100
> > researchers "consistently report" in surveys that they know
> > about an instance of misconduct. A broader definition
> > yields even more hands. There is a "troubling discrepancy,"
> > Steneck observed, "between public statements about how
> > 'rare' misconduct in research supposedly is and the more
> > private belief on the part of many researchers that it is
> > fairly common."
> > A study of students at one campus suggests that the
> > practice of massaging data is common, but the behavior
> > decreases as students advance toward a career in science.
> > Biologist Elizabeth Davidson and colleagues at Arizona State
> > University in Tempe asked students in seven introductory
> > biology and zoology courses whether they manipulated lab
> > data to obtain desired results. A huge majority - 84% to 91%
> > - admitted to manipulating lab data "almost always" or
> > "often." Most said they did this to get a better grade. Other
> > studies, however, show that the willingness to fake data
> > declines sharply as students move on to graduate and
> > professional-level work, leading Davidson to speculate that
> > their behavior improves as the research becomes more
> > important to them personally.
> > Some institutions have attempted to remedy the
> > problem of scientific misconduct with special education
> > programs. The University of Minnesota, for example,
> > reported on an ambitious ethics training program at the
> > medical school that in 1 year spent $500,000 on 60 workshops
> > and signed up 2200 researchers as participants. But Steneck
> > and others say that it's hard to measure the effectiveness of
> > such training, and that the results to date are meager.
> > A study of 172 University of Texas students enrolled in
> > a "responsible conduct of research" course, for example,
> > found "no significant change" in attitudes after training, says
> > Elizabeth Heitman of the University of Texas School of
> > Public Health in Houston. The finding is consistent with what
> > Steneck has seen, including a 1996 study that found that
> > people who had gone through a training course were actually
> > more willing to grant "honorary authorship" to colleagues
> > who had not performed research than were those who had
> > not been trained.
> > ORI director Chris Pascal says his office has received
> > several favorable comments about the new grants program
> > and that 70 scientists interested in the topic showed up last
> > month for an ORI workshop on how to apply for biomedical
> > research grants. The first round of winners will be announced
> > next year.