Late last month, a US physicist began a jail sentence for scientific fraud. Darin Kinion took funds for research on quantum computing but did not carry out the work he claimed; instead, he invented the data that the research supposedly produced.
Scientists like to think that such blatant dishonesty is rare, but I myself have witnessed several serious cases of scientific misconduct, from major data manipulation to outright fabrication. Most have gone unpunished; indeed, it has been disheartening to see the culprits lauded. It makes little sense for fraudsters to fabricate mediocre data. Their falsehoods generate outstanding stories, which result in high-profile publications and a disproportionately large chunk of the funding.

I have noticed a lesser-known motive for bad science in my field, experimental biology. As environmental change proceeds, there is great demand from the public and policymakers for simple stories that show the damage being done to wildlife. I occasionally meet scientists who argue that the questions we ask and the stories we tell are more important than the probity of our investigations: the end justifies the means, even if the means lead to data fabrication. That view is alarmingly misguided and has no place in science. The undeniable anthropogenic impacts on wildlife must be investigated with strict scientific rigour.
One reason some scientists can get away with questionable practices is that the scientific system is based on trust. The burden of proof is on those who suspect and report misconduct. Unless there is overwhelming evidence to the contrary, scientists are believed to have done what they say they did. If the community is serious about tackling misconduct, this must change. It is time to shift the burden of proof onto those who produce the results.
In some fields, this proof is often implicit in how scientists collect and report data. Detailed evidence may be provided by the outputs of mostly autonomous equipment. Access to all the raw, unmanipulated data files, as increasingly demanded by journals and peers across disciplines, may be enough.
Science that relies on human observation of remote fieldwork, and on trials that are difficult to replicate precisely, such as studies of animal behaviour, needs a different approach. Put simply, researchers should routinely film their experiments and present the footage to journal editors, reviewers and colleagues alongside their data and analyses. In some disciplines, such as ornithology, photo or audio files may provide better evidence than video.
If extreme athletes can use self-mounted cameras to record their wildest adventures during mountaintop blizzards, scientists have little excuse not to record what goes on in lab and field studies.
Yes, visual evidence can be faked, but a few simple safeguards should be enough to prevent that. Take a typical experiment in my field: using a tank of flowing water to expose fish to environmental perturbations and looking for shifts in behaviour. It is trivial to set up a camera, and equally simple to begin each recorded exposure with a note that details, for example, the trial number and treatment history of the organism. (Think of how film directors use clapper boards to keep records of the sequence of numerous takes.) This simple measure would make it much more difficult to fabricate data and ‘assign’ animals to desired treatment groups after the results are known.
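To make the clapper-board idea concrete, here is a minimal sketch (my own illustration, not a prescribed protocol) of how trial assignments could be fixed and logged before filming begins; the animal IDs, group labels and file name are hypothetical.

```python
# Sketch: pre-assign animals to treatment groups and write a timestamped log
# that can be read out on camera at the start of each recorded trial.
import csv
import random
from datetime import datetime, timezone

animal_ids = [f"fish_{i:03d}" for i in range(1, 25)]   # hypothetical IDs
treatments = ["control", "elevated_CO2"]               # hypothetical groups

random.shuffle(animal_ids)  # randomise once, before any trial is run

assignments = [
    (animal, treatments[i % len(treatments)])
    for i, animal in enumerate(animal_ids)
]

with open("trial_assignments.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp_utc", "trial", "animal_id", "treatment"])
    stamp = datetime.now(timezone.utc).isoformat()
    for trial, (animal, treatment) in enumerate(assignments, start=1):
        writer.writerow([stamp, trial, animal, treatment])
```

Because the assignment file is created, and filmed, before any behaviour is scored, animals cannot quietly be reassigned to a more convenient group after the fact.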
My colleagues and I are currently using this approach to record studies of how coral-reef fish respond to dissolved carbon dioxide. There would also be benefits for other disciplines, including social-psychology studies based on direct observations.
Sharing visual evidence is straightforward. Video files can be compressed and transferred without excessive loss of resolution. Files can then be uploaded to free data repositories (such as figshare or Zenodo) before manuscripts are submitted for publication. Notably, the online supplementary material of most journals allows for 10–150 MB of storage to accommodate images and detailed descriptions of methodology.
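As a rough illustration of the file sizes involved, the sketch below (assuming the widely used ffmpeg tool is installed; the file names and the 150 MB limit are hypothetical) re-encodes a trial recording and checks whether it fits a typical supplementary-material quota.

```python
# Sketch: compress a trial recording with ffmpeg and check it against a
# hypothetical 150 MB supplementary-material limit before upload.
import os
import subprocess

SOURCE = "trial_042_raw.mov"          # hypothetical raw recording
OUTPUT = "trial_042_compressed.mp4"   # hypothetical compressed copy
LIMIT_MB = 150

# H.264 at a constant rate factor of ~28 keeps files small while the
# clapper-board notes and the animals' movements remain clearly legible.
subprocess.run(
    ["ffmpeg", "-i", SOURCE, "-c:v", "libx264", "-crf", "28",
     "-preset", "slow", OUTPUT],
    check=True,
)

size_mb = os.path.getsize(OUTPUT) / (1024 * 1024)
print(f"{OUTPUT}: {size_mb:.1f} MB "
      f"({'within' if size_mb <= LIMIT_MB else 'exceeds'} the {LIMIT_MB} MB limit)")
```

Recordings that exceed a journal's quota can go to figshare or Zenodo instead, with the resulting DOI cited in the manuscript.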
There is more to this than preventing misconduct. Visual evidence can help reviewers (before and after publication) to spot problems that are not obvious from written descriptions and diagrams. Software could help to quantify behavioural features in recorded experiments and mitigate experimenter biases. Plus, scientists who know that their equipment and techniques will be on display will try harder to improve them.
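For instance, the outline below (a minimal sketch assuming the open-source OpenCV library and a fixed camera above the flow tank; the file name and parameter values are hypothetical) extracts a fish's position frame by frame, so that activity or side preference can be scored by software rather than by eye.

```python
# Sketch: track the position of a single fish in a fixed-camera recording
# using background subtraction, so behaviour can be scored automatically.
import cv2  # OpenCV

capture = cv2.VideoCapture("trial_042_compressed.mp4")  # hypothetical file
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=25)

positions = []  # (frame_index, x, y) centroid of the largest moving object
frame_index = 0
while True:
    ok, frame = capture.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        m = cv2.moments(largest)
        if m["m00"] > 0:
            positions.append((frame_index, m["m10"] / m["m00"], m["m01"] / m["m00"]))
    frame_index += 1
capture.release()
```

Trajectories extracted in this way can be analysed and shared alongside the raw footage, making it harder for conscious or unconscious bias to creep into the scoring.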
The best way to implement these changes is for academic journals to start mandating visual (and audio) evidence to support a submitted paper. As far as I am aware, no journals routinely do this. Journals must also ensure that their stated requirements are adhered to.
Surveys suggest that I am not unusual in witnessing fraud: some 14% of scientists say that they have witnessed it, too. Although it would be simpler to turn a blind eye to this issue and move on, this situation inhibits so many aspects of scientific progress that I feel compelled to try to fix it. The added logistical difficulties of providing visual evidence are a small price to pay to tackle dishonesty and greatly reduce the number of irreproducible (and often poorly conducted) studies. Mandatory visual evidence will undoubtedly help to recoup some of the tens of billions of dollars wasted on irreproducible research every year. In short, show us your science.