
Widespread misidentification of scanning electron microscope instruments in the peer-reviewed materials science and engineering literature
Abstract
Materials science and engineering (MSE) research has, for the most part, escaped the doubts raised about the reliability of the scientific literature by recent large-scale replication studies in psychology and cancer biology. However, users of post-publication peer review sites have recently identified dozens of articles in which the make and model of the scanning electron microscope (SEM) listed in the text do not match the instrument metadata visible in the published images. To systematically investigate this potential risk to the MSE literature, we develop a semi-automated approach that scans published figures for this metadata and checks it against the SEM instrument identified in the text. Starting from an exhaustive set of 1,067,108 articles published since 2010 in 50 journals with impact factors ranging from 2 to 24, we identify 11,314 articles in which the SEM manufacturer and model are visible in an image's metadata. For 21.2% of those articles, the image metadata does not match the SEM manufacturer or model listed in the text; for another 24.7%, at least some of the instruments used in the study are not reported. We find that articles with SEM misidentification are more likely than other MSE articles to have existing reports of improprieties on the post-publication peer review site PubPeer, and that, within a subset of these articles in the subfield of electrochemistry, articles featuring SEM misidentification are more likely to incorrectly estimate the optical band gap. This suggests that SEM misidentification may be a tractable signature for flagging problematic MSE articles. Unexplained patterns common to many of these articles suggest the involvement of paper mills, organizations that mass-produce, sell authorship on, and publish fraudulent scientific manuscripts at scale.
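As an illustration of the check described above, the sketch below shows one way such a comparison could be automated. It is a minimal example and not the authors' actual pipeline: it assumes SEM figures carry a burned-in metadata banner along the bottom of the image, OCRs that strip with pytesseract, and looks for known manufacturer names in both the banner and the article text. The file paths, banner fraction, and manufacturer list are all illustrative assumptions.

```python
import re
from PIL import Image
import pytesseract  # requires a local Tesseract OCR installation

# Illustrative list; a real pipeline would also match model strings
# (e.g. "JSM-6700F", "SUPRA 55") associated with each manufacturer.
MANUFACTURERS = ["JEOL", "Zeiss", "Hitachi", "FEI", "Tescan", "Phenom"]

def manufacturers_in(text):
    """Return the set of known SEM manufacturers mentioned in a string."""
    found = set()
    for name in MANUFACTURERS:
        if re.search(r"\b" + re.escape(name) + r"\b", text, re.IGNORECASE):
            found.add(name)
    return found

def banner_manufacturers(image_path, banner_fraction=0.12):
    """OCR the bottom strip of an SEM image, where the metadata banner
    (instrument, magnification, accelerating voltage) is typically burned in."""
    img = Image.open(image_path).convert("L")
    width, height = img.size
    strip = img.crop((0, int(height * (1 - banner_fraction)), width, height))
    return manufacturers_in(pytesseract.image_to_string(strip))

def check_article(figure_paths, methods_text):
    """Compare instruments visible in figures with those reported in the text."""
    in_images = set()
    for path in figure_paths:
        in_images |= banner_manufacturers(path)
    in_text = manufacturers_in(methods_text)
    return {
        # text names an instrument, but the images show a different one
        "mismatch": bool(in_text) and bool(in_images - in_text),
        # an instrument is visible in the images but never reported in the text
        "unreported": bool(in_images) and not in_text,
    }

# Hypothetical usage:
# flags = check_article(["fig2a.png", "fig3.png"], open("methods.txt").read())
```

Because OCR of low-resolution banners is error-prone, any such sketch would still require human verification of borderline reads, consistent with the semi-automated approach the abstract describes.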