
The use of the phrase “data not shown” in dental research

  • Eero Raittio,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Resources, Software, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    Affiliation Institute of Dentistry, University of Eastern Finland, Kuopio, Finland

  • Ahmad Sofi-Mahmudi,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Resources, Software, Supervision, Validation, Visualization, Writing – original draft, Writing – review & editing

    a.sofimahmudi@gmail.com, sofimahmudi@research.ac.ir

    Affiliations Cochrane Iran Associate Centre, National Institute for Medical Research Development (NIMAD), Tehran, Iran, Seqiz Health Network, Kurdistan University of Medical Sciences, Seqiz, Kurdistan, Iran

  • Erfan Shamsoddin

    Roles Writing – review & editing

    Affiliation Cochrane Iran Associate Centre, National Institute for Medical Research Development (NIMAD), Tehran, Iran

Abstract

Objective

The use of phrases such as “data/results not shown” is deemed an obscure way to present scientific findings. Our aim was to investigate how frequently papers published in dental journals use these phrases and, for papers published in 2021, what kinds of results the authors referred to with them.

Methods

We searched the Europe PubMed Central (PMC) database for open-access articles from studies published in PubMed-indexed dental journals until 31 December 2021. We searched the full texts for the phrases “data/results not shown” and then calculated the proportion of articles with the phrases among all available articles. For studies published in 2021, we evaluated whether the phrases referred to confirmatory results, negative results, peripheral results, sensitivity analysis results, future results, or other/unclear results. Journal- and publisher-related differences in publishing studies with the phrases in 2021 were tested with Fisher’s exact test using R v4.1.1.

Results

The percentage of studies with the relevant phrases among all studies in the database decreased from 13% to 3% between 2010 and 2020. In 2021, 67 (2.8%) of the 2,434 studies published in 73 different journals used the phrases; these came from 22 journals and eight publishers. Potential journal- and publisher-related differences in publishing studies with the phrases were detected in 2021 (p = 0.001 and p = 0.005, respectively). Most commonly, the phrases referred to negative (n = 16, 24%), peripheral (n = 22, 33%) or confirmatory (n = 11, 16%) results. The significance of the unpublished results to which the phrases referred varied considerably across studies.

Conclusion

Over the last decade, there has been a marked decrease in the use of the phrases “data/results not shown” in dental journals. However, the phrases were still notably in use in dental studies in 2021, despite the good availability of accessible free online supplements and repositories.

Introduction

The foundation of science and research is sustainable, valid and reliable when results are available to be tested, replicated and reproduced [1–4]. Open access articles, data and code sharing, funding and conflict-of-interest disclosures, and detailed descriptions of materials, methods and results are great facilitators of open science [4–6]. However, studies suggest that published results may frequently be non-reproducible, meaning the findings are difficult or impossible to reproduce [5, 7–9]. Fortunately, open science practices have been increasingly adopted in biomedical research over the last decades [5, 10]. For instance, some journals have adopted compulsory data and code availability statements and abandoned strict word, table and figure limits. Additionally, online repositories and scientific publishers’ online supplements to articles have made it easy to share data, code and documents for free. The Open Science Framework (OSF, www.osf.io), figshare (www.figshare.com) and GitHub (www.github.com) are examples of online platforms where one can manage project information, share data and code, and archive materials at no cost.

Frequently, phrases such as “data not shown” or “results not shown” are used to refer to unpublished results. It has been assumed that such results may relate to confirmatory analyses (similar results published elsewhere), negative results, peripheral results (not directly related to the topic), sensitivity analyses or future results (e.g., results related to a manuscript in preparation) [11]. However, we are unaware of any systematically conducted study investigating what kinds of unpublished results the phrases actually refer to. Using such phrases can be seen as problematic for multiple reasons. If researchers report a considerable amount of results, or important results, with such phrases and without sharing the results, data or code, independent interpretation and verification of the results become far harder or even impossible [4, 5]. In other words, the use of the phrases obscures transparency, hampers reproducibility and weakens the peer review process [12]. Focusing on statistically significant results and neglecting negative ones is a major driver of publication bias, and may also threaten the reproducibility of scientific results [13]. In addition, proper interpretation of sensitivity analyses requires that modelling modifications, parameters and sensitivity results are adequately reported [14, 15]. Results not considered important for the purposes of the current study may be crucial for conducting a systematic review or meta-analysis on a closely related topic.

However, it remains unknown how frequently published studies use such phrases to refer to unpublished results in the current era of open access online journals and free online repositories and supplements. Therefore, we aimed to investigate how frequently papers published in dental journals used the phrases “data/results not shown”. To describe current practices, we also examined what kinds of results the authors referred to with these phrases in the studies published in 2021, and we examined the data sharing statements and the data, supplement and code availability of the studies that used them.

Materials and methods

Protocol registration

We shared the protocol for this study on OSF on 26 September 2021 (osf.io/5zryu). All code and data are also available at osf.io/5zryu. Deviations from the protocol are listed in S1 Text.

Bibliographic search

We conducted searches in the Europe PubMed Central (PMC) database, which contained over seven million full-text articles at the time of writing. First, we searched all PMC open access (PMCOA) articles published in PubMed-indexed dental journals in the database until 31 December 2021. Then, we searched for the phrases “data not shown” and “results not shown” in the PMCOA articles published in the same journals until the same date.
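The authors’ exact queries are available on OSF (osf.io/5zryu). As a rough, hypothetical sketch of the kind of search involved, the Europe PMC REST API can be queried for open-access full texts containing a phrase. The field names below (BODY, OPEN_ACCESS, JOURNAL, PUB_YEAR) follow Europe PMC’s published query syntax, but the journal name and parameter values are illustrative assumptions, not the study’s actual queries.

```python
from urllib.parse import urlencode

# Europe PMC search endpoint (documented REST API).
BASE = "https://www.ebi.ac.uk/europepmc/webservices/rest/search"

def build_query(phrase: str, journal: str, year: int) -> str:
    """Build a search URL for open-access full texts containing `phrase`
    in a given journal and publication year (illustrative sketch only)."""
    query = (
        f'BODY:"{phrase}" AND OPEN_ACCESS:Y '
        f'AND JOURNAL:"{journal}" AND PUB_YEAR:{year}'
    )
    return BASE + "?" + urlencode({"query": query, "format": "json", "pageSize": 100})

# Hypothetical example; the actual study queried all PubMed-indexed dental journals.
url = build_query("data not shown", "BMC Oral Health", 2021)
```

Fetching this URL would return a JSON page of matching records, whose `hitCount` field gives the number of matching articles.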

Selection of 2021 subsample

From those searches, we selected studies published in 2021. For studies that the search flagged as containing the phrases, we manually confirmed whether the phrase(s) referred to unreported/unpublished results. If they did, we included them for further analysis.

Data extraction from the 2021 subsample

From the studies published in 2021 with the “data/results not shown” phrases, we documented whether the study shared data, code or online supplementary materials/appendices on the journal website or via other platforms. Then, we categorised the studies based on whether the phrases referred to confirmatory results, negative results, peripheral results, sensitivity analysis results, future results or other/unclear categories (Table 1) [11]. We also searched the websites of all journals that had published at least one paper in the subsample for information about publishing free online supplementary materials (yes unlimited, yes limited, unclear, no). We looked up the publisher of each journal in Publons (and, for a sensitivity analysis, also in SCImago and the National Library of Medicine, NLM, Catalog). All data extractions from full texts were performed first by one author and confirmed by another; discrepancies were resolved through discussion.

Table 1. Definitions of phrase types with examples, adapted from [11].

https://doi.org/10.1371/journal.pone.0272695.t001

Data synthesis and analyses

For each publication year, we calculated the percentage of studies with the “data/results not shown” phrases among the total number of PMCOA articles published in the same dental journals during that year. We also searched for the phrases in all PubMed-indexed PMCOA articles and recorded the number of hits for each year for comparison with the PMCOA articles from dental journals. We reported our findings with simple descriptive tables and figures, and provided some examples of how the phrases were used. Journal- and publisher-related differences in the number of studies with the phrases, relative to the total number of PMCOA articles from each journal or publisher in 2021, were tested using Fisher’s exact test with Monte Carlo simulations (100 000 replications). We used R v4.1.1 (2021-08-10, R Foundation for Statistical Computing, Vienna, Austria; http://www.R-project.org) for statistical analysis.
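The study ran R’s Fisher’s exact test with Monte Carlo simulation. As a language-neutral illustration of the underlying idea, the sketch below implements a hypothetical permutation analogue in Python: it conditions on the table margins by shuffling article-level phrase flags and uses a chi-square statistic (rather than the hypergeometric table probability that R’s `fisher.test` uses), so it approximates, but is not identical to, the authors’ analysis.

```python
import random

def perm_table_test(counts_with, counts_total, n_perm=2000, seed=1):
    """Monte Carlo permutation test for an R x 2 contingency table:
    are the per-journal proportions of articles containing the phrase
    compatible with one common proportion?

    counts_with[j]  -- articles with the phrase in journal j
    counts_total[j] -- all PMCOA articles from journal j
    """
    rng = random.Random(seed)
    # One 0/1 flag per article, grouped journal by journal.
    flags = []
    for w, t in zip(counts_with, counts_total):
        flags += [1] * w + [0] * (t - w)
    p = sum(flags) / len(flags)  # common proportion under the null

    def chi2(order):
        # Chi-square statistic of the journal x phrase table implied
        # by this ordering of the article flags.
        stat, start = 0.0, 0
        for t in counts_total:
            obs = sum(order[start:start + t])
            exp = p * t
            stat += (obs - exp) ** 2 / exp + (exp - obs) ** 2 / (t - exp)
            start += t
        return stat

    observed = chi2(flags)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(flags)  # shuffling flags keeps both table margins fixed
        if chi2(flags) >= observed - 1e-12:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Illustrative (made-up) counts for two journals:
p_journal = perm_table_test([10, 1], [100, 100])  # clearly unequal rates
p_null = perm_table_test([5, 5], [100, 100])      # identical rates
```

With clearly unequal per-journal rates the permutation p-value is small, while identical rates give a p-value of 1, mirroring how the journal- and publisher-level tables were tested in the study.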

Results

Overall perspective

The search from the Europe PMC database identified 21 217 unique PMCOA articles from 116 different dental journals until 31 December 2021. Of these, the search for “data/results not shown” phrases produced 1 474 unique records from 70 various dental journals.

As the total number of PMCOA dental articles was low before 2010, the percentage of studies with the phrases fluctuated considerably. As the number of available dental articles increased, the percentage stabilised at around 13% by 2010 and then decreased to approximately 3% by 2020 (Fig 1A).

Fig 1.

The number of available PubMed Central open access articles with “data/results not shown” phrase from dental (A) and all PubMed-indexed PMCOA (B) journals by publication year (bars), and percentage of articles with at least one “data/results not shown” phrase within each year (line).

https://doi.org/10.1371/journal.pone.0272695.g001

Amongst all PubMed-indexed PMCOA articles, the percentages were clearly higher than in dental journals in every study year. The percentage of articles with the phrases decreased from 35% in 1990 to 6% in 2020 (Fig 1B). In both samples, the downward trend steepened from the late 2000s onwards.

2021 subsample

In 2021, the search identified 2 434 unique PMCOA articles from 73 dental journals and 22 publishers. Of these, the search identified 67 PMCOA articles (from 22 different journals and eight publishers) containing at least one of the “data/results not shown” phrases. In the full-text review, all 67 were confirmed to include the phrase(s). Thus, 2.8% of the full texts in 2021 referred to unpublished results with the phrases. Fisher’s exact test showed a p-value of <0.001 for the journal-related and 0.002 for the publisher-related differences in the percentages of studies with the phrase(s) among the total number of available PMCOA articles from each journal or publisher in 2021 (Tables 2 and 3).

Table 2. The number of available open access articles from Europe PubMed Central and the number of articles with “data/results not shown” phrases in each journal.

https://doi.org/10.1371/journal.pone.0272695.t002

Table 3. The number of available open access articles from Europe PubMed Central and the number of articles with “data/results not shown” phrases in journals of each publisher.

https://doi.org/10.1371/journal.pone.0272695.t003

Of those 67 studies, the phrases most often related to peripheral (n = 22, 33%), negative (n = 16, 24%) or confirmatory (n = 11, 16%) results. A few referred to sensitivity analysis results (n = 4, 6%). In some cases, however, it was difficult to evaluate what results the phrase referred to (n = 14, 21%). Nineteen studies used the phrase(s) multiple times (twelve studies twice, six three times, one four times).

Some authors seemed to use the phrase merely to indicate that certain numbers were not presented in tables or figures, while providing the data or results in words in the same sentence; for example: “The ratio of patients showing a history of head and neck cancer (19/47 vs. 14/97, P = 0.0007, data not shown)…” [21].

However, in some articles, notable conclusions were made based on data not shown. For instance, in a study investigating the association of preventive dental care to healthcare outcomes, tooth extractions and endodontic treatments were given considerable attention in terms of methodological decisions, results and their interpretation, but for restorative treatments, it was just stated in the discussion that, “Furthermore, we examined the effect of receiving restorative dental care on health outcomes, but no associations were seen (data not shown)” [22]. Other examples of how the phrases were used in the studies are provided in Table 1.

Thirty-six of the studies (54%) included a data sharing statement. Three studies shared data and no study shared code, while 27 studies (40%) included supplementary material. Our search of the websites of the 22 journals that published at least one study with the phrase(s) showed that 20 publish online supplementary materials attached to research articles for free and without limits. For the remaining two journals (The Angle Orthodontist and Medicina Oral, Patologia Oral, Cirugia Bucal), we could not find this information on their websites. A sensitivity analysis for publisher-related differences is available in S2 Text.

Discussion

In agreement with promising trends in open science practices in biomedical studies over the last decades [5], we found that the proportion of PMCOA articles in dental journals that used the “data/results not shown” phrases had decreased significantly, from over 10% to approximately 3%, during the last decade.

We also investigated the use of the phrases in 2021 in more detail. In recent years, researchers have had ample opportunities to share all kinds of data via numerous free, accessible platforms. Of all PMCOA articles published in dental journals in 2021, 67 (2.8%) studies from 22 different journals used the “data/results not shown” phrases to refer to unpublished results. We found differences in the use of the phrases between dental journals and between publishers. Most commonly, the phrases referred to negative, peripheral or confirmatory results. The significance of the unpublished results to which the phrases referred varied considerably across studies. Of the 67 studies, three shared raw data and none shared code.

Our findings showed a decreasing trend in PubMed-indexed PMCOA articles with the phrases “data/results not shown” from 1999 onwards, with a steeper decline from 2008. This occurred after the publication of data availability editorials in Nature journals, starting from 2006 [23–25]. Thereafter, several publications elaborated on concerns about data availability and the reproducibility of results [26–29]. In 2016, these concerns were translated into a policy mandating a statement on whether and how others can access the underlying data for all research papers accepted for publication in Nature [30].

Some reasons for using the phrases to refer to unpublished results can be postulated. First, the pressure to publish and the minimisation of work may play a role, as they are also thought to underlie other poor scientific practices [31, 32]; in short, the results or data are not seen as worth publishing. Secondly, some results or data may be hard or impossible to share. Thirdly, as we showed, some authors used the phrase only to indicate that the results were not given in table or graphical format but in the text, so the results or data were not actually unpublished. However, an important reason, in our view, is that the transparency of science has not been given the value it deserves in scientific practice. On the positive side, authors using such phrases at least make it honestly clear that the data or results are not shared.

Many of the studied PMCOA articles were from open access dental journals indexed in PubMed (such as BMC Oral Health and Clinical and Experimental Dental Research). Nieminen and Uribe [33] showed that in non-predatory (legitimate and indexed by established databases) open access dental journals, the presentation of results (particularly in tables and figures) was poorer than in more visible subscription-based dental journals, but still better than in predatory (non-indexed) dental journals from predatory publishers. Since referring to “data not shown” is evidently an obscure way of presenting results, its use may be related to how results are presented in these studies in general.

Our findings showed that the proportion of all PubMed-indexed PMCOA articles with the “data/results not shown” phrases was considerably higher than in PubMed-indexed dental journals. Whereas the decreasing trend was evident in both, all PubMed-indexed PMCOA articles had a two-fold proportion in 2021 compared with dental journals. This potentially implies subject-related differences in the use of the phrases; investigating these differences in a further study could provide a better picture of the current situation.

Implications for research policy

Solutions to advance the movement towards open science by abandoning “data/results not shown” can be postulated. The strictest solution would be banning the use of these phrases, accompanied by editorial requests to provide the data or results not shown, as some journals and publishers have done [12]. However, a more sustainable solution could be wider adoption of open science practices, particularly since free data and code sharing have remained rare in the biomedical literature over the last decades [5]. In open access journals advocating for more open science [34, 35], the obscure presentation of results, as well as the failure to share data or code, should not be overlooked by publishers or editorial teams. For instance, during the study, we noted one article that had used the raw data availability statement template without any changes: “The data that support the findings of this study are openly available in [repository name e.g., “figshare”] at http://doi.org/[doi], reference number [reference number]” [36]. Thus, it seems that despite data availability statements being mandatory in some journals, the actual content of the statement does not always receive careful consideration. Evidently, we need more commitment to open science principles from all stakeholders in science.

Limitations

First, it is evident that the use of these phrases is not the only way to refer to unpublished results or data. Secondly, it is unknown how well PMCOA articles from dental journals represent the wider dental literature, because subscription-based journals are underrepresented in the database of open access articles. However, at least in terms of transparency indicators, the differences between PubMed-indexed and PMCOA articles might be small [37]. In addition, the composition of the PMCOA database varies over time due to changes in open access practices and differences in how soon after publication a journal’s articles are made available to the database (ncbi.nlm.nih.gov/pmc/journals). Further, the total number of articles included all types of papers (commentaries, letters, etc.); hence, the proportion of studies with the phrases might have been higher had solely research articles been considered. Although the analysis of PMCOA articles published in 2021 showed that the search retrieved no false positives (studies without the phrases), we cannot be sure about the actual false-positive rate before 2021 or about the false-negative rate (missed studies with the phrases) of our identification strategy. Additionally, owing to the large and heterogeneous sample of studies, we were unable to assess how the use of the phrases related to other critical characteristics of the studies, e.g., risks of bias. Finally, we do not know whether reviewers or editors saw the results to which “data not shown” referred during peer review, which could justify the use of the phrase to some extent. However, editor experiences and studies have shown that (raw) data supporting the findings of a study may not be shared despite reasonable requests, and sometimes the data provided do not support the conclusions drawn from them [38, 39].

Conclusions

We showed that a marked decrease in the use of the selected phrases occurred in PMCOA articles published in PubMed-indexed dental and other journals over the last decades. However, researchers have not completely abandoned the outdated caveat of “data/results not shown”, which was still in use in 2021. Researchers, reviewers, editorial teams and publishers all share the responsibility of further promoting and adopting open science practices, including providing all results, data and code, whenever possible, in a freely accessible online format; fortunately, that is possible today.

Supporting information

S2 Text. Sensitivity analysis for publisher-related differences.

https://doi.org/10.1371/journal.pone.0272695.s002

(DOCX)

References

  1. Goodman SN, Fanelli D, Ioannidis JPA. What does research reproducibility mean? Sci Transl Med. 2016;8: 341ps12. pmid:27252173
  2. McNutt M. Reproducibility. Science. 2014;343: 229. pmid:24436391
  3. Nosek BA, Alter G, Banks GC, Borsboom D, Bowman SD, Breckler SJ, et al. Promoting an open research culture. Science. 2015;348: 1422–1425. pmid:26113702
  4. Munafò MR, Nosek BA, Bishop DVM, Button KS, Chambers CD, Percie du Sert N, et al. A manifesto for reproducible science. Nat Hum Behav. 2017;1: 0021. pmid:33954258
  5. Serghiou S, Contopoulos-Ioannidis DG, Boyack KW, Riedel N, Wallach JD, Ioannidis JPA. Assessment of transparency indicators across the biomedical literature: How open is open? Bero L, editor. PLOS Biol. 2021;19: e3001107. pmid:33647013
  6. Vicente-Saez R, Gustafsson R, Van den Brande L. The dawn of an open exploration era: Emergent principles and practices of open science and innovation of university research teams in a digital world. Technol Forecast Soc Change. 2020;156: 120037.
  7. Camerer CF, Dreber A, Holzmeister F, Ho T-H, Huber J, Johannesson M, et al. Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nat Hum Behav. 2018;2: 637–644. pmid:31346273
  8. Leek JT, Jager LR. Is Most Published Research Really False? Annu Rev Stat Its Appl. 2017;4: 109–122.
  9. Nosek BA, Errington TM. Making sense of replications. eLife. 2017;6: e23383. pmid:28100398
  10. Naudet F, Sakarovitch C, Janiaud P, Cristea I, Fanelli D, Moher D, et al. Data sharing and reanalysis of randomized controlled trials in leading biomedical journals with a full data sharing policy: survey of studies published in The BMJ and PLOS Medicine. BMJ. 2018; k400. pmid:29440066
  11. Panter M. “Data Not Shown” - 4 Reasons to Omit a Figure or Table. In: AJE Scholar [Internet]. [cited 13 Oct 2021]. Available: https://www.aje.com/arc/data-not-shown-4-reasons-omit-figure-or-table/
  12. Data shown. Nat Chem Biol. 2008;4: 575.
  13. Amrhein V, Korner-Nievergelt F, Roth T. The earth is flat (p > 0.05): significance thresholds and the crisis of unreplicable research. PeerJ. 2017;5: e3544. pmid:28698825
  14. Fox MP, Lash TL. On the Need for Quantitative Bias Analysis in the Peer-Review Process. Am J Epidemiol. 2017;185: 865–868. pmid:28430833
  15. Lash TL, Fox MP, MacLehose RF, Maldonado G, McCandless LC, Greenland S. Good practices for quantitative bias analysis. Int J Epidemiol. 2014;43: 1969–1985. pmid:25080530
  16. Blank E, Grischke J, Winkel A, Eberhard J, Kommerein N, Doll K, et al. Evaluation of biofilm colonization on multi-part dental implants in a rat model. BMC Oral Health. 2021;21: 313. pmid:34144677
  17. Al Hugail AM, Mealey BL, Walker C, Al Harthi S, Duong M, Noujeim M, et al. Evaluation of healing at molar extraction sites with ridge preservation using a non‐resorbable dense polytetrafluoroethylene membrane: A four‐arm cohort prospective study. Clin Exp Dent Res. 2021;7: 1103–1111. pmid:34096195
  18. Sharma P, Fenton A, Dias IHK, Heaton B, Brown CLR, Sidhu A, et al. Oxidative stress links periodontal inflammation and renal function. J Clin Periodontol. 2021;48: 357–367. pmid:33368493
  19. Kassem El Hajj H, Fares Y, Abou-Abbas L. Assessment of dental anxiety and dental phobia among adults in Lebanon. BMC Oral Health. 2021;21: 48. pmid:33541354
  20. Rizzato VL, Lotto M, Lourenço Neto N, Oliveira TM, Cruvinel T. Digital surveillance: The interests in toothache-related information after the outbreak of COVID-19. Oral Dis. 2021. pmid:34448289
  21. Yang S-W, Lee Y-S, Chang L-C, Yang C-H, Luo C-M, Wu P-W. Oral tongue leukoplakia: analysis of clinicopathological characteristics, treatment outcomes, and factors related to recurrence and malignant transformation. Clin Oral Investig. 2021;25: 4045–4058. pmid:33411001
  22. Lamster IB, Malloy KP, DiMura PM, Cheng B, Wagner VL, Matson J, et al. Dental Services and Health Outcomes in the New York State Medicaid Program. J Dent Res. 2021;100: 928–934. pmid:33880960
  23. Nothing to hide. Nat Cell Biol. 2006;8: 541. pmid:16738696
  24. Compete, collaborate, compel. Nat Genet. 2007;39: 931. pmid:17660804
  25. Got data? Nat Neurosci. 2007;10: 931. pmid:17657230
  26. Data producers deserve citation credit. Nat Genet. 2009;41: 1045. pmid:19786949
  27. It’s not about the data. Nat Genet. 2012;44: 111. pmid:22281761
  28. It’s good to share. Nat Phys. 2014;10: 463. pmid:26955073
  29. Laine C, Goodman SN, Griswold ME, Sox HC. Reproducible Research: Moving toward Research the Public Can Really Trust. Ann Intern Med. 2007;146: 450. pmid:17339612
  30. Announcement: Where are the data? Nature. 2016;537: 138. pmid:27604913
  31. Gopalakrishna G, ter Riet G, Vink G, Stoop I, Wicherts JM, Bouter LM. Prevalence of questionable research practices, research misconduct and their potential explanatory factors: A survey among academic researchers in The Netherlands. Fàbregues S, editor. PLOS ONE. 2022;17: e0263023. pmid:35171921
  32. Shamsoddin E, Torkashvand-Khah Z, Sofi-Mahmudi A, Janani L, Kabiri P, Shamsi-Gooshki E, et al. Assessing research misconduct in Iran: a perspective from Iranian medical faculty members. BMC Med Ethics. 2021;22: 74. pmid:34154574
  33. Nieminen P, Uribe SE. The Quality of Statistical Reporting and Data Presentation in Predatory Dental Journals Was Lower Than in Non-Predatory Journals. Entropy. 2021;23: 468. pmid:33923391
  34. Tennant JP, Waldner F, Jacques DC, Masuzzo P, Collister LB, Hartgerink Chris HJ. The academic, economic and societal impacts of Open Access: an evidence-based review. F1000Research. 2016;5: 632. pmid:27158456
  35. Laakso M, Björk B-C. Anatomy of open access publishing: a study of longitudinal development and internal structure. BMC Med. 2012;10: 124. pmid:23088823
  36. Sereti M, Roy M, Zekeridou A, Gastaldi G, Giannopoulou C. Gingival crevicular fluid biomarkers in type 1 diabetes mellitus: A case–control study. Clin Exp Dent Res. 2021;7: 170–178. pmid:33369174
  37. Wallach JD, Boyack KW, Ioannidis JPA. Reproducible research practices, transparency, and open access data in the biomedical literature, 2015–2017. Dirnagl U, editor. PLOS Biol. 2018;16: e2006930. pmid:30457984
  38. Miyakawa T. No raw data, no science: another possible source of the reproducibility crisis. Mol Brain. 2020;13: 24. pmid:32079532
  39. Errington TM, Denis A, Perfito N, Iorns E, Nosek BA. Challenges for assessing replicability in preclinical cancer biology. eLife. 2021;10: e67995. pmid:34874008