
Attitudes and practices of open data, preprinting, and peer-review—A cross sectional study on Croatian scientists

Abstract

Attitudes towards open peer review, open data and use of preprints influence scientists' engagement with those practices, yet there is a lack of validated questionnaires that measure these attitudes. The goal of our study was to construct and validate such a questionnaire and use it to assess the attitudes of Croatian scientists. We first developed a 21-item questionnaire called Attitudes towards Open data sharing, preprinting, and peer-review (ATOPP), which had a reliable four-factor structure and measured attitudes towards open data, preprint servers, open peer-review and open peer-review in small scientific communities. We then used the ATOPP to explore the attitudes of Croatian scientists (n = 541) towards these topics, and to assess the association of their attitudes with their open science practices and demographic information. Overall, Croatian scientists' attitudes towards these topics were generally neutral, with a median (Md) score of 3.3 out of a maximum of 5 on the scale. We found no gender (P = 0.995) or field differences (P = 0.523) in their attitudes. However, the attitudes of scientists who had previously engaged in open peer-review or preprinting were higher than those of scientists who had not (Md 3.5 vs. 3.3, P<0.001, and Md 3.6 vs. 3.3, P<0.001, respectively). Further research is needed to determine optimal ways of improving scientists' attitudes and increasing their open science practices.

Introduction

Open science, despite lacking a universally accepted definition, is widely recognized as a global phenomenon and an initiative emerging from the philosophical concept of scholarly "openness", with the principles and values of openness rooted in the idea of scientific knowledge being a common good [1]. The term open science was coined in 2001 by Recep Şentürk, who used it to refer to a democratic and pluralist culture of science. For Şentürk, open science indicated that different perspectives in science are considered equal, rather than alternative to each other: "If we desire to recognize the complexity of our world we must embrace multiplex ontology" [1]. His view, however, is different from today's relatively narrow view of open science, perceived as an "effort by researchers, governments, research funding agencies or the scientific community itself to make the primary outputs of publicly funded research results—publications and the research data—publicly accessible in digital format with no or minimal restriction" [2]. A recent systematic review summarized definitions of open science from 75 studies into "transparent and accessible knowledge that is shared and developed through collaborative networks" [3].

The open science movement has intensified since 2010, when it became clear that open access alone would not solve the problems of non-reproducibility of published studies and the inaccessibility of research data, study protocols, laboratory notes, software, or peer review reports. The movement therefore also serves as a reminder of the basic tenets of science, and encourages open science to take the forefront in scholarly discussions [4].

Practical considerations of open science often deal with methods to lower or remove technical, social, and cultural barriers and to enable public sharing of all aspects of research [5], which is believed to lead toward the betterment of science [6]. Those practical considerations are often described in various open science taxonomies and classifications, of which one of the most commonly used is FOSTER's graphical representation, which distinguishes six "first level" elements of open science: open access, open data, open reproducible research, open science evaluation, open science policies, and open science tools [7].

In our research, we focused on three of these elements: open data (open data use and reuse), open science tools (open repositories—preprint servers) and open science evaluation (open peer review).

Open data

Open data are data that can be used (with proper attribution) by anyone without technical or legal restrictions [2]. The Open Knowledge Foundation characterizes them by: i) availability and access; ii) reuse and re-distribution; and iii) universal participation [8]. Many statements and recommendations have been made to increase open data use and reuse [9]; among them, the International Committee of Medical Journal Editors (ICMJE) recommendations, followed today by more than 500 biomedical journals, have required a data sharing statement for clinical trials since July 2018 [10]. Research data are thought to be best preserved by being deposited in one of the many general or specific repositories existing today [11]. Although research and funding agencies often recognize the importance of data sharing, many technical and even psychological barriers to data sharing still exist [12].

While the number of studies on open data has risen greatly in the last few decades [8], data are still rarely shared across the sciences due to great differences between disciplines, debates on data ownership, lack of funding to support data sharing and data curation (i.e., preparation of data for sharing), as well as lack of incentives to reward it [9–15]. Recent estimates show that data sharing was mentioned in only 15% of biomedical [16] and only 2% of psychological articles [17].

Preprinting

Preprinting is an open science practice that allows the deposition and distribution of manuscripts (preprints) using an open science infrastructure (a thematic or general preprint server) before they are submitted to a journal and formally peer-reviewed [18–20]. While experiments with faster dissemination of research began in the 1960s, the first preprint servers (arXiv, SSRN and RePEc) emerged in the 1990s and allowed public sharing of authors' versions of manuscripts, i.e., preprints, before those manuscripts were peer reviewed and published in journals (or other venues, such as books or conference proceedings). However, it took a while for preprint servers to become the go-to place for researchers. For example, it took arXiv 8 years to become a major player in the dissemination of results in physics and mathematics [21]. Other scholarly fields have been even slower to adopt the preprint culture: bioRxiv, a preprint server dedicated to the biological sciences, originated in 2013; SocArXiv, a server for preprints in the social sciences, in 2016; and medRxiv, a server for clinical research preprints, in June of 2019 [19]. Discussion surrounding preprint servers also intensified after Chalmers and Glasziou estimated that 85% of research is wasted due to inadequate research design or methodology, poor reporting, publication bias, and lack of scholarly openness [22], with some viewing preprint servers as a way to address some of these issues. Today, there are more than 60 preprint servers in the world covering all scholarly fields [23], and the number of preprints is rising, fuelled additionally by the COVID-19 pandemic [24]. Preprints are seen as a step toward greater openness of science, and in 2019, Fu and Hughey estimated that manuscripts first published as preprints received 36% more citations and had a 49% higher Altmetric score [25]. An increasing number of journals and funders today encourage preprinting [26]. Furthermore, many scholarly engines have started indexing preprints, e.g. Europe PMC [20], Scopus [18], and Dimensions [27].

Open peer-review

Peer-review is a quality control mechanism for scholarly research and funding proposals. Traditionally, journal peer-review was most commonly blind (single, double or triple blind), and it was often criticised for being slow, expensive, subjective, unable to detect errors, unreliable, prone to bias and easily abused [28]. This led to a growing need for a more open peer-review process [29]. Open peer-review as a term, however, lacks a universal definition [7]. Most often it is used to describe one of the following practices: open identities of the authors and reviewers, open review reports published alongside the article, open interaction and discussion between author(s) and reviewers, or open platforms where review is facilitated by a different entity than the one where the paper is published [7, 24, 30]. In our study, we consider open peer review to be open (public) sharing of review reports (with or without reviewers' names) as part of journal or grant peer review processes. The practice and uptake of open peer review, however, have been low, with less than 1% of journals today practicing it [31], and, to the best of our knowledge, no estimates of its use by funders are available.

We are not aware of any studies that have analysed attitudes towards open data, preprinting and peer-review with a validated questionnaire. Our goal was therefore to construct and validate such a questionnaire and use it to report the attitudes of Croatian scientists towards open data, preprinting and open peer-review, as well as the association between their attitudes and their open science practices and demographic information.

Literature review

Attitudes measurement

Attitudes can be defined as an individual's positive, neutral or negative feelings (evaluative affect) about a certain behaviour or value [32, 33]. Attitudes are often measured with either one-item questions or with multi-item questionnaires (psychometric scales, whose answers are often summarized to create a scale or attitude score). While one-item questions can be a useful method for "snapshot measuring" [34], measuring attitudes with only a single question is generally not considered an optimal approach. On the other hand, creation of scales requires rigorous methodological approaches for questionnaire construction and validation [35–38]. This process typically includes item/question generation, face validity checks, testing of scale validity and reliability, and evaluation of responsiveness and scale interpretability [35, 39–41]. A known shortcoming of many attitude assessments is the difficulty of comparing research findings, as the same questions or scales are rarely used multiple times or for different populations, and differences between studies can turn out to be a consequence of different wording of questions, emphasizing the need for standardized questionnaires.

Attitudes towards open data, preprinting and open peer-review

We present below our literature review of studies analysing attitudes towards open data, preprinting and open peer-review.

Attitudes towards open data

In a recent (2020) systematic review, Zuiderwijk, Shinde and Jeng summarized the results of 32 quantitative and qualitative studies on open data, of which 15 were surveys [42]. They found that "scholars refer to personal drivers and a positive attitude toward data sharing as vital individual drivers for openly sharing research data" and that they see a negative attitude as an inhibitor of data sharing [42]. In the summarized studies, participants were mostly from the United States and Europe, and only a small number of studies covered multiple scientific disciplines. Most participants also had generally positive attitudes towards open data. An interesting finding was that in half of those studies (which assessed data sharing), there was no reference to the studies' own data availability [42].

An earlier 1988 study by Ceci described the attitudes of 790 researchers from three US universities using a case scenario approach followed by 3 (snapshot) questions, finding that researchers have a positive attitude towards data sharing, but acknowledging that those attitudes might have been influenced by socially desirable answering [43].

Two large studies of data sharing practices and barriers to data reuse were authored by Tenopir et al. (2011 and 2015) using a multi-question approach, but without validating an attitude scale or reporting its reliability [44, 45]. In the first study they surveyed approximately 1200 scientists, of whom 900 were followed up in the second study. Most scientists were from North America (68%), and from the fields of environmental sciences and ecology (32%). Their results showed an increase in data sharing attitudes over time, but also an increase in the number of perceived barriers to data sharing [45]. Building on their questions, Curty et al. [13] later validated a scale for measuring attitudes towards data reuse on a sample of 570 scientists. They tested construct validity and reported a three-factor structure for their scale: perceived efficiency of data reuse (5 items), perception of data re-use (2 items) and concern about trustworthiness of data (4 items), with subscale reliability scores (Cronbach's alpha) ranging from 0.73 to 0.81 [13].

Yoon and Kim, in 2017, constructed and validated a scale using structural equation modelling (a combination of factor analysis and multiple regression) on a sample of 292 social scientists. Their questionnaire had 20 items and 7 factors (with Cronbach's alpha values ranging from 0.76 to 0.97). They also concluded that attitude towards data reuse was a strong predictor of data reuse intention [46].

Zenk-Möltgen et al., in 2018, investigated the attitudes towards data sharing of 446 political scientists and sociologists using the theory of planned behaviour, but they did not report on their scale's validity or reliability. Overall, they found generally positive attitudes toward data and code sharing, and a strong association between previous sharing behaviour and intention to share [47].

Abele-Brehm et al., in 2019, investigated attitudes towards open data and data sharing of 337 psychological society members and reported a two-factor scale (positive expectations—10 items with a Cronbach's alpha of 0.90, and negative expectations—4 items with a Cronbach's alpha of 0.67). They found that respondents' attitudes were generally positive [48].

Finally, Zhu [12], in 2020, measured the attitude of UK researchers towards data reuse with one question ("How important do you think it is, in general, to make research data available online for reuse?"), and 1459 out of 1695 (86%) respondents found it to be very or fairly important.

Attitudes towards preprinting

Studies evaluating attitudes towards preprinting are very scarce. Zha, Li and Yan, in 2013, measured the attitudes of 260 participants from the natural and social sciences who had previously posted a preprint on a Chinese preprint server [49]. Their questionnaire had 25 questions with 7 factors (each construct had 2–5 items, with Cronbach's alpha values from 0.85 to 0.98 and very high correlations indicating unidimensionality), and they found overall positive attitudes toward preprinting. Yi and Huh [50] investigated the attitudes towards preprinting of 365 Korean authors and editors with 5 questions with a reliability of Cronbach's α = 0.86, but did not report on the construct validity. Overall, they reported positive attitudes of respondents [50].

Attitudes towards open peer-review

Twenty years ago, in 2001, a study by Melero and López-Santoveña found that 17% of 103 reviewers for the journal Food Science and Technology International expressed a preference for fully open peer review (by answering a single question: "What system are you in favour of? Open or blinded") [51]. Ten years after that, 28% (104 out of 364) of Danish general medical journal reviewers expressed their preference for open review (answering a single question: "Which peer review system do you prefer in the future?") [52]. One of the largest studies of attitudes towards open peer review was published by Ross-Hellauer, Deppe and Schmidt in 2017 [53], and although they used multiple questions, they did not report on their questionnaire's validity or reliability. In total they collected approximately 3000 responses, mostly from researchers in Europe (61%) and from science, technology and medical (STM) fields (90%). Overall, respondents reported generally positive attitudes towards open peer-review. In 2018, Segado-Boj, Martín-Quevedo and Prieto-Gutiérrez surveyed authors of Spanish journals (n = 295), mostly from the social sciences (63%), with 7 questions, but they did not report on the questionnaire's validity or reliability [54]. Overall, participants were found to be cautious towards open peer review. Lastly, in 2020, Besançon et al., using a small sample (N = 30) of researchers in the computer science field and eight questions, reported that more than half of the respondents were in favour of open peer review, but not of displaying reviewer names. They did not report the construct's validity or reliability [55].

Materials & methods

We conducted a cross-sectional study with psychometric validation of a questionnaire, which we named the Attitudes towards Open data sharing, preprinting, and peer-review (ATOPP) questionnaire.

Participants

In 2018, Croatia had 17,706 scientists [56]. In order to reach as many of them as we could, we sent invitations through two different channels: the mailing list of Croatian scientists (approximately 17,000 members) compiled by the Rudjer Boskovic Institute (Zagreb, Croatia), and the deans' secretaries of the University of Rijeka (the university of the first author, with 1,256 scientists).

Procedure

Participants were invited to complete an anonymous online questionnaire (through Google Forms). The survey was open from 12 May 2020 to 7 July 2020, and we sent two reminders 14 days apart.

Constructing the questionnaire.

The questionnaire was constructed as a result of three focus groups we held at the University of Rijeka in 2019 and 2020 with a total of 24 participants. The first focus group was held with participants from the Biomedical Sciences (N = 12), the second with participants from the Social Sciences (N = 7) and the last with participants from the Natural Sciences (N = 5). Participants were asked 5 questions: (1) What is open science to you? (2) What are your experiences with open access journals? (3) What do you think about the open peer-review process? (4) Do you use any of the open science tools? (5) What could influence you to provide access to your research/project data? The sessions were recorded and the transcripts used to generate the survey questions [57]. The questionnaire's face validity was then checked by us (the authors). This questionnaire had 73 questions, of which 45 were meant to assess attitudes towards open science, specifically open access (8 items), open peer-review (12 items), open data (10 items), preprints (9 items), and open science tools (6 items). It also had 20 questions on open science practices and 8 on demographic information. Answers to attitude statements were offered on a five-point Likert-type scale, where 1 indicated "strongly disagree;" 2 "disagree;" 3 "neither agree nor disagree;" 4 "agree;" and 5 "strongly agree." Open science practices questions were of mixed type (yes/no and multiple-choice questions). Demographic questions covered gender, age, scientific field, roles in science, and the total number of published papers.

Our initial exploration (factor analysis) of the 45 attitude questions showed that the questions on open access (8 items) and open science tools (6 items) explained less than 5% of the variance of the total score and were not internally consistent (Cronbach's alpha scores <0.65) [35]. We then re-examined them (face validity) and hypothesized that this was most likely because these two aspects of open science deal with concepts outside researchers' direct influence (i.e. they are built by other actors), while data sharing, open peer review, and self-archiving through preprints are under researchers' direct (self-)agency. The psychometric validation of the remaining questions (31 items) is presented in the Results.

Statistical analysis

Validation of the ATOPP questionnaire.

Construct validity of the scale was tested with exploratory factor analysis after the suitability of the item correlation matrix was checked with the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy and Bartlett's test of sphericity. In the exploratory factor analysis, we used Principal Axis Factoring (PAF) as the factor extraction method and Oblimin as the rotation method. We retained extracted factors with an eigenvalue >1 that explained more than 5% of the construct variance and passed visual inspection on the scree plot. Factor loadings <0.30 are not presented [35]. The factor analysis procedure uses the pattern of correlations between questionnaire items, which represent directly measured manifest variables, grouping them by the variance they share, which is captured by factors interpreted as latent dimensions, i.e., inferred constructs that are not directly measured. Consequently, each extracted factor or dimension is defined only by the questionnaire items to which it relates [35, 58]. Correlations of factors were calculated with Pearson's coefficient of correlation.
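For readers who want to reproduce this kind of workflow, the sketch below shows the same sequence of checks and extraction steps (KMO, Bartlett's test of sphericity, PAF with Oblimin rotation) using the Python factor_analyzer package. It is a minimal illustration rather than the study's analysis code (which was run in SPSS; see below), and the data file and DataFrame names are hypothetical placeholders.

```python
# Minimal sketch of the EFA workflow described above, using the Python
# factor_analyzer package; "atopp_items.csv" and items_df are hypothetical.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

items_df = pd.read_csv("atopp_items.csv")  # one column per Likert item

# Suitability of the item correlation matrix for factoring
chi_square, p_value = calculate_bartlett_sphericity(items_df)
kmo_per_item, kmo_total = calculate_kmo(items_df)
print(f"Bartlett chi2 = {chi_square:.1f}, P = {p_value:.4f}; KMO = {kmo_total:.2f}")

# Principal axis factoring with an oblique (Oblimin) rotation,
# retaining four factors as in the final model
fa = FactorAnalyzer(n_factors=4, method="principal", rotation="oblimin")
fa.fit(items_df)

# Eigenvalues for the scree plot criterion, and loadings with
# values below 0.30 suppressed, as in Table 1
print(fa.get_eigenvalues()[0])
loadings = pd.DataFrame(fa.loadings_, index=items_df.columns)
print(loadings.where(loadings.abs() >= 0.30))
```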

Internal consistency of the scale and subscales was determined with Cronbach's alpha.
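As a reference for this reliability statistic, Cronbach's alpha for a set of k items follows directly from its standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The short sketch below is an illustrative implementation (not the study's SPSS code), with the items DataFrame as a hypothetical placeholder.

```python
# Cronbach's alpha from its standard definition:
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """items: rows are respondents, columns are the items of one (sub)scale."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # per-item variance
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of summed score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
```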

Total score.

Before calculating the total score we reverse-coded 4 items: items 6 and 8 in Open data and items 10 and 11 in Open peer-review (S1 Appendix). The total score of the whole scale (and, analogously, of each factor) was constructed as a linear composite of its items divided by the number of items (21 for the whole scale), giving a score range from 1 to 5. Lower scores (<2.6) were considered a negative attitude, middle scores (2.6–3.39) a neutral attitude, and higher scores (>3.39) a positive attitude.
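This scoring rule can be summarized in a few lines of code. The sketch below is illustrative only; the item labels are hypothetical placeholders for the 4 reverse-coded items listed in S1 Appendix. On a 5-point Likert scale, reverse-coding maps a response x to 6 - x.

```python
# Illustrative ATOPP scoring; the item labels below are hypothetical
# placeholders for the 4 reverse-coded items listed in S1 Appendix.
import pandas as pd

REVERSE_CODED = ["open_data_6", "open_data_8", "open_pr_10", "open_pr_11"]

def atopp_total_score(responses: pd.DataFrame) -> pd.Series:
    """responses: one row per respondent, one column per 5-point Likert item."""
    df = responses.copy()
    df[REVERSE_CODED] = 6 - df[REVERSE_CODED]  # 1<->5, 2<->4, 3 stays 3
    return df.mean(axis=1)                     # linear composite / number of items

def interpret(score: float) -> str:
    if score < 2.6:
        return "negative"
    return "neutral" if score <= 3.39 else "positive"
```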

Analysis of answers based on the ATOPP survey. Qualitative data are presented as frequencies and relative frequencies. Comparisons of qualitative data were made with the χ2 test and the test of proportions.

Quantitative data are presented with the median and interquartile range [Md (IQR)], and the distribution was tested with the Kolmogorov-Smirnov test. Comparisons of quantitative data were made with non-parametric tests (Mann-Whitney or Kruskal-Wallis). The post-hoc test for the Kruskal-Wallis test was Dunn's test.
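For readers working outside SPSS or MedCalc, the same comparisons can be run in Python. The sketch below assumes a scored data set with hypothetical column names ("score", "gender", "field") and uses scipy together with the scikit-posthocs package for Dunn's test; it mirrors, rather than reproduces, the study's analysis.

```python
# Non-parametric group comparisons as described above; df and the
# column names "score", "gender" and "field" are hypothetical.
import pandas as pd
from scipy import stats
import scikit_posthocs as sp

df = pd.read_csv("atopp_scores.csv")  # hypothetical scored data set

# Two groups: Mann-Whitney U test
u_stat, p_mw = stats.mannwhitneyu(
    df.loc[df["gender"] == "F", "score"],
    df.loc[df["gender"] == "M", "score"],
)

# Three or more groups: Kruskal-Wallis, then Dunn's post-hoc test
groups = [g["score"].to_numpy() for _, g in df.groupby("field")]
h_stat, p_kw = stats.kruskal(*groups)
dunn_p_values = sp.posthoc_dunn(df, val_col="score", group_col="field")
print(p_mw, p_kw, dunn_p_values, sep="\n")
```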

For the purpose of the attitude analysis we merged Natural Sciences with Technical Sciences; Biomedicine and Health with Biotechnical Sciences; and Social Sciences with the Humanities and Interdisciplinary fields of science.

For statistical analysis we used two statistical packages: SPSS (IBM SPSS Statistics for Windows, version 25.0; IBM Corp, Armonk, NY) and MedCalc (MedCalc Software, Ostend, Belgium; version 16.0.3). P<0.05 was considered significant.

Sample size calculation. We based our calculation on the number of initial survey attitude questions (n = 45) and the rule of thumb that, for scale validation, it is sufficient to have 10 times more participants than items [39], i.e., at least 450 participants.

Ethics

The study was approved by the Ethics Committee of the University of Rijeka, Rijeka, Croatia (KLASA: 003-08/19-01/l; URBROJ: 217 0-24-04-3–19–7). The invitation letter included the informed consent form, which participants had to approve in the online form before starting the questionnaire.

Results

Validation of the ATOPP questionnaire

Thirty-one items related to open peer-review (12 items), open data (10 items) and preprinting (9 items) were entered into the exploratory factor analysis after the exclusion of 14 items related to open access and open science tools (see Methods above). Acceptability of the construct was then assessed by analysing floor and ceiling effects of the individual items on the score distribution, and no floor or ceiling effects were observed.

The Kaiser-Meyer-Olkin test (KMO = 0.79) and Bartlett's test of sphericity (P < 0.001) satisfied the conditions for principal axis factoring (PAF) of the 31 ATOPP questionnaire items. Inspection of the scree plot and the criteria of eigenvalues >1 and more than 5% of variance explained yielded 4 factors explaining 40% of the construct variance. We then repeated the factor analysis with the 22 items (S1 Fig) that had factor loadings higher than 0.30 [35]. The second PAF analysis was more suitable (KMO = 0.80; Bartlett's test P<0.001) and resulted in 4 factors (21 items): Open Data, Preprinting, Open Peer-review in Small Scientific Communities, and Open Peer-review, which accounted for 51% of the construct variance (Table 1).

Table 1. Attitudes towards open data, preprinting, and peer-review (ATOPP)—Reliability, factor loadings and median values.

https://doi.org/10.1371/journal.pone.0244529.t001

The structure matrix (correlations of each item with the extracted dimensions) is presented in S1 Appendix, Table 1, indicating that 21 items were left in the model with a simple factorial structure (each item's loading is distributed on one factor exclusively). The reliability of the whole scale was very good (Cronbach's alpha of 0.815).

Participants’ characteristics

We collected 546 responses, 196 (36%) from the University of Rijeka and 350 (64%) from the Rudjer Boskovic Institute list of Croatian scientists. There was no overlap between the respondents from the two sources, and 5 responses were not valid (not completed), leaving a total of 541 responses. The response rate for the University of Rijeka was 15.6%, and for the Croatian scientists list it was 2%. The factorial structure of the attitude scales was the same for both samples, so we present them together.

The median age of the participants was 45 (38 to 53), with similar percentages of males and females (43% vs 54%, P = 0.082). The majority of the respondents were from Biomedicine and Health (26%), Social (25%) or Natural Sciences (17%). They were most commonly Assistant Professors (29%), Full Professors (27%) or Associate Professors (19%). Most respondents (n = 529, 98%) had published at least one article, with a median of 23 (IQR 10–45). More than two thirds (n = 371, 69%) were also reviewers, 16% (n = 87) acted as reviewers for funding agencies, 11% (n = 62) were members of an editorial board, and 3% (n = 18) were editors. Detailed demographic and scholarly information on the respondents is presented in Table 2.

Open science practices

Respondents' open science practices are presented in Table 3. Around half (47%, n = 240) of the respondents had participated in open peer-review, and most of those were willing to sign the review reports (n = 225, 95%).

Table 3. Open peer review, open data and preprinting practices.

https://doi.org/10.1371/journal.pone.0244529.t003

Nearly half of the authors (46%, n = 249) had published a paper in a journal in which research data could be deposited, and nearly one third (29.9%, n = 162) had published an article based on public data from other researchers. Most respondents shared their data (as supplementary files) via journals (54%, n = 285). A minority of the respondents had posted a preprint (12%, n = 64), mostly on arXiv (n = 38), bioRxiv (n = 12) or SocArXiv (n = 4).

Attitudes towards open data, preprinting, and peer-review

The total score for all participants on the ATOPP scale was neutral, with a median of 3.3 (3.0–3.7). Neutral scores were also found for attitudes towards preprinting [3.0 (2.6–3.4)] and open peer review [3.2 (2.7–3.7)]. A negative attitude was found for open peer-review in small scientific communities [2.0 (1.0–3.0)] and a positive one for open data [3.9 (3.4–4.4)] (all P<0.05) (Table 4). Differences in attitudes were tested with regard to gender, field, open science practices and education (Table 4).

Table 4. Attitude towards open data, preprinting, and peer-review (ATOPP) of Croatian scientists (N = 541).

https://doi.org/10.1371/journal.pone.0244529.t004

We found no gender differences (all P>0.05) except for open peer-review in small scientific communities, where female respondents had a more negative attitude than male respondents [2.0 (1.0–3.0) vs 2.0 (2.0–3.0), P = 0.032].

We also found no differences in the overall ATOPP score between scientific fields (P = 0.523). However, attitudes toward open peer review in small scientific communities were higher in the Natural and Technical sciences than in the Social Sciences, Humanities and Interdisciplinary fields [2.5 (2.0–3.0) vs 2.0 (1.0–3.0), P = 0.002], while attitudes towards open peer review were higher in Biomedicine and Health and the Biotechnical sciences than in the Natural and Technical sciences [3.3 (2.8–3.8) vs 3.0 (2.3–3.8), P = 0.023].

Participants who had open peer review experience had a higher total ATOPP score (P<0.001), as well as higher scores for attitudes towards open data (P = 0.008) and open peer-review (P<0.001). Similarly, those who had previously shared their data had higher scores for Open data (P = 0.007), Preprinting (P = 0.005) and Open peer review in small scientific communities (P = 0.021). Participants with experience in preprinting had more positive attitudes on all subscales (all P<0.05) except Open peer review in small scientific communities (P = 0.140). Finally, participants who had education in open science had a more positive total ATOPP score than those who did not (P<0.001), and they also had higher scores for preprinting and open peer review.

Discussion

In this study we developed the ATOPP questionnaire for measuring attitudes toward open data, preprinting and open peer-review. Using the ATOPP questionnaire, we then explored Croatian scientists' attitudes towards those topics and the association of those attitudes with their open science practices and socio-demographic information. To the best of our knowledge, this is the first psychometrically validated (multiple-item) questionnaire measuring attitudes towards all three of these topics in a single instrument. The ATOPP scale, consisting of 21 items, demonstrated good internal consistency and validity. Because of its good psychometric characteristics and relatively small number of questions, we believe it offers a fast measurement that can be used to assess or monitor attitudes towards open science, and that it lends itself to cross-cultural validation.

During ATOPP development, attitudes towards open peer review in small scientific communities turned out to be a separate factor (subscale) from attitudes toward open peer review. This could be a product of both the fact that the Croatian scientific community has for centuries had a higher number of specialized journals per capita than its neighbouring countries, and the fact that in open peer review in small (national) fields or subfields reviewers are more likely to be direct competitors for funding or job positions [58]. Additionally, smaller communities may experience greater fear of the negative consequences of open peer review, i.e., fear of a potential (vindictive) backlash from colleagues if reviewers criticize their work or, through their review, negatively affect colleagues' funding or publication opportunities. These fears will likely remain until external evaluations or strong protective mechanisms are put into practice (although, for small communities, such approaches likely face significant language barriers and high costs).

Based on the ATOPP questionnaire, we found that Croatian scientists had generally neutral attitudes toward open science. Their most positive attitudes were towards open data, their attitudes towards preprinting and open peer-review were neutral, and those towards open peer review in smaller scientific communities were negative. We also found no gender or scholarly field differences in respondents' overall attitude scores. However, scientists who already had experience with open science practices, i.e., had shared data, provided open peer review reports, or posted preprints, had generally more positive attitudes than those who had not. More positive attitudes among those with experience in open science practices and previous open science education are in accordance with Bem's self-perception theory, which describes the effect of past behaviour on internal attitudes [59]. The influence of past behaviour on attitudes has since been confirmed by many researchers [60, 61]. We also found that participants who had taken open science courses had more positive ATOPP scale, preprinting and open peer-review scores, which is consistent with the model that positive attitudes are related to behavioural intention and behaviour [32].

The positive attitudes towards open data in our study were associated with the high prevalence of researchers in our sample (46%) who had shared their data in the past. By contrast, in the recent survey by Zhu (2020) of 1724 participants from various scholarly fields in the United Kingdom, fewer participants (21%) had deposited primary data in online repositories, although the majority (86%) had a very positive attitude towards data sharing [12]. In that survey, however, attitude was measured by only a single question: "How important do you think it is, in general, to make research data available online for reuse?".

The positive attitude towards open data in our study can also be compared with a recent survey among members of the German psychological society (N = 303). Abele-Brehm et al. constructed a scale measuring hopes (10 items, Cronbach's α = 0.90) and fears (4 items, Cronbach's α = 0.67) regarding data sharing. Respondents' positive expectations ("hopes") were neutral, but their experience with data sharing was not measured [48]. Yoon and Kim (2017) investigated data reuse behaviour by measuring beliefs, attitudes, and norms. Within their theoretical framework, attitude was a strong positive predictor of data reuse [46].

Attitudes of Croatian scientists towards preprinting were neutral in our study, except among those scientists who had preprinted in the past (12%). These results differ from those of participants in South Korea [50], China [49], and Latin America [62], whose attitudes were overall found to be positive; but those surveys included more editors and more respondents who had previously posted a preprint (China: all; Korea: 32% editors and 15% past preprint users; Latin America: 40% past preprint users), while our sample included only 12% of scientists who had posted a preprint and only 11% editorial board members [49, 50]. Additionally, China introduced a national preprint server, ChinaXiv, in 2006, and Latin America did so in 2020 (SciELO Preprints), which most likely further promoted the already strong open access culture in those countries (Croatia does not have a national preprint server). Croatian scientists' preprint use in our study is in line with a large analysis of bioRxiv preprints (n = 67,885, from 2013 to 2019), which found that senior authors of preprints are often researchers from the United States (39.2%) and the United Kingdom (10.5%), while Croatia was described as a "contributor country" whose authors were rarely in senior authorship positions [63]. More studies are, however, needed to determine the main factors that drive researchers to start preprinting manuscripts or project proposals (protocols), as well as to invite or wait for public comments before submitting a preprint for scholarly journal peer review. Additionally, a previous survey has shown that scientists' choices about posting a preprint are influenced by the policies of the journals in which they plan to publish those studies [62, 64]. Although Croatia has approximately 400 active scholarly journals [65], of which less than half are indexed in WoS or Scopus [66], preprint policies are listed for only 24 on the Sherpa website, so further research is needed to determine the influence of Croatia's editorial, funder and publishing milieu on the preprint attitudes of its researchers.

In our study, Croatian scientists' overall attitudes towards open peer review were neutral. We also found differences among scientific fields, with scientists from Biomedicine and Health and the Biotechnical sciences having a higher (albeit still neutral) attitude score, as did those with previous experience of open peer review. In Ross-Hellauer, Deppe and Schmidt's 2017 study of 3062 participants, most thought that open peer-review should be common practice, with the strongest support coming from social science researchers (e.g. economics, psychology and philosophy). Also, the majority of researchers agreed that obligatory signing of review reports is strongly associated with rejecting peer review requests [53]. That study, however, did not report on the validity and reliability of its questionnaire.

Croatian scientists' attitudes towards open peer-review in small scientific communities were low, even lower than their attitudes towards open peer-review in general, indicating that the Croatian scientific community might not yet be ready for this aspect of open science. Female scientists also had a lower attitude score than male scientists, which could be the result of gender imbalance in academia and a greater fear of retaliation or promotion obstruction [67]. Slightly higher attitudes towards open peer-review in small scientific communities were found among scientists who had shared data before, had preprinted, or were from the natural and technical sciences, the fields that have been sharing preprints the longest.

Despite presenting the first psychometrically validated scale for measuring attitudes towards open data, preprinting and open peer-review, our study is not without limitations. As with all questionnaires, our data are based on self-declared attitudes and open science practices and do not capture independently confirmed practices. Furthermore, as in many recent online surveys, our response rates were low, and they might also have been influenced by the fact that the questionnaire was sent during the early months of the COVID-19 pandemic. Additionally, we might have captured the opinions only of those interested in these topics, which, if true, could mean that the attitudes of a representative sample of Croatian scientists would be even lower. While we did provide definitions of open science practices in our questionnaire, most of our respondents (78%) had no education in open science, so it is possible that some held different ideas of those practices. Finally, while our study showed a strong association between open science attitudes and previous open science practices, and in that way lends further credibility to the ATOPP questionnaire, it was cross-sectional and was not designed to examine the possible consequences of these attitudes for the promotion and implementation of open science practices. Further interventional research is needed to identify the most efficient interventions for improving researchers' attitudes toward open science practices, and to establish whether those interventions lead to greater uptake of such practices [68]. Furthermore, with the recent changes in EU funding schemes for the period 2021 to 2027 and their requirements for open peer review and data sharing [69], the journal eLife's announcement that it will only accept submissions that have been posted as preprints [70], and dedicated calls for research into ways to increase the open science practices of those who have not yet embraced them [71], it will be interesting to compare whether approaches that reward open science practices and approaches that mandate them have different impacts on researchers' attitudes, and, ultimately, which of those approaches is more effective in inducing change in the wider scholarly community.

Conclusions

In conclusion, our study presents the validation of a multi-item questionnaire for measuring open science attitudes, specifically towards open data, preprinting and open peer review. Using the questionnaire, we found that the attitudes of Croatian researchers towards these topics were neutral, and that more positive attitudes were found among those who had participated in open science practices before or had education in open science. Further studies are needed to assess researchers' attitudes on these topics in other countries, as well as to track changes in these attitudes over time. With more and more funders and institutions encouraging or mandating open science practices, we believe that validated tools such as this one could help assess and monitor researchers' attitudes and their associations with open science practices.

Supporting information

S1 Appendix. Factor analysis of the attitudes and practices of open data, preprinting, and peer-review—a cross sectional study on Croatian scientists.

https://doi.org/10.1371/journal.pone.0244529.s001

(PDF)

S1 Fig. Scree plot of the factor analysis (22 items) of attitudes towards open data, preprinting, and peer-review (ATOPP).

https://doi.org/10.1371/journal.pone.0244529.s003

(TIF)

References

  1. Şentürk R. Toward an open science and society: multiplex relations in language, religion and society -revisiting Ottoman culture-. İslam Araştırmaları Derg. 2011;(6):93–129.
  2. OECD. Making Open Science a Reality. 2015;(25):1–108. Available from: http://dx.doi.org/10.1787/5jrs2f963zs1-en.
  3. Vicente-Saez R, Martinez-Fuentes C. Open Science now: A systematic literature review for an integrated definition. J Bus Res. 2018;88:428–36.
  4. Tennant J. Do we need an Open Science coalition? Elephant in the Lab [Internet]. 2018. http://elephantinthelab.org/do-we-need-an-open-science-coalition/.
  5. Brown CT. Living in an Ivory Basement: Stochastic Thoughts on Science, Testing, and Programming [Internet]. 2016. http://ivory.idyll.org/blog/2016-what-is-open-science.html.
  6. Tennant J, Agarwal R, Baždarić K, Brassard D, Crick T, Dunleavy D, et al. A tale of two "opens": intersections between Free and Open Source Software and Open Scholarship [Preprint]. 2020. https://osf.io/preprints/socarxiv/2kxq8/.
  7. Pontika N, Knoth P, Cancellieri M, Pearce S. Fostering Open Science to Research Using a Taxonomy and an eLearning Portal. In: Proceedings of the 15th International Conference on Knowledge Technologies and Data-Driven Business (i-KNOW '15). New York, NY, USA: Association for Computing Machinery; 2015. https://doi.org/10.1145/2809563.2809571.
  8. Open Knowledge Foundation. What is Open? [Internet]. 2020. https://okfn.org/opendata/.
  9. Lammey R. Data sharing and data citation: Join the movement! Eur Sci Ed. 2019;45(3):58–9.
  10. ICMJE. Data sharing [Internet]. http://www.icmje.org/recommendations/browse/publishing-and-editorial-issues/clinical-trial-registration.html.
  11. Nature. Recommended Data Repositories [Internet]. https://www.nature.com/sdata/policies/repositories.
  12. Zhu Y. Open-access policy and data-sharing practice in UK academia. J Inf Sci. 2020;46(1):41–52.
  13. Curty RG, Crowston K, Specht A, Grant BW, Dalton ED. Attitudes and norms affecting scientists' data reuse. PLoS One. 2017;12(12):1–22. pmid:29281658
  14. Zhang Y, Hua W, Yuan S. Mapping the scientific research on open data: A bibliometric review. Learn Publ. 2018;31(2):95–106.
  15. Berenbaum MR. On Mr. Hyslop's prediction, content archives, and preprint servers. Proc Natl Acad Sci U S A. 2020;117(17):9131–4. pmid:32284425
  16. Serghiou S, Contopoulos-Ioannidis DG, Boyack KW, Riedel N, Wallach JD, Ioannidis JPA. Assessment of transparency indicators across the biomedical literature: How open is open? bioRxiv [Internet]. 2020;1–26. http://dx.doi.org/10.1371/journal.pbio.3001107.
  17. Hardwicke TE, Thibault RT, Kosie JE, Wallach JD, Kidwell MC, Ioannidis JPA. Estimating the Prevalence of Transparency and Reproducibility-Related Research Practices in Psychology (2014–2017). Perspect Psychol Sci. 2021.
  18. McCullough R. Preprints are now in Scopus! [Internet]. Scopus blog. 2021. https://blog.scopus.com/posts/preprints-are-now-in-scopus.
  19. Hoy MB. Rise of the Rxivs: How Preprint Servers are Changing the Publishing Process. Med Ref Serv Q. 2020;39(1):84–9. Available from: https://doi.org/10.1080/02763869.2020.1704597. pmid:32069196
  20. Europe PMC. Preprints in Europe PMC [Internet]. https://europepmc.org/Preprints.
  21. Ginsparg P. Preprint Déjà Vu. EMBO J. 2016;35(24):2620–5. pmid:27760783
  22. Chalmers I, Glasziou P. Avoidable waste in the production and reporting of research evidence. Lancet. 2009;374(9683):86–9. pmid:19525005
  23. Malički M, Jerončić A, ter Riet G, Bouter LM, Ioannidis J, Goodman S, et al. Preprint Servers' Policies, Submission Requirements, and Transparency in Reporting and Research Integrity Recommendations. JAMA. 2020;324(18):1901–3. pmid:33170231
  24. Fraser N, Brierley L, Dey G, Polka JK, Pálfy M, Nanni F. Preprinting the COVID-19 pandemic. 2020. https://www.biorxiv.org/content/10.1101/2020.05.22.111294v2.
  25. Fu DY, Hughey JJ. Releasing a preprint is associated with more attention and citations for the peer-reviewed article. Elife. 2019;8:1–12.
  26. Sherpa Romeo [Internet]. https://v2.sherpa.ac.uk/romeo/.
  27. Aggregated—Source Titles for Publication Type: Preprint in Publications—Dimensions [Internet]. [cited 2021 Mar 18]. https://app.dimensions.ai/analytics/publication/source_title/aggregated?or_facet_publication_type=preprint.
  28. Smith R. Peer review: A flawed process at the heart of science and journals. J R Soc Med. 2006;99(4):178–82. pmid:16574968
  29. Mulligan A, Hall L, Raphael E. Peer Review in a Changing World: An International Study Measuring the Attitudes of Researchers. J Am Soc Inf Sci Technol. 2012;64:132–61. Available from: http://onlinelibrary.wiley.com/doi/10.1002/asi.22883/abstract.
  30. Ross-Hellauer T. What is open peer review? A systematic review [version 2; peer review: 4 approved]. F1000Research. 2017;6(588). pmid:28580134
  31. Responsible Journals—Database—Statistics [Internet]. [cited 2021 Mar 18]. https://www.responsiblejournals.org/database/statistics.
  32. Ajzen I, Fishbein M. The influence of attitudes on behaviour. In: Handbook of attitudes and attitude change: basic principles. Mahwah (NJ, US): Erlbaum; 2005. p. 173–221.
  33. Gasper K, Spencer LA, Hu D. Does Neutral Affect Exist? How Challenging Three Beliefs About Neutral Affect Can Advance Affective Research. Front Psychol. 2019;10. pmid:31787911
  34. Bowling A. Just one question: If one question works, why ask several? J Epidemiol Community Health. 2005;59(5):342–5. pmid:15831678
  35. Spector P. Summated Rating Scale Construction. Sage; 1992.
  36. Lee SH. Constructing Effective Questionnaires. In: Handbook of Human Performance Technology. 2001. p. 760–79. http://www.davidlewisphd.com/courses/EDD8006/fall11/2006-Lee.pdf.
  37. Bazdaric K. Questionnaire structure—how much do editors need to know? Eur Sci Ed. 2018;44(4):74–5.
  38. Boateng GO, Neilands TB, Frongillo EA, Melgar-Quiñonez HR, Young SL. Best Practices for Developing and Validating Scales for Health, Social, and Behavioral Research: A Primer. Front Public Health. 2018;6:149. Available from: https://www.frontiersin.org/articles/10.3389/fpubh.2018.00149/full.
  39. Costello AB, Osborne JW. Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Pract Assess Res Eval. 2005;10(7):1–9.
  40. Rowley J. Designing and using research questionnaires. Manag Res Rev. 2014;37(3):308–30.
  41. Hair JF, Gabriel MLDS, da Silva D, Braga Junior S. Development and validation of attitudes measurement scales: fundamental and practical aspects. RAUSP Manag J. 2019;54(4):490–507.
  42. Zuiderwijk A, Shinde R, Jeng W. What drives and inhibits researchers to share and use open research data? A systematic literature review to analyze factors influencing open research data adoption. PLoS ONE. 2020;15:1–49. Available from: http://dx.doi.org/10.1371/journal.pone.0239283.
  43. Ceci SJ. Scientists' attitudes toward data sharing. Sci Technol Hum Values. 1988;13(1):45–52.
  44. Tenopir C, Allard S, Douglass K, Aydinoglu AU, Wu L, Read E, et al. Data sharing by scientists: Practices and perceptions. PLoS One. 2011;6(6):1–21. Available from: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0021101. pmid:21738610
  45. Tenopir C, Dalton ED, Allard S, Frame M, Pjesivac I, Birch B, et al. Changes in Data Sharing and Data Reuse Practices and Perceptions among Scientists Worldwide. PLoS One. 2015;1–24. pmid:26308551
  46. Yoon A, Kim Y. Social scientists' data reuse behaviors: Exploring the roles of attitudinal beliefs, attitudes, norms, and data repositories. Libr Inf Sci Res. 2017;39(3):224–33. Available from: http://dx.doi.org/10.1016/j.lisr.2017.07.008.
  47. Zenk-Möltgen W, Akdeniz E, Katsanidou A, Naßhoven V, Balaban E. Factors influencing the data sharing behavior of researchers in sociology and political science. J Doc. 2018;74(5):1053–73.
  48. Abele-Brehm AE, Gollwitzer M, Steinberg U, Schönbrodt FD. Attitudes Toward Open Science and Public Data Sharing: A Survey among Members of the German Psychological Society. Soc Psychol (Gott). 2019;50(4):252–60.
  49. Zha X, Li J, Yan Y. Understanding preprint sharing on Sciencepaper Online from the perspectives of motivation and trust. Inf Dev. 2013;29(1):81–95.
  50. Yi HJ, Huh S. Korean editors' and researchers' experiences with preprints and attitudes towards preprint policies. Sci Ed. 2021;8(1):4–9.
  51. Melero R, López-Santoveña F. Referees' Attitudes toward Open Peer Review and Electronic Transmission of Papers. Food Sci Technol Int. 2001;7(6):521–7.
  52. Vinther S, Nielsen OH, Rosenberg J, Keiding N, Schroeder TV. Same review quality in open versus blinded peer review in "Ugeskrift for Læger". Dan Med J. 2012;59(8). pmid:22849979
  53. Ross-Hellauer T, Deppe A, Schmidt B. Survey on open peer review: Attitudes and experience amongst editors, authors and reviewers. PLoS One. 2017;12(12):1–28. pmid:29236721
  54. Segado-Boj F, Martín-Quevedo J, Prieto-Gutiérrez JJ. Attitudes toward open access, open peer review, and altmetrics among contributors to Spanish scholarly journals. J Sch Publ. 2018;50(1):48–70.
  55. Besançon L, Rönnberg N, Löwgren J, Tennant JP, Cooper M. Open up: a survey on open and non-anonymized peer reviewing. Res Integr Peer Rev. 2020;5(1):1–10. pmid:32607252
  56. Croatian Bureau of Statistics. Higher Education, 2018—Statistical Reports [Internet]. 2018. https://www.dzs.hr/Hrv_Eng/publication/2019/SI-1644.pdf.
  57. Breen RL. A practical guide to focus-group research. 2007;30(3):463–75. Available from: https://www.tandfonline.com/doi/full/10.1080/03098260600927575.
  58. Sambunjak D, Ivaniš A, Marušić A, Marušić M. Representation of journals from five neighboring European countries in the Journal Citation Reports. Scientometrics. 2008;76(2):261–71.
  59. Bem DJ. Self-Perception: An Alternative Interpretation of Cognitive Dissonance Phenomena. Psychol Rev. 1967;74(3):183–200. pmid:5342882
  60. Albarracín D, Wyer RS Jr. The cognitive impact of past behavior: Influences on beliefs, attitudes, and future behavioral decisions. J Pers Soc Psychol. 2000;79(1):5–22. pmid:10909874
  61. Olson JM, Stone J. The influence of behavior on attitudes. In: The Handbook of Attitudes. New York: Psychology Press; 2014. p. 223–72.
  62. Funk K, Meadows A, Mendonça A, Rieger O, Swaminathan S. Preprint authors optimistic about benefits: preliminary results from the #bioPreprints2020 survey [Internet]. https://asapbio.org/biopreprints2020-survey-initial-results?fbclid=IwAR07rFL9o43Aj8iBT-s005A-hIPs4Zn61naFPk9eJ5nTyf9-8Mk7SZ_rCxo.
  63. Abdill RJ, Adamowicz EM, Blekhman R. International authorship and collaboration across bioRxiv preprints. Elife. 2020;9:1–17. pmid:32716295
  64. Teixeira da Silva JA, Dobránszki J. Preprint policies among 14 academic publishers. J Acad Librariansh. 2019;45(2):162–70. Available from: https://doi.org/10.1016/j.acalib.2019.02.009.
  65. HRČAK. Alphabetical journals list [Internet]. [cited 2021 Mar 18]. https://hrcak.srce.hr/index.php?show=casopisi_abecedno&status=1&lang=en.
  66. Utrobičić A, Šimić J, Malički M, Marušić M, Marušić A. Composition of editorial boards and peer review policies of Croatian journals indexed in Web of Science and Scopus. Eur Sci Ed. 2014;40(2):31–3.
  67. European Institute for Gender Equality (EIGE). Gender Equality in Academia and Research: GEAR tool [Internet]. 2016. http://eige.europa.eu/gender-mainstreaming.
  68. Norris E, O'Connor DB. Science as behaviour: Using a behaviour change approach to increase uptake of open science. Psychol Health. 2019;34:1397–406. pmid:31661325
  69. European Commission. How it Works | Open Research Europe [Internet]. [cited 2021 Apr 17]. https://open-research-europe.ec.europa.eu/about.
  70. eLife. eLife shifting to exclusively reviewing preprints [Internet]. [cited 2021 Apr 17]. https://elifesciences.org/for-the-press/a4dc2f54/elife-shifting-to-exclusively-reviewing-preprints.
  71. Wellcome Trust. Open Research Fund—Grant Funding | Wellcome [Internet]. 2021 [cited 2021 Apr 17]. https://wellcome.org/grant-funding/schemes/open-research-fund.