
Ten simple rules on how to write a standard operating procedure

Abstract

Research publications and data should nowadays be publicly available on the internet and, in principle, usable by everyone to develop further research, products, or services. The long-term accessibility of research data is, therefore, fundamental in the economy of the research production process. However, the availability of data alone is not sufficient; their quality must also be verifiable. Measures to ensure reuse and reproducibility need to cover the entire research life cycle, from the experimental design to the generation of data, quality control, statistical analysis, interpretation, and validation of the results. Hence, high-quality records, particularly those providing a string of documents for the verifiable origin of data, are essential elements that can act as a certificate for potential users (customers). These records also improve the traceability and transparency of data and processes, thereby improving the reliability of results. Standards for data acquisition, analysis, and documentation have been fostered in the last decade, driven by grassroots initiatives of researchers and organizations such as the Research Data Alliance (RDA). Nevertheless, what is still largely missing in academic life science research are agreed procedures for complex routine research workflows. Here, well-crafted documentation such as standard operating procedures (SOPs) offers clear directions and instructions specifically designed to avoid deviations, an absolute necessity for reproducibility.

Therefore, this paper provides a standardized workflow that explains step by step how to write an SOP to be used as a starting point for appropriate research documentation.

Introduction

Nowadays, digital technologies are integral to how knowledge is produced and shared and how science is organized. Data availability is a critical feature of an efficient, progressive, and, ultimately, self-correcting scientific ecosystem that generates credible findings, and it has become a relevant element of scientific integrity [1]. However, the anticipated benefits of sharing are achieved only if data are of reliable quality and reusable [2,3]. Despite this need, it has been shown that fewer than one-third of (biomedical) papers can generally be reproduced [4,5]. Further studies showed that suboptimal data curation, unclear analysis specification, and reporting errors can impede reuse and analytic reproducibility, undermining the utility of data sharing and the credibility of scientific findings [6–9]. Standard operating procedures (SOPs) for industrial processes to achieve efficiency, quality, and uniformity of performance have existed for a long time. SOPs ensure that the user operates following consistent processes that meet best practice standards. Moreover, the use of SOPs ensures that processes are reviewed and updated regularly and that researchers inside and outside the same group or institute are enabled to reproduce or reuse results, whether to enlarge the study or for other studies. Despite their importance, the need for standards in life science research has emerged as crucial for the quality and reproducibility of research findings only in recent years [9]. The question of data quality and reproducibility poses new scientific and societal challenges for individual researchers, universities, scientific organizations, infrastructure facilities, funders, and the broader society [10]. Around 2016, the scientific community started to talk about a crisis of reproducibility and its dramatic impact on the economy and credibility of the research system [9,11]. Community-driven initiatives such as the Research Data Alliance (RDA) [12] and the European Commission prompted a series of initiatives in support of the scientific community, addressing the need for new tools and strategies to improve the harmonization of standardization initiatives [13,14] and the implementation of standards in daily research work to improve research quality and data reuse. Since then, many steps forward have been made in different fields of biological research [15–20].

There are recommendations providing guidelines for maintaining reproducible results by applying just a few simple rules. These rules include the use of error annotations for produced data, which are valuable for evaluating the impact and credibility of individual data points associated with the generated results. Furthermore, it is essential to use annexes when research project results are published. Annexes aid the traceability of findings and the reproducibility of performed experiments and can be linked to the aforementioned error classifications. Input files, along with information on the applied software versions, fit perfectly into annexes and play a pivotal role in reproducing results. Finally, adopting these simple rules in routine research practice within an SOP format aids the transparency of results and exerts a decisive impact on scientific reproducibility [21]. Against this background, the implementation of a minimal quality assurance (QA) system as a systematic approach to reviewing practices and procedures is inherently logical [22]. QA systems enable users to identify possible improvements and errors and provide a mechanism for acting on them, for example, by developing and deploying a failure mode and effects analysis (FMEA) [23]. The basis of each quality system is a high-quality record providing a string of documents for the verifiable origin and quality of data. In addition, the general documentation improves the traceability and transparency of research findings to prove the reliability of results. Such quality control systems should be based on and in line with good laboratory practices (GLP), well-defined and validated protocols, and comprehensive SOPs [24,25]. The advantages of implementing SOPs in the daily workflow of academic researchers might not be immediately obvious to everyone. At first, it may seem to be unnecessary and avoidable extra work. Indeed, without appropriate training, the setup of an SOP is time-consuming and does not appear to be a relevant asset. However, because each SOP describes only one procedure and not a series of complex procedures, the effort remains feasible. For this reason, we provide you here with “10 Simple Rules on How To Write an SOP” that will enable you to produce a reliable and verifiable set of your research data.

Results

The Ten Rules

Fig 1 shows the workflow of SOP writing, from its preparation, validation, and approval to its implementation and follow-up processes, which are detailed in the following 10 rules.

Fig 1. SOP workflow.

Workflow of SOP development, its implementation, and monitoring.

https://doi.org/10.1371/journal.pcbi.1008095.g001

Rule 1: Knowing when to write an SOP

SOPs are always needed when critical processes or workflows need to be repeated in a reproducible way or when defined procedures are required by compliance guidelines. In other words, SOPs are vital instruments for maintaining consistent quality. Hence, every research institution that faces such processes or compliance requirements, or has a demand for quality, should be encouraged to develop new SOPs or follow existing ones. Crosscheck within your institution for templates; there is no need to reinvent the wheel. Often, templates already exist in your institution, but their existence might not be well communicated. Contact your data or quality manager for information. There is also often a shared team server or cloud containing all internal SOPs, accessible to each appropriate member or user. If your institute or department does not have official templates, ask other groups if they already have SOPs in use. If so, your colleagues should be happy to share their knowledge with you. If there is no template available, you have a pioneering role. In this case, you can run a quick online search for SOP templates to be used as a starting point. It is wise to involve the technicians of your lab as well as the leader of your group or department and the quality manager and/or data manager of your institution. The initial step is then to draft a cover sheet and discuss the content of the SOP with the relevant stakeholders (reviewers and approvers) of your institution (Rules 3 and 6). Depending on the peculiarities of your research, you can choose an appropriate SOP format or develop your own. The resulting unique template can then be further used by all members of your group, department, or institution and by your wider community of practice.

Rule 2: Write the introduction: describe the purpose (the why)

An essential aspect of any SOP is the introduction and the statement of its purpose. The introduction section specifies the need for and applicability of the procedure within the research environment in which the process is being established. Identify the specific reason you decided to write your SOP. Which specific process do you address? Which specific procedures will be covered, and which will not? Clarifying the specific focus of your document will facilitate the use of your SOP by your colleagues. Richie Norton writes, “Simplicity is complex. It's never simple to keep things simple. Simple solutions require the most advanced thinking” [26]. The way you frame the information describing the process matters. Consider that less is more: trying to tackle too much at once will mostly lead to confusion. Represent the complex aspects of a process as simply as possible so that it is understandable even for users who are not from your field.

Rule 3: Set up the document structure

Every single SOP should consist of three sections:

  1. The cover page
  2. The sequence of steps or tasks (metadata) for the given procedure (Rule 4)
  3. A list of references and definitions (Rule 5)

The cover page

The cover page represents a control block used to house the document control information required for configuration management and compliance standards.

It should contain the following information (Fig 1); a minimal machine-readable sketch of these fields follows the list:

  • administrative information about the institution and/or department
  • a title that clearly identifies the activity or procedure
  • an SOP identifier (ID) number (or versioning) with its category (Table 1)
  • page number and total number of pages of the SOP
  • the date of issue (or of versioning)
  • possible safety instructions
  • the names of individuals who prepared and approved the SOP
  • the name of the reviewers, including the date of the review
  • a description of the purpose and field of application
  • the name and function of the author
  • the name and function of the approver

There are standardized abbreviations to describe elements of the respective SOP on the cover page. Table 2 provides a list of typical abbreviations for an SOP category.

Table 2. List of abbreviations used for allocation of the SOP’s category.

https://doi.org/10.1371/journal.pcbi.1008095.t002

It is recommended to use a digital object identifier (DOI) for the identification of the SOP in accordance with ISO 26324:2012 [27].

Each following page should have a header and/or footer mentioning (Table 3):

  • administrative information about the institution and/or department
  • short title of the SOP
  • page number and total number of pages of the SOP
  • date of approval and/or version number

A version number can be assigned in accordance with the semantic versioning recommendations [28], as used, for example, in source control platforms such as GitHub [29].

The presentation of dates, including the date of issue, versioning, review, and approval, should be in accordance with the format specified in ISO 8601 [30].
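
To illustrate how these two conventions fit together, the following standard-library Python sketch (the helper bump_version is hypothetical) increments one part of a MAJOR.MINOR.PATCH version string and records the associated date in ISO 8601 format:

    from datetime import date

    def bump_version(version: str, part: str = "minor") -> str:
        """Increase one part of a MAJOR.MINOR.PATCH (semver-style) version string."""
        major, minor, patch = (int(x) for x in version.split("."))
        if part == "major":
            major, minor, patch = major + 1, 0, 0
        elif part == "minor":
            minor, patch = minor + 1, 0
        else:
            patch += 1
        return f"{major}.{minor}.{patch}"

    # A revision after review: bump the minor version, record the date in ISO 8601.
    new_version = bump_version("1.0.0", "minor")    # -> "1.1.0"
    review_date = date.today().isoformat()          # e.g., "2020-04-07"
    print(new_version, review_date)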

Table 3. Makeup template for the header of SOP instruction pages.

https://doi.org/10.1371/journal.pcbi.1008095.t003

The final step in creating your SOP template should involve a note on the styles, fonts, and margins that you intend to use. You can download a full template SOP at Zenodo as a writable PDF [31].

Rule 4: Fill in the content

The procedure.

Start at page 2 to compile the metadata. In this step, you describe the activities and the sequence of steps or tasks for the given procedure. Always consider the aim of the SOP; this will help you to focus on the specific procedure you describe. However, an SOP may refer to other SOPs, provided that each one is cited by its number and full information is provided. Every user should be able to understand your work instructions. As George Orwell said, "Good [writing] is like a windowpane" [32]. Consider the knowledge and skills of potential users when choosing the level of detail for the process description. Long preambles should be avoided. Work instructions should follow a single style and strictly adhere to a stepwise process. Balance the level of detail and avoid unnecessary specifications (e.g., “blue-cap tubes”) and alternatives; if alternatives are necessary, explain what dictates which action. Ensure consistency in terminology, layout, media, and method and, as much as possible, avoid polysyllabic words, complex sentences, jargon, acronyms, or too many terms (without explaining them). To facilitate the handling of different data types and formats within the same workflow, the impact of such diversity on the usability of data and metadata should be minimized. Consider the different work cultures and circumstances of the people performing the work, and describe both what to do and how to do it. To optimize the structure of your SOP:

  • break the process into sections
  • break the sections into specific steps
  • number the steps or add bullet points for clarity.

Describe each task in detail, including timeframes and tools required to complete assignments and achieve expected outcomes.
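
As a minimal illustration of this sectioned, stepwise structure, the following Python sketch (section names and steps are invented examples) stores a procedure as sections of ordered steps and prints them as numbered work instructions:

    # Illustrative only: a procedure as an ordered mapping of sections to step lists.
    procedure = {
        "Sample preparation": [
            "Label one 1.5 mL tube per sample.",
            "Keep samples on ice (timeframe: max. 30 min).",
        ],
        "Extraction": [
            "Add 500 µL lysis buffer and vortex for 10 s.",
            "Centrifuge at 12,000 g for 5 min at 4 °C.",
        ],
    }

    for section, steps in procedure.items():
        print(section)
        for number, step in enumerate(steps, start=1):
            print(f"  {number}. {step}")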

Rule 5: References and definitions: Specify tools required for the task

The workflow elements may involve different types of resources and tools. Depending on the particular task, the selection of tools can comprise modeling and simulation tools, data repositories, and compiler construction tools. Do not forget to add relevant references. Reference materials can include normative documents, instructions, and standards as well as research papers, graphical material, photographs, and even other SOPs. Provide definitions of all terms and abbreviations that are used in the procedure, and harmonize and align them with the standard terminologies used within your field. These should be provided as an annex.
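
A definitions-and-references annex can likewise be kept as a small structured record and rendered automatically; the entries in the following Python sketch are illustrative only:

    # Illustrative glossary and reference entries for the annex.
    definitions = {
        "SOP": "Standard operating procedure",
        "DOI": "Digital object identifier (ISO 26324)",
        "QA": "Quality assurance",
    }
    references = [
        "ISO 8601-1:2019 Date and time -- Representations for information interchange",
        "SOP-M-001: General laboratory safety (internal)",
    ]

    print("Annex A: Definitions")
    for abbreviation, meaning in sorted(definitions.items()):
        print(f"  {abbreviation}: {meaning}")
    print("Annex B: References")
    for reference in references:
        print(f"  - {reference}")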

Rule 6: Set up responsibilities and nominate reviewers and approvers

Reviewers should be nominated and made responsible for each particular task. These persons should have appropriate scientific knowledge and expertise as well as experience in the field. Usually, the initial SOP author(s) is/are responsible for monitoring and reviewing the SOP. They will ensure that the SOP reflects the tasks described in the document. Open discussion of controversial aspects related to the SOP should be allowed among reviewers to guarantee a proper revision. Changes to SOPs should preferably be made by the creator. Each change again needs to be reviewed and approved by the responsible persons and accompanied by an increase of the revision number (Rule 8). The team of reviewers should agree on a timeframe for monitoring the SOP and keeping it aligned with the current state of the art.

Rule 7: Test with a colleague: Perform training

Congratulations! By reaching this step, you have written your first SOP. Now it is time to check whether the SOP is clear and understandable for all potential users. Hence, ask a colleague (a researcher or a technician) to read the SOP and, if feasible, execute a test run. Ask the test person to be constructive and critical, and wait for his or her feedback or questions. Consider that the test person should be able to run the experiment without any support. Do not interfere while testing, in order to receive authentic feedback and to avoid falsifying the result.

Once the SOP is written, it is essential that all staff are appropriately trained in and familiarized with the use of the new SOP. This can easily be integrated within the annual safety instructions or as part of a running seminar series. Retraining should be conducted regularly but also whenever an SOP changes. Ensure that attendance at training is documented.

Rule 8: Review and approve

Once your SOP is tested, make it available to the team of reviewers for a final check. Once substantial comments are received, discuss them with your colleagues to ensure a successful outcome and that the SOP is clear for everybody. Include all relevant edits to improve the document. Repeat the procedure until the SOP is agreed upon by all stakeholders; subsequently, send the document to the quality manager for approval. The final document should be sent to the assigned SOP approvers, who will sign it. In case an SOP is out of date due to the introduction of new technologies or changes in the organizational structure, the SOP should be terminated, labelled as out of date, and archived for traceability. The new, valid version should be distributed. All valid versions should be stored and accessible in a folder at a central location. In this way, you ensure that valid and approved SOPs exist and that every employee receives the information necessary for their work. To keep track of any deviations from existing SOPs, the establishment and deployment of an exception log can be a powerful instrument. This log should ideally document any deviation, the reason for the deviation, the outcome, any troubleshooting that has been applied, the resolution, and the appropriate communication. Regarding the latter, issues or reasons for deviations as well as necessary changes or modifications to the SOP should be discussed with and signed off by all involved stakeholders and supervisors, respectively [33].
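
One lightweight way to deploy such an exception log is an append-only file on the shared team server. The following Python sketch is a hypothetical implementation; the file name and column names are illustrative, not prescribed:

    import csv
    from datetime import date
    from pathlib import Path

    LOG = Path("sop_exception_log.csv")   # hypothetical location; use your team share
    FIELDS = ["date", "sop_id", "deviation", "reason", "outcome",
              "resolution", "reported_by"]

    def log_deviation(entry: dict) -> None:
        """Append one deviation record; write the header if the log is new."""
        is_new = not LOG.exists()
        with LOG.open("a", newline="") as handle:
            writer = csv.DictWriter(handle, fieldnames=FIELDS)
            if is_new:
                writer.writeheader()
            writer.writerow(entry)

    log_deviation({
        "date": date.today().isoformat(),
        "sop_id": "SOP-M-007",
        "deviation": "Centrifugation at 10,000 g instead of 12,000 g",
        "reason": "Usual rotor unavailable",
        "outcome": "Slightly lower yield, still within acceptance range",
        "resolution": "Discussed with supervisor; no SOP change required",
        "reported_by": "J. Doe",
    })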

Rule 9: Update document: Specify validation and periodic review date

Do not forget to review and update an SOP regularly to keep it up to date and useful for current and future use. Your SOP should be validated and reviewed periodically to improve the document and reflect any changes that have been made or are necessary. All changes should be entered into a revision form, which comprises the version number, change date, reason for and description of the change, reviewers' data, and signatures. The revised or updated document should be shared immediately with all respective users, while clarifying that the former SOP is outdated. SOP compliance maintenance can easily be implemented by using an electronic lab notebook (ELN), which, depending on the software and its functions, might keep track of all deviations automatically.

In this case, ensure the interoperability of the formats used to enable export of the data into other systems. Furthermore, regular mandatory user training should be set up for the most important SOPs to ensure compliance.
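
A revision form of this kind can also be kept as a simple structured list alongside the SOP. The following sketch (entries and field names are illustrative) determines the currently valid version by comparing semantic version numbers and flags the outdated ones:

    # Illustrative revision history kept alongside the SOP.
    revisions = [
        {"version": "1.0.0", "date": "2020-02-24",
         "reason": "Initial release", "reviewer": "A. Smith"},
        {"version": "1.1.0", "date": "2020-04-07",
         "reason": "New extraction kit; step 3 updated", "reviewer": "A. Smith"},
    ]

    current = max(revisions,
                  key=lambda r: tuple(int(x) for x in r["version"].split(".")))
    print(f"Current version: {current['version']} ({current['date']})")
    for revision in revisions:
        status = "VALID" if revision is current else "OUTDATED"
        print(f"  {revision['version']}  {revision['date']}  {status}  {revision['reason']}")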

Rule 10: Publish

It is all very good to have work instructions, but what is their value if they are only available in your office while the users who need them are somewhere else? The people performing the work should have easy access to the work instructions anytime and anywhere; this is most easily accomplished by setting up a secured team server or cloud to which all the latest versions are uploaded. To add value to your SOP and make it available to a broad user community, you should also upload your SOP to open access public repositories such as Zenodo [34], SEEK [18], OpenAIRE [35], FAIRsharing [36,37], or others.
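
As an illustration of a scripted deposit, the following sketch uses the Zenodo REST deposit API via the third-party requests library. It assumes a personal access token and a finished SOP PDF; endpoints and metadata fields should be verified against the current Zenodo API documentation (developers.zenodo.org) before use.

    import requests  # third-party: pip install requests

    TOKEN = "YOUR_ZENODO_ACCESS_TOKEN"   # personal token; keep out of version control
    API = "https://zenodo.org/api/deposit/depositions"

    # 1. Create an empty deposition.
    deposition = requests.post(API, params={"access_token": TOKEN}, json={}).json()

    # 2. Upload the SOP PDF into the deposition's file bucket.
    with open("SOP-M-007_v1.1.0.pdf", "rb") as handle:
        requests.put(f"{deposition['links']['bucket']}/SOP-M-007_v1.1.0.pdf",
                     data=handle, params={"access_token": TOKEN})

    # 3. Attach minimal metadata; publishing (and DOI minting) is a further API call.
    metadata = {"metadata": {
        "title": "SOP: RNA extraction from plant tissue",
        "upload_type": "publication",
        "publication_type": "other",
        "description": "Standard operating procedure, version 1.1.0",
        "creators": [{"name": "Doe, Jane"}],
    }}
    requests.put(f"{API}/{deposition['id']}",
                 params={"access_token": TOKEN}, json=metadata)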

Conclusion

Nowadays, digital technologies and advanced computational methods are an integral part of daily laboratory practice. To best manage the generated data and avoid reproducibility issues, scientists need to implement the FAIR data principles, suitable data management plans (DMPs), and appropriate documentation. The lack of reproducibility within laboratory research discourages the widespread adoption of research results in the scientific community. One way to improve this is to provide consistency and traceability of existing standards and laboratory practices, which are achievable with precise and clearly written SOPs.

Acknowledgments

This publication is based upon work from COST Action CHARME CA15110 [14] supported by COST (European Cooperation in Science and Technology). www.cost.eu.

References

  1. Hardwicke TE, Mathur MB, MacDonald K, Nilsonne G, Banks GC, Kidwell MC, et al. Data availability, reusability, and analytic reproducibility: evaluating the impact of a mandatory open data policy at the journal Cognition. R Soc Open Sci. 2018 Aug 15;5(8):180448. pmid:30225032; PubMed PMCID: PMC6124055
  2. Benefits of sharing. Editorial. Nature. 2016 Feb 11;530(7589):129. pmid:26863943
  3. Doucet M, Becker KF, Björkman J, Bonnet J, Clément B, Daidone M-G, et al. Quality Matters: Annual Conference of the National Infrastructures for Biobanking. Biopreserv Biobank. 2017 Jun;15(3):270–276. pmid:27992240; PubMed PMCID: PMC5586151
  4. Borgman CL. Big data, little data, no data: Scholarship in the networked world. The MIT Press; 2015.
  5. Harris JK, Wondmeneh SB, Zhao Y, Leider JP. Examining the Reproducibility of 6 Published Studies in Public Health Services and Systems Research. J Public Health Manag Pract. 2019. pmid:29481544
  6. Baker M. Irreproducible biology research costs put at $28 billion per year. Nature News. 2015 Jun:9. doi.org/10.1038/nature.2015.17711
  7. Baker M. 1,500 scientists lift the lid on reproducibility. Nature. 2016 May 6;533(7604):452–4. pmid:27225100
  8. Freedman LP, Cockburn IM, Simcoe TS. The Economics of Reproducibility in Preclinical Research. PLoS Biol. 2015. pmid:26057340; Correction: The Economics of Reproducibility in Preclinical Research. PLoS Biol. 2018. PMID: 29634726
  9. Allison D, Brown A, George B, Kaiser KA. Reproducibility: A tragedy of errors. Nature. 2016 Feb 4;530(7588):27–9. pmid:26842041; PubMed PMCID: PMC4831566
  10. Baker M. How quality control could save your science. Nature. 2016 Jan 28;529(7587):456–8. pmid:26819028
  11. European Commission. Directorate-General for Research and Innovation (European Commission), PwC EU Services. Cost-Benefit analysis for FAIR research data—Cost of not having FAIR research data. 2019. doi.org/10.2777/706548
  12. Research Data Alliance (RDA) [cited 2020 Feb 6]. Available from: https://www.rd-alliance.org/
  13. Wilkinson M, Dumontier M, Aalbersberg I, Appleton G, Axton M, Baak A, et al. The FAIR Guiding Principles for scientific data management and stewardship. Sci Data. 2016 Mar 15;3:160018. pmid:26978244
  14. cost-charme.eu [Internet]. COST Action Harmonising standardisation strategies to increase efficiency and competitiveness of European life-science research (CHARME); c2016-2020 [cited 2020 Apr 7]. Available from: https://www.cost-charme.eu/
  15. Bergmann FT, Hoops S, Klahn B, Kummer U, Mendes P, Pahle J, et al. COPASI and its applications in biotechnology. J Biotechnol. 2017 Nov 10;261:215–220. pmid:28655634
  16. von Kamp A, Thiele S, Hädicke O, Klamt S. Use of CellNetAnalyzer in biotechnology and metabolic engineering. J Biotechnol. 2017 Nov 10;261:221–228. pmid:28499817
  17. Wittig U, Rey M, Weidemann A, Kania R, Müller W. SABIO-RK: an updated resource for manually curated biochemical reaction kinetics. Nucleic Acids Res. 2018 Jan 4;46(D1):D656–D660. pmid:29092055
  18. Wolstencroft K, Owen S, Krebs O, Nguyen Q, Stanford NJ, Golebiewski M, et al. SEEK: a systems biology data and model management platform. BMC Syst Biol. 2015 Jul 11;9:33. pmid:26160520
  19. Rocca-Serra P, Sansone SA. Experiment design driven FAIRification of omics data matrices, an exemplar. Sci Data. 2019 Dec 12;6(1):271. pmid:31831744; PubMed PMCID: PMC6908569
  20. go-fair.org [Internet]. GO FAIR; [cited 2020 Apr 7]. Available from: https://www.go-fair.org/
  21. Conroy G. Q&A: 5 simple ways to make your research more reproducible. Small details can make a big difference. Nature Index. 31 October 2019; [cited 2020 Apr 7]. Available from: https://www.natureindex.com/news-blog/david-sholl-five-simple-ways-to-make-your-research-more-reproducible
  22. ISO 9000:2015(en) Quality management systems—Fundamentals and vocabulary; [cited 2020 Feb 6]. Available from: https://www.iso.org/obp/ui/#iso:std:iso:9000:ed-4:v1:en
  23. Stamatis DH. Failure Mode and Effect Analysis: FMEA from Theory to Execution. ASQ Quality Press; 2003. ISBN 6000047320, 9786000047320
  24. European Medicines Agency—Good laboratory practice compliance. 25 March 2015. EMA/89741/2015. Compliance and Inspection; [cited 2020 Feb 6]. Available from: https://www.ema.europa.eu/en/documents/regulatory-procedural-guideline/triggers-audits-good-laboratory-practice-glp-studies_en.pdf
  25. Hollmann S, Attwood T, Bongcam-Rudloff E, Duca D, D’Elia D, Endrullat C, et al. Standardization and Quality Assurance in Life-Science Research—Crucially Needed or Unnecessary and Annoying Regulation? In: Kalajdziski S, Ackovska N, editors. ICT Innovations 2018. Proceedings of the 10th International Conference on Engineering and Life Sciences; 2018 Sep 17–19; Ohrid, North Macedonia. Springer; 2019:13–20. Print ISBN 978-3-030-00824-6; Online ISBN 978-3-030-00825-3
  26. Thrive Global [Internet]. 7 Ways to Reprogram a Negative Mindset; c2020 [cited 2020 Feb 6]. Norton R. “Simplicity is complex. It's never simple to keep things simple. Simple solutions require the most advanced thinking.” Quotable Quote. Available from: https://thriveglobal.com/stories/7-ways-to-reprogram-a-negative-mindset/
  27. International Organization for Standardization [ISO]: ISO 26324:2012 Information and documentation—Digital object identifier system; [cited 2020 Feb 6]. Available from: https://www.iso.org/standard/43506.html
  28. semver.org [Internet]. Semantic Versioning Specification; [cited 2020 Mar 23]. Available from: https://semver.org/
  29. github.com [Internet]. GitHub Inc.; [cited 2020 Mar 23]. Available from: https://github.com/
  30. International Organization for Standardization [ISO]: ISO 8601-1:2019 Date and time—Representations for information interchange—Part 1: Basic rules. Available from: https://www.iso.org/standard/70907.html
  31. Hollmann S, Frohme M. Template for Standard Operation Procedure [Internet]. Zenodo; 2020 [cited 2020 Feb 24]. Available from: https://doi.org/10.5281/zenodo.3678317
  32. The Orwell Foundation [Internet]. Why I write; [cited 2020 Mar 3]. George Orwell: “Good prose is like a windowpane.” Quotation is available from: https://www.orwellfoundation.com/the-orwell-foundation/orwell/essays-and-other-works/why-i-write/
  33. Aziz N, Zhao Q, Bry L, Driscoll DK, Funke B, Gibson JS, et al. College of American Pathologists' laboratory standards for next-generation sequencing clinical tests. Arch Pathol Lab Med. 2015 Apr;139(4):481–93. pmid:25152313
  34. zenodo.org [Internet]. Zenodo; [cited 2020 Feb 24]. Available from: https://zenodo.org/
  35. OpenAIRE [Internet]. Deposit or publish your research in Open Access; [cited 2020 Feb 24]. Available from: https://explore.openaire.eu/participate/deposit-publications
  36. Sansone SA, McQuilton P, Rocca-Serra P, Gonzalez-Beltran A, Izzo M, Lister AL, et al. FAIRsharing as a community approach to standards, repositories and policies. FAIRsharing Community. Nat Biotechnol. 2019 Apr;37(4):358–367. pmid:30940948
  37. Fairdom [Internet]. The FAIRDOM Platform; c2014-2020 [cited 2020 Mar 8]. Available from: https://fair-dom.org/platform/