
A comprehensive scoping review to identify standards for the development of health information resources on the internet

  • Noha Abdel-Wahab,

    Roles Conceptualization, Data curation, Formal analysis, Methodology, Writing – original draft, Writing – review & editing

    Affiliations Section of Rheumatology and Clinical Immunology, Department of General Internal Medicine, The University of Texas MD Anderson Cancer Center, Houston, Texas, United States of America, Rheumatology and Rehabilitation Department, Faculty of Medicine, Assiut University Hospitals, Assiut, Egypt

  • Devesh Rai,

    Roles Data curation, Formal analysis, Writing – original draft

    Affiliation Section of Rheumatology and Clinical Immunology, Department of General Internal Medicine, The University of Texas MD Anderson Cancer Center, Houston, Texas, United States of America

  • Harish Siddhanamatha,

    Roles Data curation, Writing – review & editing

    Affiliation Section of Rheumatology and Clinical Immunology, Department of General Internal Medicine, The University of Texas MD Anderson Cancer Center, Houston, Texas, United States of America

  • Abhinav Dodeja,

    Roles Data curation, Writing – review & editing

    Affiliation Section of Rheumatology and Clinical Immunology, Department of General Internal Medicine, The University of Texas MD Anderson Cancer Center, Houston, Texas, United States of America

  • Maria E. Suarez-Almazor,

    Roles Conceptualization, Methodology, Project administration, Resources, Supervision, Writing – review & editing

    Affiliation Section of Rheumatology and Clinical Immunology, Department of General Internal Medicine, The University of Texas MD Anderson Cancer Center, Houston, Texas, United States of America

  • Maria A. Lopez-Olivo

    Roles Conceptualization, Methodology, Project administration, Resources, Supervision, Validation, Writing – review & editing

    Affiliation Section of Rheumatology and Clinical Immunology, Department of General Internal Medicine, The University of Texas MD Anderson Cancer Center, Houston, Texas, United States of America



Background

Online health information, if evidence-based and unbiased, can improve patients’ and caregivers’ health knowledge and assist them in disease management and health care decision-making.


Objective

To identify standards for the development of health information resources on the internet for patients.


Methods

We searched MEDLINE, CINAHL, Scopus, Web of Science, and Google Scholar for publications describing evaluation instruments for websites providing health information. Eligible instruments were identified by three independent reviewers, and disagreements were resolved by consensus. Reported items were extracted and categorized into seven domains (accuracy, completeness and comprehensiveness, technical elements, design and aesthetics, usability, accessibility, and readability) previously proposed as a minimum requirement for websites.


Results

One hundred eleven articles met the inclusion criteria, reporting 92 evaluation instruments (1,609 items). We found 74 unique items that we grouped into the seven domains. For the accuracy domain, one item evaluated whether the information provided was in concordance with current guidelines. For completeness and comprehensiveness, 18 items described the disease with respect to various topics such as etiology or therapy, among others. For technical elements, 27 items evaluated disclosure of authorship, sponsorship, affiliation, editorial process, feedback process, privacy, and data protection. For design and aesthetics, 10 items evaluated consistent layout and relevant graphics and images. For usability, 10 items evaluated ease of navigation and functionality of internal search engines. For accessibility, five items evaluated the availability of websites to people with audiovisual disabilities. For readability, three items evaluated conversational writing style and use of a readability tool to determine the reading level of the text.


Conclusions

We identified standards for the development of online patient health information. This proposed instrument can serve as a guideline to develop and improve how health information is presented on the internet.


Introduction

The internet has become the new “first aid,” often the first place for patients to get health information given its ease of access. Health information on the internet has rapidly grown over the past two decades, from 10,000 websites providing health care information in 1997 to millions of websites at present [1, 2]. According to a national survey done in 2013 in the United States by the Pew Research Center, one in three American adults has used the internet to find information about a medical condition [3]. Patients not only browse the internet for health information before consulting with a physician [4] but also expect doctors to recommend useful resources and supplementary quality information services [5]. Although health information provided on the internet cannot replace the advice of a health care professional, it can serve as a readily available source of information, which, if evidence-based and unbiased, can improve patients’ knowledge [6, 7].

Websites providing health information for patients should follow high quality standards and be comprehensible. According to a study performed by the US National Center for Education Statistics, nine of ten Americans have difficulty understanding health information on the internet [8]. Multiple instruments have been developed to help health providers identify websites with reliable health information that can be recommended to their patients. In 2002, Eysenbach et al [9] performed a systematic review to establish a methodological framework on how the quality of health information on the internet should be evaluated. The authors identified seven domains that were thought to be a minimum requirement: (i) accuracy, (ii) completeness and comprehensiveness, (iii) technical elements, (iv) readability, (v) design and aesthetics, (vi) accessibility, and (vii) usability. However, no items were provided for each domain for detailed evaluation. These domains were further divided by Zhang et al in 2015 [10]. The technical elements were separated into currency of the information provided, credibility, and privacy and data protection. In addition, usability was separated into navigability, interactivity, and cultural sensitivity. Although items to evaluate these criteria were presented, no explanatory definitions for each item were provided.

To date, no tools have been compiled for reporting items and providing explanatory definitions of the original seven domains identified by Eysenbach et al. We have conducted a review of available instruments used to evaluate websites providing health information for patients to identify relevant items to consider for the development of online health information resources.

Materials and methods

We report our methods and results according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement (S1 Table).

Eligibility criteria

We included articles describing evaluation instruments that were previously used to evaluate the quality of websites. We excluded articles that provided instruments for librarians, software engineers, or universities, as well as those providing items for advertising or measuring consumer perceptions. We also excluded articles that did not provide items while reporting domains similar to those described by Eysenbach et al [9].

Information sources

We performed a literature search in MEDLINE, CINAHL, Scopus, and Web of Science databases, as well as Google Scholar.


Search

Search terms included “online,” “Internet,” “information,” “resource,” “evaluate,” “assess,” “tool,” “instrument,” “questionnaire,” “website,” “web-site,” and “web site.” The search was performed for published articles, unpublished articles, and online instruments irrespective of language and region. References of the retrieved articles were also searched manually for original citations and additional instruments otherwise not found.

Study selection

Three reviewers (DR, NA-W, and HS) independently selected studies reporting instruments to evaluate health care or general websites. Agreement was achieved through consensus.

Data collection process and data items

Two reviewers (DR, HS) independently extracted all items from the retrieved instruments. Articles in a language other than English (Korean, Spanish, and French) were translated by collaborators proficient in the original language, or with Google Translate, to retrieve the items. We included all items that were used to assess at least one of the domains proposed by Eysenbach et al. We did not consider items that evaluated metadata standards, HTML coding of websites, or financial transactions.

Synthesis of results

Each item was considered relevant if the reviewers reached a consensus to include it in the proposed instrument, with final selection achieved through third party adjudication if needed. Items describing similar criteria were grouped together under the Eysenbach et al proposed domains, and the frequency of each item in the original retrieved instrument was calculated.


Results

Study selection

One hundred sixty articles were identified through database searches and 80 through hand searching. After duplicate removal, the full texts of 239 unique articles were assessed for eligibility. One hundred eleven articles met our inclusion criteria, providing 92 distinct evaluation instruments (S1 Fig).

Study characteristics

The characteristics of the 92 included instruments are described in Table 1. The creator, validation, and domains and items proposed by each individual instrument are detailed in S2 Table. We found that 64 instruments (70%) were developed by the authors and 28 (30%) by various organizations. Eighty-three instruments (90%) evaluated health care websites or websites in general (e.g., the DISCERN tool evaluated reliable health information and the quality of treatment options [11]), and nine (10%) evaluated disease-specific websites (e.g., measurable criteria for a credibility score for diabetes websites [12]). Ninety instruments (98%) targeted the general population, and two (2%) were specific to either low-literacy or elderly populations. Eighty-three instruments (90%) included items for more than one domain of interest; the remaining nine covered only a single domain. Only 28 instruments (30%) had been validated by independent reviewers with measured interrater reliability. However, none of the identified instruments covered all domains.

Table 1. Characteristics of the included instruments (n = 92).

Synthesis of results

A total of 1,609 items were retrieved from the included instruments (See S2 Table). After removing duplicates, we categorized 74 unique items into the domains proposed by Eysenbach et al. Agreement on item suitability for each domain was achieved by consensus. The final items are listed in Table 2, along with the frequency of reporting of each item in the original 92 instruments retrieved from the literature search.

Table 2. Frequency with which each item was reported in the 92 instruments identified in the literature search.

Interpretations of each identified readability tool are summarized in Table 3. In the following subsections, we describe in detail the items we included under each domain.


Accuracy

Accuracy, also often referred to as reliability, is defined as the extent to which the information provided on the website is in concordance with current standards [32]. Items found for this domain were combined into one because all measured the same concept: information should be based on current guidelines or standards of care. That is, the information provided should be a summary of current evidence according to clinical practice guidelines, textbooks, and/or expert consultation when there is no evidence about the topic.

Completeness and comprehensiveness

Eysenbach et al previously proposed completeness and comprehensiveness as one domain. For completeness, also sometimes called coverage, or scope, any website providing health information should cover the main concepts of the topic. We judged that the following eighteen items should ideally be covered to improve understanding of the condition of interest. The disease should be accurately defined (item 1), and the epidemiology, etiology, pathogenesis, clinical features, method of diagnosis, standard management, and typical self-management of the disease (items 2–8) should be reported. In addition, the beneficial and harmful effects of each treatment (item 9) and treatment costs (item 10) should be explicitly mentioned. Disease monitoring (item 11), complications (item 12), and consequences that could ensue if no treatment is used (item 13) should also be described in detail. Areas of uncertainty (item 14) and questions to be discussed with those involved in the patient's care (item 15) should also be mentioned. For comprehensiveness, a platform for users to interact (item 16) should be included, or cases and/or examples of desired behavior (item 17) should be modeled or shown. Lastly, complex topics should be subdivided so that readers can experience small successes in understanding (item 18).

Technical elements

According to Eysenbach et al [9], technical elements depict the way in which information is presented to users. This domain evaluates the trustworthiness of content and the confidentiality of any personal data collected. Twenty-seven items were included under technical elements; 24 of them were reported by Eysenbach as the most frequently used [33]. First, authorship (item 1), author affiliations and credentials (items 2 and 3), physician credentials (item 4), ownership (item 6), and sponsorship (item 7) should be disclosed. If the author is a recognized authority, this should be mentioned (item 5). The editorial review process (item 8) should be described. The hierarchy of evidence (item 9) should also be clear, and advertisements on the website should be distinctly labeled as such and limited to one per screen, with none on the homepage (item 10). The objective of the website should be clear; a statement explicitly declaring that the information on the website is not meant to replace the advice of health professionals and clearly describing the aim of the website should be included on the homepage (item 11) [34]. The type of website should also be described as a general disclosure (item 12)—i.e., whether the website serves an educational, nonprofit, or commercial purpose. The website should clearly define its target audience (item 13), describe its sources of information (item 14), and provide references for its claims (item 15). When the content is copyrighted, a Creative Commons license should be provided (item 16).

The website should also provide detailed disclaimers covering its privacy and data protection policies (item 17), abiding by the directives given by the US National Research Council Committee on Maintaining Privacy and Security [35] and the European Commission [36]. Data collection procedures (item 18), the ability to opt in and out of subscriptions (item 19), and the usage of cookies (item 20) should be explicitly mentioned. The website should display an alert when the user is leaving a secured page (item 21). The date of content creation (item 22) should be provided, along with the date of update for each page (item 23), and the date of any technical maintenance should be disclosed beforehand (item 24). Third-party links that abide by ethical principles should be provided (item 25). Contact information (item 26) and a feedback mechanism (item 27) should also be provided.

Design and aesthetics

Design and aesthetic elements are the first thing to catch the attention of visitors to a website and can facilitate understanding, the speed with which website visitors can find what they are looking for, and their belief that the website is trustworthy [37]. Ten items were included for this domain. Pleasant visual presentation (item 1) is key, with an option to view the website in a partial window or restore down (item 2) and distinct menus with directional icons, bars, indicators, listings, and indexes (item 3). Text should be at least 12 points with appropriate use of fonts, colors, and capitalization, and users should have the option to change the type size or font (item 4). Appropriate grammar should be used, abbreviations and acronyms should be spelled out the first time they are mentioned on each page, and the use of medical jargon should be limited and clearly defined on a glossary page (item 5). The layout should be consistent (item 6) and the sequence of information should be clear throughout the website, with bigger topics subdivided into subheadings and short lists (item 7). Because images present key messages visually to the reader, relevant images and graphics with an explanation should be provided (item 8). Cover images on the website should be friendly, media could be used to communicate with users, and autoplay should be disabled by default if any audio or video is present (item 9). The website should be compatible with all browsers (item 10).


Usability

Usability is defined as “the capability to be used by humans easily and efficiently” [38]. Usability is critical because information is effective only if it can be used with ease. Ten items were included in the usability domain after we combined items measuring the same concept. Content should be well organized, ideally with an index, table of contents, navigation icons, breadcrumbs, the ability to easily return to the previous page and proceed to the next, a site map, and a help function favoring an optimal navigation experience (item 1). An internal search engine (item 2) and functionality supporting content, such as calculators (item 3), are recommended. If registration and password protection are present, registration should be completed within three screens (item 4). All content should be in a printer-friendly format with an option to download (item 5), and if the content requires large files, the website should state the file size along with the estimated download time (item 6). Graphics files should be marked with a “mouse over” distinctly indicating the presence of graphical content (item 7). The webpage should load according to standards even at relatively low bandwidth speeds; the current average rate is 512 kilobits per second (this may change with time; item 8). The website content should be tagged with Dublin Core tags [39], a metadata component that increases the searchable index of the website content (item 9). The page should generally not require additional computer applications, and if additional applications are needed, working links for application download should be provided (item 10).
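The estimated download time called for in item 6 is simple arithmetic over file size and bandwidth, so websites can compute and display it automatically. Below is a minimal illustrative sketch in Python (the function name is ours), assuming decimal units and the 512 kbps average rate cited above:

```python
def estimated_download_seconds(file_size_mb, bandwidth_kbps=512):
    """Estimated download time in seconds for a file of the given size.

    Assumes decimal units: 1 MB = 8,000,000 bits and 1 kbps = 1,000 bits/s.
    The 512 kbps default is the average rate cited in the text and will date.
    """
    return (file_size_mb * 8_000_000) / (bandwidth_kbps * 1_000)

# A 1 MB PDF at 512 kbps: 8,000,000 bits / 512,000 bits per second = 15.625 s
print(f"{estimated_download_seconds(1.0):.3f} s")
```

A site could render this next to each download link, recomputing as average connection speeds change.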


Accessibility

Accessibility is defined by the World Wide Web Consortium (W3C) as making content “accessible to a wider range of people with disabilities, including blindness and low vision, deafness, speech disabilities, learning disabilities, cognitive limitations, limited movement, photosensitivity, and combinations of these” [40]. The W3C’s proposed Web Content Accessibility Guidelines state ways to make content on the internet accessible to people with various degrees of disability [40]. The W3C recommends providing the option to increase the type size and change the font and background color, offering podcasts for voice reading of text, and making all content accessible by keyboard, among other recommendations.

Using the W3C guidelines as a basis, we included five items in accessibility. The website should comply with the W3C 2018 guidelines (item 1). Easy findability (i.e., the ability to search content using minimal steps; item 2) and appropriate color contrast (item 3) should also be present. Comprehension has been found to be better when an individual is addressed in their native language. Thus, the language used on the website should be based on the target audience (item 4), and when the content is translated from one language to another, cultural match is important because some words may have different meanings in various cultures and languages. Sign language interpretation for all recorded materials (audio or video) should be provided when relevant to the target audience. The translated content should be proofread by a language expert before it is released on the website. Similarly, the cultural match of images and examples should be taken into account (item 5).
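The color-contrast check in item 3 can be made quantitative using the formula published in the W3C guidelines, which derives a contrast ratio from the relative luminance of the two colors; WCAG level AA requires at least 4.5:1 for normal body text. A minimal sketch in Python (the function names are ours):

```python
def relative_luminance(rgb):
    """Relative luminance of an sRGB color (channels 0-255), per WCAG 2.x."""
    def channel(c):
        c = c / 255.0
        # Linearize the gamma-encoded sRGB channel value.
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """WCAG contrast ratio between two colors; ranges from 1:1 to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(color_a), relative_luminance(color_b)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background gives the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 2))
```

Running a check like this over a site's palette is an inexpensive way to verify the contrast item before release.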


Readability

Readability is the ease with which text can be read [41]. We considered three items from our review: determination of the content level (item 1), appropriate writing style (item 2), and sentence construction (item 3). The US National Institutes of Health and the American Medical Association recommend that readability scores for content written on a website range from the sixth- to eighth-grade level [42, 43]. Among the 92 instruments identified, there were 18 readability tools used to determine the years of education needed to understand the content (Table 3). The most commonly used were the Simple Measure of Gobbledygook grading [27], the Gunning Fog index [26], and the Flesch-Kincaid grade level [20, 28]. It is important to use a readability tool only to determine the reading level of the text and not to assess the overall suitability of online material, because readability tools fail to take into account comprehension and the role of the reader [44, 45]. Regarding writing style and sentence construction, information on the website should be conveyed in the active voice with short sentences, and complex information should be broken down.
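Formulas such as these reduce to arithmetic over word, sentence, and syllable counts. As an illustration only (not one of the validated tools cited above), the sketch below computes the Flesch-Kincaid grade level in Python; the function names are ours, and the syllable counter is a naive heuristic where real tools count far more carefully:

```python
import re

def count_syllables(word):
    """Naive syllable estimate: runs of vowels, dropping most trailing silent e's."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text):
    """Grade level = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

# Very simple text can score below grade 0; dense clinical prose scores far higher.
print(round(flesch_kincaid_grade("The cat sat on the mat."), 2))
```

The 0.39, 11.8, and 15.59 constants are the published Flesch-Kincaid coefficients; only the tokenization and syllable heuristic are simplifications, which is why validated tools should be used for any formal assessment.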


Discussion

Although the internet has the capacity to disseminate rich medical information, it can also disseminate inaccurate, biased, and out-of-date information, which can have adverse effects on consumers of health information (e.g., patients and caregivers) [2]. The content of a website has been found to be as important as its design in gaining viewers’ trust, and health-related information is disseminated more easily when it is provided in an interactive manner [37, 46]. Therefore, an instrument that incorporates the seven fundamental domains for providing online health information to patients, which is likely to differ from information provided in print media, is needed [2].

Such an instrument is long overdue. Most studies that previously evaluated the quality of information on websites used criteria derived from personal opinion. The purpose of our study was to identify standards for the development of health information resources on the internet for patients. In our review, we incorporated all of the domains suggested by Eysenbach et al and Zhang et al [9, 10]. However, one of these studies did not provide the items that should be considered for in-depth evaluation of each domain, and the other did not define the items presented. We reviewed the literature and retrieved 92 evaluation instruments. We then defined and summarized all of the items proposed by previous studies and included the retrieved items under their respective domains. We have thus proposed an instrument that furthers the contributions of Eysenbach et al and Zhang et al, incorporating detailed guidelines for the development of websites providing health information to patients.

In 1995, the Economic and Social Council of the United Nations set up the Health On the Net Foundation guidelines for online health information. These guidelines included components of completeness and technical criteria but did not include accuracy, readability, design, usability, and accessibility [34]. In 1997, the Journal of the American Medical Association published benchmark criteria to evaluate websites on authorship, attribution, disclosure, and relevance (i.e., whether the information was up to date) but did not cover accuracy, readability, usability, and accessibility [2]. Thereafter, the DISCERN instrument was developed to evaluate the reliability and quality of information on treatment choices, including accuracy, completeness, and technical criteria [11]. The principles governing American Medical Association websites were the first to offer additional guidelines for advertising, sponsorship, privacy, confidentiality, and principles for e-commerce, but these guidelines also did not mention readability, usability, and accessibility [47]. More recently, the US Agency for Healthcare Research and Quality created two patient education materials assessment instruments for assessing the understandability and actionability of print and audiovisual patient education materials. These instruments evaluated content, readability, organization, and design, but they did not mention accessibility and usability [48]. All of these guidelines focused only on specific domains for evaluation, and none of them mentioned usability or accessibility. These domains are important and should be considered because information on the internet can be beneficial only if it is easily conveyed to the audience. Evidence suggests that the navigation experience (e.g., website layout, accessibility) is associated with the trust that users place in the information provided. When information is better displayed and easy to find, users pay more attention, which may result in increased readability, reliability, and improved learning [49].

Several other systematic reviews have evaluated the rating instruments commonly used to evaluate health-related information on the internet [9, 50–54]. Kim et al developed an instrument for evaluating health-related websites using 12 categories that grouped all of the criteria they identified in their literature search, but that instrument did not include evaluation of readability [53]. The others did not propose any instruments and used only limited sets of previously proposed categories, missing important domains such as usability and accessibility [50–52] (see S3 Table). We contacted these authors to determine whether they were further refining their published work, to avoid duplication of efforts, but no such comprehensive instrument is under development.

Because the internet has grown at a fast pace and personal health information is shared across all platforms, we have taken into account all aspects of privacy recommended by authorities, including the US National Research Council’s proposed guidelines for protecting electronic health information and preventing inappropriate sharing of the personal information of website users [35], as well as the ePrivacy Directive [36], an updated set of criteria derived from the European Commission’s eEurope 2002 initiative. The latter is a set of quality criteria for health-related websites that provides directives for transparency and honesty, authority, privacy and data protection, updating of information, accountability, and accessibility [55].

Any content provided to patients can be beneficial only if the patient is able to comprehend the information, and according to the US National Center for Education Statistics survey, 50% of Americans cannot understand text above an eighth-grade reading level [8]. The Suitability Assessment of Materials guidelines proposed ways to make content understandable for patients with low literacy skills. These guidelines also pointed to the cultural relevance of language, images, typography and layout, and multimedia materials for patients [56]. We advocate using a readability tool in addition to other items related to writing style and sentence construction, as recommended by the Suitability Assessment of Materials guidelines [56].

Our proposed instrument contains extensive and precise elements based on, to the best of our knowledge, the available guidelines. The instrument includes all necessary domains and proposes a detailed definition of each item, which will pave the way for the efficient development of websites providing health care information to patients. Our instrument is further strengthened by the inclusion of the Web and Usability Guidelines [57], the Guide to Writing and Designing Easy-to-Use Health Websites [57], the Accessible Health Information Technology Guidelines for Populations with Limited Literacy [55, 58], and the Web Content Accessibility Guidelines [40]. Moreover, our instrument is not restricted to the development of disease-specific content, although it can be used to develop websites with a specific focus. Although we sought to generate a comprehensive list of all items required for our instrument, we did emphasize some disease-specific components, such as pathogenesis and coverage of areas of uncertainty, as well as website-specific components such as internal search engines, breadcrumbs, and Dublin Core tags, which were mentioned in only a few studies. The items included in our instrument will serve as standards for developing future online health information. We included an extensive list of domains and items in an attempt to ensure comprehensiveness; however, the internet is constantly changing, and new technologies may dictate future modifications to the number and complexity of the items included. All collected information is available as supplementary material.


Conclusions

In conclusion, as the reach of the internet grows and evolves, the components needed for the development of online health information must be continually improved, and new components must be added. A static instrument will not always suffice to report information over such a dynamic platform as the internet. Therefore, development criteria need to be periodically revised, with the addition of new aspects or modification of existing aspects while retaining those considered fundamental [47].

Supporting information

S2 Table. Instruments included in our analysis (n = 92).


S3 Table. Summary of previous systematic reviews evaluating the rating instruments used for evaluation of health-related information on the internet.



Acknowledgments

We sincerely thank Eliuth Hosse Lopez-Olivo and Yongchang Jang for their assistance in translating articles in Spanish and Korean, respectively, to English. We also thank Erica Goodoff, Senior Scientific Editor in the Department of Scientific Publications at The University of Texas MD Anderson Cancer Center, for editing assistance. We are grateful to Gregory F. Pratt (D.D.S., M.S.L.I.S.) from the Research Medical Library at The University of Texas MD Anderson Cancer Center for assisting in the study selection.


References

  1. Basnet S, Dhital R, Tharu B. Acute tubulointerstitial nephritis: a case report on rare adverse effect of pembrolizumab. Medicina (Kaunas). 2019;55(5):21. pmid:31117208.
  2. Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the Internet: Caveant lector et viewor—Let the reader and viewer beware. JAMA. 1997;277(15):1244–5. pmid:9103351.
  3. Fox S, Duggan M. Health online 2013: Pew Research Center 2013 [October 10, 2016]. Available from:
  4. Hesse BW, Nelson DE, Kreps GL, Croyle RT, Arora NK, Rimer BK, et al. Trust and sources of health information: the impact of the Internet and its implications for health care providers: findings from the first Health Information National Trends Survey. Archives of Internal Medicine. 2005;165(22):2618–24. pmid:16344419.
  5. Diaz JA, Sciamanna CN, Evangelou E, Stamp MJ, Ferguson T. Brief report: What types of Internet guidance do patients want from their physicians? Journal of General Internal Medicine. 2005;20(8):683–5. pmid:16050874; PMCID: PMC1490184.
  6. Cutilli CC. Seeking health information: what sources do your patients use? Orthopedic Nursing. 2010;29(3):214–9. pmid:20505493.
  7. Heikkinen K, Leino-Kilpi H, Salantera S. Ambulatory orthopaedic surgery patients' knowledge with internet-based education. Methods Inf Med. 2012;51(4):295–300. pmid:22476362.
  8. Kutner M, Greenburg E, Jin Y, Paulsen C. The Health Literacy of America's Adults: Results from the 2003 National Assessment of Adult Literacy. NCES 2006–483. National Center for Education Statistics. 2006.
  9. Eysenbach G, Powell J, Kuss O, Sa ER. Empirical studies assessing the quality of health information for consumers on the world wide web: a systematic review. JAMA. 2002;287(20):2691–700. pmid:12020305.
  10. Zhang Y, Sun YL, Xie B. Quality of health information for consumers on the web: A systematic review of indicators, criteria, tools, and evaluation results. J Assoc Inf Sci Tech. 2015;66(10):2071–84.
  11. Charnock D, Shepperd S, Needham G, Gann R. DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. Journal of Epidemiology and Community Health. 1999;53(2):105–11. pmid:10396471; PMCID: PMC1756830.
  12. Seidman JJ, Steinwachs D, Rubin HR. Design and testing of a tool for evaluating the quality of diabetes consumer-information Web sites. Journal of Medical Internet Research. 2003;5(4):e30. pmid:14713658; PMCID: PMC1550576.
  13. Sherman LA. Analytics of Literature: A Manual for the Objective Study of English Prose and Poetry. Toronto: HardPress Publishing.
  14. Thorndike EL, Lorge I. The Teacher's Word Book of 30,000 Words. 2nd ed. New York: Teachers College, Columbia University; 1963.
  15. Lively BA, Pressey SL. A method for measuring the "vocabulary burden" of textbooks. Educational Administration and Supervision; 1923.
  16. 16. Patty WW, Painter WI. A technique for measuring the vocabulary burden of textbooks. J Educ Res. 1931;24(2):127–34.
  17. 17. Lorge I. Predicting readability. Teachers College Record. 1944.
  18. 18. Dale E, Chall JS. A formula for predicting readability: Instructions. Educ Res Bull. 1948:37–54.
  19. 19. Chall JS, Dale E. Readability revisited: The new Dale-Chall readability formula: Brookline Books; 1995.
  20. 20. Flesch R. A new readability yardstick. J Appl Psychol. 1948;32(3):221. pmid:18867058
  21. 21. Spache G. A new readability formula for primary-grade reading materials. Elem Sch J. 1953;53(7):410–3.
  22. 22. Powers RD, Sumner WA, Kearl BE. A recalculation of four adult readability formulas. J Educ Psychol. 1958;49(2):99.
  23. 23. Bormuth JR. Readability: A new approach. Read Res Q. 1966:79–132.
  24. 24. Senter R, Smith EA. Automated readability index. Virginia: Clearinghous for Federal Scientific and Technical Information, 1967.
  25. 25. Fry E. Fry Readability Graph method -A readability formula that saves time. J Reading. 1968;11(7):513–78.
  26. 26. Gunning R. The fog index after twenty years. J Bus Comm. 1969;6(2):3–13.
  27. 27. Mc Laughlin GH. SMOG Grading-a New Readability Formula. J Reading. 1969;12(8):639–46.
  28. 28. Kincaid JP, Fishburne RP Jr, Rogers RL, Chissom BS. Derivation of new readability formulas (automated readability index, fog count and flesch reading ease formula) for navy enlisted personnel. Institute for Simulation and Training. Paper 56, 1975.
  29. 29. Coleman M, Liau TL. A computer readability formula designed for machine scoring. J Appl Psychol. 1975.
  30. 30. Raygor AL. The Raygor readability estimate: A quick and easy way to determine difficulty. P. D. Pearson E, editor. Clemson: National Reading Conference; 1977.
  31. 31. Ford P, Caylor J, Sticht T, editors. The FORCAST readability formula. Pennsylvania State University Nutrition Center, Bridge to Excellence Conference; 1992.
  32. 32. Daraz L, MacDermid JC, Wilkins S, Shaw L. Tools to evaluate the quality of web health information: a structured review of content and usability. Int J Tech Knowl Soc. 2009;5(3):127–41. ISSN 1832-3669.
  33. 33. Eysenbach G. Infodemiology: The epidemiology of (mis)information. The American journal of medicine. 2002;113(9):763–5. Epub 2003/01/09. pmid:12517369.
  34. 34. Health On The Net Foundation. HONcode. Accessed on: 08-17-2016; available at: https://wwwhealthonnetorg/HONcode/Conducthtml.
  35. 35. The National Academy Press. For the Record Protecting Electronic Health Information: National Academies Press (US); 1997 [October 28, 2016]. Available from:
  36. 36. European commission. Evaluation and review of the ePrivacy Directive 2016 [Octoberr 28, 2016]. Available from:
  37. 37. Sillence E, Briggs P, Fishwick L, Harris P. Trust and mistrust of online health sites. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems; Vienna, Austria. 985776: ACM; 2004. p. 663–70.
  38. 38. Usability-context Shackel B., framework, definition, design and evaluation. Human factors for informatics usability. 1991:21–37.
  39. 39. Dublincore. Dublin core metadata initiative [August 19, 2016]. Available from:
  40. 40. Caldwell B, Cooper M, Guarino Reid L, Vanderheiden G. Web Content Accessibility Guidelines 2.0: W3C Recommendation 2008 [updated 11 December, 2008August 12, 2016]. Available from:
  41. 41. DuBay WH. The Principles of Readability: Impact Information; 2004 [cited 2019 February 06]. 2nd ed.]:[Available from:
  42. 42. NIH U.S. National Library of Medicine MedlinePlus. MedlinePlus Guide to Healthy Web Surfing [Augsut 22, 2016]. Available from:
  43. 43. Weiss BD. Health Literacy and Patient Safety: Help Patients Understand: Reducing the Risk by Designing a Safer, Shame-Free Health Care Environment: American Medical Association; 2007.
  44. 44. Redish J. Readability formulas have even more limitations than Klare discusses. ACM J Comput Doc. 2000;24(3):132–7.
  45. 45. Redish JC, Selzer J. The place of readability formulas in technical communication. Tech Commun. 1985;32(4):46–52.
  46. 46. Zhang Y, Sun YL, Xie B. Quality of health information for consumers on the web: A systematic review of indicators, criteria, tools, and evaluation results. J Assoc Inf Sci Tech. 2015;66(10):2071–84. WOS:000361184500008.
  47. 47. Winker MA, Flanagin A, Chi-Lum B, White J, Andrews K, Kennett RL, et al. Guidelines for medical and health information sites on the internet: principles governing AMA web sites. American Medical Association. Jama. 2000;283(12):1600–6. Epub 2000/03/29. pmid:10735398.
  48. 48. Shoemaker SJ, Wolf MS. The Patient Education Materials Assessment Tool (PEMAT) and User’s Guide 2013 [August 5, 2016]. Available from:
  49. 49. Vega LC, Montague E, Dehart T. Trust between patients and health websites: a review of the literature and derived outcomes from empirical studies. Health and technology. 2011;1(2–4):71–80. pmid:22288026; PubMed Central PMCID: PMC3266366.
  50. 50. Bernstam EV, Shelton DM, Walji M, Meric-Bernstam F. Instruments to assess the quality of health information on the World Wide Web: what can our patients actually use? International journal of medical informatics. 2005;74(1):13–9. Epub 2005/01/01. pmid:15626632.
  51. 51. Gagliardi A, Jadad AR. Examination of instruments used to rate quality of health information on the internet: chronicle of a voyage with an unclear destination. BMJ (Clinical research ed). 2002;324(7337):569–73. Epub 2002/03/09. pmid:11884320; PubMed Central PMCID: PMC78993.
  52. 52. Jadad AR, Gagliardi A. Rating health information on the Internet: navigating to knowledge or to Babel? Jama. 1998;279(8):611–4. Epub 1998/03/05. pmid:9486757.
  53. 53. Kim P, Eng TR, Deering MJ, Maxfield A. Published criteria for evaluating health related web sites: review. BMJ (Clinical research ed). 1999;318(7184):647–9. Epub 1999/03/05. pmid:10066209; PubMed Central PMCID: PMC27772.
  54. 54. Sagaram S, Walji M, Meric-Bernstam F, Johnson C, Bernstam E. Inter-observer agreement for quality measures applied to online health information. Studies in health technology and informatics. 2004;107(Pt 2):1308–12. pmid:15361026.
  55. 55. Commission of the European Communities. eEurope 2002: Quality Criteria for Health Related Websites. Journal of medical Internet research. 2002;4(3):E15. Epub 2003/01/30. pmid:12554546; PubMed Central PMCID: PMC1761945.
  56. 56. Doak CC, Doak LG, Root JH. Teaching Patients with Low Literacy Skills. Am J Nurs. 1996;96(12):16M. 00000446-199612000-00022.
  57. 57. U.S. Department of Health & Human Services. Web and Usability Guidelines [August 18, 2016]. Available from:
  58. 58. Eichner J, Dullabh P. Accessible Health Information Technology (IT) for Populations with Limited Literacy:A Guide for Developers and Purchasers of Health IT. Maryland: Agency for Healthcare Research and Quality; 2007. Available from: