
Interventions to Assist Health Consumers to Find Reliable Online Health Information: A Comprehensive Review


Abstract

Background

Health information on the Internet is ubiquitous, and its use by health consumers prevalent. Finding and understanding relevant online health information, and determining content reliability, pose real challenges for many health consumers.


Objectives

To identify the types of interventions that have been implemented to assist health consumers to find reliable online health information, and where possible, describe and compare the types of outcomes studied.

Data Sources

PubMed, PsycINFO, CINAHL Plus and Cochrane Library databases; WorldCat and Scirus ‘gray literature’ search engines; and manual review of reference lists of selected publications.

Study Selection

Publications were selected by screening first the title, then the abstract, and finally the full text.

Data Extraction

Seven publications met the inclusion criteria and were summarized in a data extraction form. The form incorporated the PICOS (Population, Intervention, Comparators, Outcomes, and Study Design) model. Two eligible gray literature papers were also reported.

Data Synthesis

Relevant data from included studies were tabulated to enable descriptive comparison. A brief critique of each study was included in the tables. This review was unable to follow systematic review methods due to the paucity of research and humanistic interventions reported.


Limitations

While extensive, the gray literature search may have had limited reach in some countries. The paucity of research on this topic limits conclusions that may be drawn.


Conclusions

The few eligible studies predominantly adopted a didactic approach to assisting health consumers, whereby consumers were either taught how to find credible websites, or how to use the Internet. Common types of outcomes studied include knowledge and skills pertaining to Internet use and searching for reliable health information. These outcomes were predominantly self-assessed by participants. There is potential for further research to explore other avenues for assisting health consumers to find reliable online health information, and to assess outcomes via objective measures.


Introduction

There is a growing body of evidence that health consumers increasingly rely on the Internet for health information [1]–[8]. This increase is largely due to 1) the immense abundance of online information [9], [10], decision aids [11] and Web 2.0 health applications [10], [12], [13], 2) the increasing prevalence of chronic disease in society [14], and 3) the pervasiveness and accessibility of Information Technology in our daily lives [15].

While the role of health professionals in advising and assisting health consumers with decision making is undisputed [1], [16], [17], it is unrealistic, in terms of the health professional's time and ability, to control out-of-consultation behavior of consumers, and to expect consumers to rely entirely on written and verbal information provided by their regular practitioner(s). Indeed, health consumers empowered by information-seeking online may be more engaged in the management of their health conditions, and form more productive relationships with their healthcare practitioners [18].

To engage effectively in self-management, health consumers must be able to find, understand and utilize relevant health information [19]–[21]. At the same time, health consumers must also be able to discern reliable from less-reliable information. Collectively, these abilities and skills constitute the definition of an individual's health literacy [19]–[21]. The literature suggests that many health consumers have low levels of health literacy [21]–[26], which hinders their ability to effectively find, understand and use reliable health information to assist them to manage their health conditions. As use of the Internet requires technological knowledge and skills [9], finding health information online potentially becomes an even greater burden on consumers with poor computer literacy as well as low levels of health literacy.

Rationale and Objectives

Given the prevalence of chronic diseases in society and the important role self-management plays in chronic disease management, there appears to be a need for initiatives to assist health consumers to develop their capacity to find reliable health information on the Internet, thereby potentially contributing to improved levels of health literacy. Our comprehensive review follows a systematic review by Car et al. [9] that examined the effects of online health literacy training interventions on health outcomes. We intended to focus on the gamut of interventions (face-to-face, online or via other means) implemented by researchers to assist adult health consumers to autonomously find reliable online information about chronic health conditions, and where possible, their outcome measures. This includes, but is not limited to, training interventions.

As the focus of our review is on identifying and descriptively comparing humanistic interventions, as opposed to evaluating formal clinical trials, assessment of the quality of each study (as would be performed in a systematic review) was not formally conducted. Nevertheless, we adopted a systematic approach to the identification and selection of relevant studies, and applied methodological [27] and reporting [28] standards for the conduct of systematic reviews where practical.


Methods

The method was guided by the Methodological Standards for the Conduct of Cochrane Intervention Reviews [27], where relevant, recognizing the humanistic nature of the interventions of interest.

Eligibility Criteria

The eligibility criteria for published studies were developed via consultation between all authors of this review, and were:

  1. Participants: at least 18 years of age
  2. Intervention: any approach where the intention, primary or other, was to assist health consumers in autonomously finding existing reliable online information related to chronic health conditions
  3. Language: publications in English.

Publications were included if the study, or a portion of the study, met all of the above criteria. Studies that met criteria 2 and 3, but included some participants less than 18 years of age, were considered relevant. Studies were excluded if the primary intervention involved the development of new online health information material, such as producing easy-to-understand material for the purpose of assisting people with low health literacy. Studies were also excluded where assistance was provided to help participants use a particular existing online health website. While the reporting of outcomes for studies is a critical component to evaluating quality, and is desirable, studies that did not report outcomes were not excluded from this review as the primary objective of this review was to identify the types of interventions that have been implemented.

Information Sources

Four health-related databases and two popular search engines for ‘gray literature’ were utilized for this review. While the term ‘gray literature’ is poorly defined in literature, we defined it as studies published in a format other than academic journals. Reference lists of selected publications were also searched, and authors were contacted where clarification or more information was needed for data analysis.

The databases used were PubMed, PsycINFO, CINAHL Plus and Cochrane Library. Gray literature was searched via WorldCat and Scirus. No limits were applied to dates, although it was expected that most research in this field would be recent due to the relative recency of the Internet and Information Technologies. The last search was conducted on 11 February 2013.


Search Strategy

Search terms and search strings were agreed upon by all authors of this review, and further refined by the first author, KL, after pilot searching various permutations of search terms in PubMed and CINAHL Plus. To optimize efficiency and minimize the volume of irrelevant publications returned, a single search string was used for all databases and gray literature search engines. A second search, using MeSH terms, was conducted on the four databases. Because MeSH terms are manually assigned after a study has been published, a search using MeSH terms could potentially produce different results. Manual searching of reference lists from selected publications was subsequently conducted to ensure a more comprehensive search.

The following search terms were used to search all databases and gray literature search engines: consumer*; patient*; find*; search*; navigat*; seek*; access*; retriev*; locat*; identif*; “health literacy”; informat*; internet; online; web*.

An example of a search string that was used for this review is:

((consumer* OR patient*)) AND ((find* OR search* OR navigat* OR seek* OR access* OR retriev* OR locat* OR identif*)) AND “health literacy” AND informat* AND ((internet OR online OR web*))
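A search string of this shape can be assembled programmatically from the three term groups, which helps keep the query consistent across databases. The sketch below is illustrative only: the group names are our own, the parentheses and quoting are simplified, and real database interfaces differ in wildcard and quoting syntax.

```python
# Illustrative sketch: assemble the review's Boolean search string from
# its term groups. Group names are hypothetical; actual database query
# syntax (wildcards, quoting, nesting) varies by provider.

def or_group(terms):
    """Join a list of truncated terms into a parenthesized OR group."""
    return "(" + " OR ".join(terms) + ")"

population = ["consumer*", "patient*"]
finding = ["find*", "search*", "navigat*", "seek*", "access*",
           "retriev*", "locat*", "identif*"]
medium = ["internet", "online", "web*"]

search_string = " AND ".join([
    or_group(population),
    or_group(finding),
    '"health literacy"',
    "informat*",
    or_group(medium),
])

print(search_string)
```

Generating the string from one list of term groups means a change to any group (for example, adding a new truncated synonym) propagates to every database search without retyping the full query.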

Study Selection, Data Collection Process and Data Items

After retrieval of publications from the aforementioned databases and search engines, duplicate publications were removed.

A stepwise approach of selecting relevant publications, first via title, then abstract, then full text, was used, adapting relevant sections of the PRISMA flow diagram for reporting of systematic reviews [28]. In this approach, the titles of publications were first screened for relevance to the review, with relevant publications retained. Where relevance was unclear from the title alone, the abstract was then scanned, and irrelevant publications were subsequently removed. Publications for which both title and abstract had been examined and relevance was still unclear were retained for full-text review. Reference lists of selected publications were manually searched for further relevant sources.
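The stepwise selection described above can be pictured as a cascade of filters, where each pass examines more of the record than the last and only unclear cases move on. The sketch below is purely illustrative: the record fields and relevance checks are hypothetical, and in the review itself screening was performed manually by the authors.

```python
# Illustrative sketch of the title -> abstract -> full-text screening
# cascade. Record fields and the relevance checks are hypothetical;
# in the review itself, screening was done manually by the reviewers.

def screen(records, is_relevant):
    """One screening pass. is_relevant(record) returns True (keep),
    False (exclude), or None (unclear: defer to the next pass)."""
    kept, unclear = [], []
    for record in records:
        verdict = is_relevant(record)
        if verdict is True:
            kept.append(record)
        elif verdict is None:
            unclear.append(record)
        # verdict is False -> excluded outright
    return kept, unclear

# Toy records standing in for retrieved citations.
records = [
    {"title": "Teaching seniors to find reliable online health information"},
    {"title": "Train-the-trainer program for librarians"},
    {"title": "Community health workshop outcomes"},
]

def by_title(record):
    title = record["title"].lower()
    if "online health information" in title:
        return True          # clearly relevant from the title alone
    if "train-the-trainer" in title:
        return False         # a common exclusion reason in this review
    return None              # unclear: defer to abstract screening

included, needs_abstract = screen(records, by_title)
print(len(included), len(needs_abstract))  # prints "1 1"
```

The same `screen` function would be reused for the abstract and full-text passes with progressively better-informed relevance checks, mirroring the PRISMA-style flow of the review.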

Eligibility of publications was further ascertained by full-text review. Authors were contacted for supplementary information where necessary.

A data extraction form was developed to assist data collection and subsequent analysis. This form was developed by KL and reviewed by LE. All elements of the PICOS model [28] were incorporated into the form. As few relevant papers were anticipated, a formal trial of the form was not considered necessary, although refinements were incorporated during data extraction. KL extracted data from the included studies, and LE confirmed the extracted data. Consensus was reached on all points via discussion between KL and LE.

As this review primarily focuses on identifying types of interventions with a view to also identifying types of outcomes where possible, as opposed to examining the effects of these interventions on outcomes, risk of bias in individual studies and across studies was not assessed. It was anticipated that little methodological comparison would be viable.


Results

Study Selection

The literature search identified 707 publications after duplicates were removed (Figure 1). Of these, 680 publications were excluded through the screening process (via their titles and abstracts). Common reasons for exclusion were that the publications reported train-the-trainer interventions with no follow-up of the trainers subsequently coaching health consumers, interventions to help consumers find information on a particular website, studies about the various implications of poor health literacy, and exploratory studies of health information-seeking behaviors.

Figure 1. Applies principles of the PRISMA flow diagram template and outlines the process used to identify, screen, select, and analyze studies for this comprehensive review.

A total of 27 publications were reviewed in full to assess their eligibility for this review. To further assist in assessing eligibility of publications, 14 authors were contacted for supplementary information, with four replies. Overall, seven journal publications and two gray literature reports were deemed eligible for synthesis and analysis. Due to missing data in the gray literature reports, they are described in a separate section from the published studies.

Study Characteristics

Of the seven published studies [29]–[35], two [29], [32] were randomized controlled trials, and the remaining five [30], [31], [33]–[35] were uncontrolled, single-group trials (Table 1). The number of participants for one study [31] was unknown, while the number of participants for the remaining six studies ranged from 60 to 448. A convenience sampling method was used in all studies. Of the six studies conducted in the United States [29], [31]–[35], three [31], [33], [34] were located in regional or rural parts of the United States. The remaining study [30] was conducted in metropolitan Melbourne, Australia.

Table 1. Summary of Interventions, Outcomes, and Brief Critique – Published Studies.

The interventions for three [30], [31], [34] of the seven studies were administered in a single session. The total duration of studies with multiple-session interventions ranged from three weeks to nine months.

Regarding instruments used to measure outcomes, three studies [30], [32], [35] used adapted instruments and one study [29] utilized standard instruments for measuring biochemical and biometric markers of health outcomes in people with diabetes. In terms of the participant outcomes measured, one study [29] indirectly measured health outcomes, while the remaining six studies measured one or more humanistic indicators of health information knowledge, skills and behavior, for which measurements are not directly comparable. Demographic data for participants were reported in varying degrees for six studies [29]–[32], [34], [35]. Common demographic variables were age [29], [32], [34], [35], gender [29], [30], [32], annual income [29], [30], [32], [35], and level or years of education [29]–[32], [35].

Design of Interventions

Interactive workshops.

Interactive workshops featured as the main intervention in five [30]–[32], [34], [35] of the seven published studies. The style of the interactive workshop for one study [31] was unclear; however, four studies [30], [32], [34], [35] used a combination of a didactic approach to teaching and hands-on activities that involved participants searching for health information online. In most of the studies, participants were issued a copy of the presentation slides, as well as lists of credible health websites. Further, discussion amongst participants was encouraged during the workshops in three studies [30], [32], [35]. In particular, collaborative learning was emphasized in one [35] of the studies. Although the intervention for this study included a didactic component, the workshop trainer encouraged interaction between participants. In another study [31], a website hosted the online version of the in-person interactive workshop. This online version used the same content as the in-person workshop.

Common workshop topics included how to judge reliability of health information and credibility of health websites, and awareness of credible websites. Two studies [34], [35] used content from The Trainer's Toolkit from the NIHSeniorHealth website as the foundation of their workshops. Participants in one study [31] were provided information on their medical condition, in addition to being educated on how to find credible health information online.

Health literacy curriculum and community outreach.

In one study [33], a health literacy curriculum was designed and trialed in two middle schools, two high schools and one adult education program, delivered by teachers and librarians. The curriculum was designed as five one-hour lessons with activities, although teachers and librarians were allowed to deliver each lesson at their own pace. As a result, the duration of the intervention ranged from three weeks to three months. In the final lesson, participating students shared what they had learned with seniors in their community. In this way, seniors were taught by participating students how to find health information online.

The curriculum established that health information is commonly organized via disease type and population, and participants practiced search techniques. Students were also given a checklist adapted from the QUICK website for evaluating the credibility and reliability of online health information.
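A checklist of this kind can be represented simply as a list of yes/no questions with a score. The sketch below is purely illustrative: the questions shown are generic credibility prompts of our own devising, not the actual QUICK checklist items, which are not reproduced in this review.

```python
# Illustrative sketch of a yes/no credibility checklist, in the spirit
# of the adapted QUICK checklist described above. The questions are
# generic examples of our own; they are NOT the actual QUICK items.

CHECKLIST = [
    "Is it clear who wrote the information?",
    "Is the purpose of the website clear?",
    "Does the site say when the information was last updated?",
    "Are sources or references given for health claims?",
    "Is the site free of obvious advertising bias?",
]

def credibility_score(answers):
    """Given a dict mapping each question to True/False, return the
    fraction of checklist questions answered 'yes'."""
    return sum(bool(answers.get(q)) for q in CHECKLIST) / len(CHECKLIST)

# Toy evaluation of a hypothetical website.
answers = {
    CHECKLIST[0]: True,
    CHECKLIST[1]: True,
    CHECKLIST[2]: False,
    CHECKLIST[3]: True,
    CHECKLIST[4]: True,
}
print(credibility_score(answers))  # 4 of 5 'yes' answers -> 0.8
```

Framing the checklist as data rather than prose makes the scoring explicit, which is one way consumers (or evaluators) could compare websites consistently.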

Online portal with support via videoconferencing.

One study [29] took an arguably more holistic approach, in the development of an online portal to house three modules: self-management, health education, and social networking. The self-management module provided a space for each participant to access their individualized care plans. These plans were reviewed during the bi-weekly videoconferencing with a telehealth nurse, and were amended, if needed, by the participant's physician. During these videoconferences, participants were also encouraged to raise questions with the telehealth nurse. The health education module consisted of age and culturally-appropriate educational videos and links to various health-related websites. The social networking module provided a space for participants in the study to interact with one another, and participants were encouraged to share preferred educational resources.

Outcomes Studied

All seven published studies demonstrated either positive-significant or positive-but-non-significant results in at least some of the measured outcomes (Table 1). No study reported any worsening of outcomes from baseline/pre-intervention. Of the measured outcomes, many were self-reported. Two of the published studies [29], [31] included some form of objective measure: one study [29] used biometric and biochemical markers, while the other [31] used an objective test of knowledge. One study [32] conducted a follow-up assessment up to nine months post-intervention.

Knowledge and skills.

The breadth of knowledge and skill-oriented outcomes included participants' knowledge of their medical condition [29], awareness of what is evidence-based health information [30], [33], knowledge of credible online health resources [31], ability to find reliable online health information [30], ability to find relevant online health information [34], [35], ability to evaluate reliability of online health information [32], [33], and general computer/Internet knowledge and skills [35]. The majority of the knowledge outcomes were based on self-perceived measures of knowledge, with the exception of two studies [29], [31]. Carter et al. [29] applied a brief test of diabetes knowledge as part of a survey pre- and post-intervention, whereas Gross et al. [31] measured knowledge of credible online health resources via a pre-test and post-test questionnaire based on items covered in the interactive workshop. Two studies tested their participants' skills: one [32] tested participants' ability to evaluate reliability of online health information by instructing participants to rate two pre-selected websites on five dimensions of website quality, as specified by a previous study conducted by Kim et al. [36]; while the other [34] investigated participants' ability to find relevant online health information by instructing participants to find answers to two questions on a pre-selected health condition.

Attitude and behavior.

In one study [30], the key attitude and behavior-oriented outcome was participants' change in the way they searched for health information, and in another study [32], information-seeking self-efficacy. In both of these studies, attitudes and behaviors were assessed based on self-perceived measures.

Health outcomes.

One study [29] indirectly assessed health outcomes in patients with diabetes via measurement of biochemical and biometric markers for both treatment and control groups pre- and post-intervention: mean weight, mean blood pressure, and mean hemoglobin A1c.

Gray Literature Reports

The interventions, outcomes and brief critique of relevant aspects of the two gray literature reports are summarized in Table 2. Both reports relate to projects aimed at facilitating access to reliable health information for various communities in the United States. The Access to Resources for Community Health (ARCH) project [Schneider E. ARCH Evaluation Focus Groups. Boston (USA): Massachusetts General Hospital; 2009] focuses on providing access to online health information, whereas the Medline in the Mountains project [Carlson G. NN/LM Quarterly Report. Colorado (USA): Poudre Valley Health System; 2003] focuses on providing access to print and electronic health information resources and services.

Table 2. Summary of Interventions, Outcomes, and Brief Critique – Gray Literature Reports.

The use of group training sessions was common to both projects. The ARCH training focused on teaching general computer and Internet skills, and how to use their project's website to find reliable health information. The Medline in the Mountains training focused on teaching skills to evaluate reliability of health information, and how to use health information databases; their website appeared to supplement the training, as it hosted materials on how to evaluate reliability of health information, and listed various health information databases.

While both projects were conducted in the United States, the ARCH project was conducted in a metropolitan city, while the Medline in the Mountains initiative addressed various geographically isolated and rural communities. The ARCH project utilized focus groups to evaluate their outcomes, while a pre-/post-intervention survey was utilized in the Medline in the Mountains study, albeit with limited data reported.


Discussion

The seven published studies [29]–[35] included in this review presented a number of substantive design limitations, including small samples and the use of descriptive analysis. Outcomes were predominantly assessed via self-reported pre-post measures, which arguably have greater potential for bias [37]–[39]. The majority of these studies did not describe any validation process for the instruments used. Further, only two studies [29], [32] were randomized controlled studies. Despite this, four [30], [31], [33], [35] of the five remaining studies assessed either participants' attitudes or outcomes based on pre- and post-intervention measures; this facilitates a more meaningful comparison of the impact of the intervention. Nevertheless, only two [30], [35] of these four studies went beyond the use of descriptive statistics to assess the statistical significance of their findings. Thus, the design characteristics, analysis, and perceived overall quality of these studies highlight areas to be addressed in future research. Furthermore, the nature of the studies did not lend itself to systematic evaluation.

Our review of the intervention types identified that five studies [30]–[32], [34], [35] used workshops with varying levels of interactivity. Although a didactic approach to helping participants search for online health information was not the sole approach to teaching participants, it featured prominently in all of the included studies. Didacticism is a prominent, yet arguably outdated, philosophy in education [40]. Despite vast literature suggesting that student-centered pedagogical approaches may be more effective than didactic education in formal education settings [40]–[42], such approaches do not appear prominent in the education of health consumers, based on our findings, and it is unclear whether people with limited health literacy respond better to, or prefer, didactic approaches to learning. The concept of patient-centered care, whereby patients have active roles in healthcare and healthcare is based on the individual patient, is flourishing [43], [44]. A didactic approach to developing health consumers' online information-seeking skills may only partially address the issue, given the recognized role health consumers play in the day-to-day management of their health, and an era of fast-developing technologies and information abundance. Thus, there appears to be potential to apply various student-centered pedagogical approaches to educating health consumers. Further, a potential contributing factor to the positive results demonstrated by the five workshop-based studies is the engagement [45] between the workshop trainer and the participants. Thus, reproducibility of results could be problematic.

It is also important to note that, of the included studies, only one study [35] explicitly indicated the use of a particular learning framework – collaborative learning. There is therefore potential for future studies to examine different learning frameworks in the online health literacy space.

All of the seven studies measured varying facets of knowledge, skills or both domains; in the majority, knowledge and skill domains were assessed via measures self-reported by participants. While it is important to gain insight into participants' perceptions of the impact of the intervention, the use of objective tests would arguably improve objectivity and minimize potential for bias. Despite the apparent lack of objective tests in many of the included studies, it is important to note that there was no reported worsening of outcomes from baseline/pre-intervention in any of the included studies. This suggests that, while the effectiveness of didactic approaches is yet to be compared with that of other approaches, the identified interventions do not appear to have negative or detrimental outcomes.

In the two included gray literature reports, a didactic approach to helping participants search for reliable online health information was prominent, similar to the findings from the seven included published studies. The intervention for the first of these studies aimed to improve general computer and Internet skills, with a view to optimizing participants' use of a particular website developed specifically for the project. The second study focused on improving skills to evaluate the reliability of health information and the use of health information databases. These interventions also bear similarity to some of the included published studies. In terms of evaluation of outcomes, no objective measures were used.


Limitations

Few studies were deemed eligible for inclusion. It is possible that the use of additional search strings might have yielded more relevant studies. While the search strings were reviewed by all authors of this review (discipline experts), and advice had been sought from a medical librarian prior to the review, participation in the searches by a medical librarian may have provided a different perspective. While extensive, the gray literature search may have had limited reach in some countries. The finding of few relevant published studies on this topic highlights the potential for further research. Furthermore, the studies' varied application of scientific methods limited our ability to compare them systematically.


Conclusions

Despite the increasing pervasiveness of, and reliance on, the Internet in the area of health information, we found few reports of interventions to assist health consumers to find reliable health information online. The identified studies measured varying facets of knowledge and/or skills, and commonly reported a didactic approach to training their participants. Outcomes were predominantly assessed via participant self-report, and while arguably subjective, were largely positive, with no reported worsening of outcomes. As such, there is considerable scope for further research to enhance consumer health literacy in the context of sourcing online information. With the spread of mobile technologies and the popularity of social media, future research could focus on these developments. Specific interventions could focus on transparent labeling of trustworthy websites – or conversely, ‘blacklists’ of websites deemed biased or otherwise unreliable.

Additionally, as only one identified study explicitly indicated the use of a particular learning framework, there is also potential for further research to explore the use of various learning frameworks in the online health literacy space.

Author Contributions

Conceived and designed the experiments: KL KH JDH LME. Performed the experiments: KL. Analyzed the data: KL LME. Wrote the paper: KL KH JDH LME.


References

  1. 1. Andreassen HK, Bujnowska-Fedak MM, Chronaki CE, Dumitru RC, Pudule I, et al.. (2007) European citizens' use of E-health services: a study of seven countries. BMC Public Health 7: : 53. Available: Accessed 2014 Mar 17.
  2. 2. Attfield SJ, Adams A, Blandford A (2006) Patient information needs: pre- and post-consultation. Health Informatics J 12(2): 165–77
  3. 3. Bell RA, Hu X, Orrange SE, Kravitz RL (2011) Lingering questions and doubts: Online information-seeking of support forum members following their medical visits. Patient Educ Couns 85(3): 525–528
  4. 4. Davison BJ, Gleave ME, Goldenberg SL, Degner LF, Hoffart D, et al.. (2002) Assessing information and decision preferences of men with prostate cancer and their partners. Cancer Nurs 25: : 42–9. Available: Accessed 2014 Mar 17.
  5. 5. Fox S. (2011) Health Topics. Report from the Pew Internet and American Life Project. Washington, DC: Pew Research Center. Available: Accessed 1 Feb 2011.
  6. 6. Halpert A, Dalton CB, Palsson O, Morris C, Hu Y, et al. (2008) Patient educational media preferences for information about irritable bowel syndrome (IBS). Dig Dis Sci 53(12): 3184–90
  7. 7. Rice RE (2006) Influences, usage, and outcomes of Internet health information searching: multivariate results from the Pew surveys. Int J Med Inform 75: 8–28 Available:
  8. 8. McDaid D, Park AL. (2010) Online Health: Untangling the Web. The London School of Economics and Political Science. Available:
  9. 9. Car J, Lang B, Colledge A, Ung C, Majeed A. (2011) Interventions for enhancing consumers' online health literacy. Cochrane Database Syst Rev 6 . doi:10.1002/14651858.CD007092.pub2.
  10. 10. Metzger MJ, Flanagin AJ (2011) Using Web 2.0 Technologies to Enhance Evidence-Based Medical Information. J Health Commun 16(sup1): 45–58
  11. 11. Raats CJ, van Veenendaal H, Versluijs MM, Burgers JS (2008) A generic tool for development of decision aids based on clinical practice guidelines. Patient Educ Couns 73(3): 413–417
  12. 12. Ekberg J, Ericson L, Timpka T, Eriksson H, Nordfeldt S, et al. (2010) Web 2.0 systems supporting childhood chronic disease management: design guidelines based on information behaviour and social learning theories. J Med Syst 34(2): 107–17 Available:
  13. 13. Adams SA (2010) Blog-based applications and health information: two case studies that illustrate important questions for Consumer Health Informatics (CHI) research. Int J Med Inform 79(6): e89–96
  14. 14. Donald M, Ware RS, Ozolins IZ, Begum N, Crowther R, et al. (2011) The role of patient activation in frequent attendance at primary care: A population-based study of people with chronic disease. Patient Educ Couns 83(2): 217–221
  15. 15. Alpay L, Toussaint P, Zwetsloot-Schonk B. (2004) Supporting healthcare communication enabled by information and communication technology: can HCI and related cognitive aspects help? Paper presented at the Dutch Directions in HCI, Amsterdam.
  16. 16. European Commission. (2003) Survey of Online Health Information. Available:
  17. 17. Kennedy A, Gask L, Rogers A (2005) Training professionals to engage with and promote self-management. Health Educ Res 20: 567–78 Available:
  18. 18. Diaz JA, Griffith RA, Ng JJ, Reinert SE, Friedmann PD, et al. (2002) Patients' use of the Internet for medical information. J Gen Intern Med 17(3): 180–5 Available:
  19. Nutbeam D (2000) Health literacy as a public health goal: A challenge for contemporary health education and communication strategies into the 21st century. Health Promot Int 14(3): 259–267.
  20. Nutbeam D (2008) The evolving concept of health literacy. Soc Sci Med 67(12): 2072–2078.
  21. World Health Organization (2011) Health literacy and health behavior. Accessed 19 June 2011.
  22. Baker DW, Wolf MS, Feinglass J, Thompson JA, Gazmararian JA, et al. (2007) Health Literacy and Mortality Among Elderly Persons. Arch Intern Med 167: 1503–1509.
  23. Cutilli CC, Bennett IM (2009) Understanding the health literacy of America: results of the National Assessment of Adult Literacy. Orthop Nurs 28(1): 27–32; quiz 33–34. doi:10.1097/01.NOR.0000345852.22122.d6
  24. Manning DL, Dickens C (2006) Health literacy: more choice, but do cancer patients have the skills to decide? Eur J Cancer Care (Engl) 15(5): 448–452.
  25. Nielsen-Bohlman L, Panzer AM, Kindig DA (2004) Health literacy: a prescription to end confusion. Washington, DC: National Academies Press.
  26. Ferguson LA, Pawlak R (2011) Health literacy: the road to improved health outcomes. J Nurs Pract 7(2): 123–129.
  27. Chandler J, Churchill R, Higgins J, Lasserson T, Tovey D (2012) Methodological Expectations of Cochrane Intervention Reviews (MECIR): Methodological standards for the conduct of new Cochrane Intervention Reviews. The Cochrane Library.
  28. Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gøtzsche PC, et al. (2009) The PRISMA Statement for Reporting Systematic Reviews and Meta-Analyses of Studies That Evaluate Health Care Interventions: Explanation and Elaboration. PLoS Med 6(7): e1000100.
  29. Carter EL, Nunlee-Bland G, Callender C (2011) A patient-centric, provider-assisted diabetes telehealth self-management intervention for urban minorities. Perspect Health Inf Manag 8: 1b.
  30. Gray K, Elliott K, Wale J (2012) A community education initiative to improve using online health information: Participation and impact. Inform Health Soc Care. doi:10.3109/17538157.2012.705201
  31. Gross VA, Famiglio LM, Babish J (2007) Senior citizen access to trusted stroke information: a blended approach. J Consum Health Internet 11(2): 1–11.
  32. Kalichman SC, Cherry C, Cain D, Pope H, Kalichman M, et al. (2006) Internet-based health information consumer skills intervention for people living with HIV/AIDS. J Consult Clin Psychol 74(3): 545–554.
  33. Kurtz-Rossi S, Duguay P (2010) Health Information Literacy Outreach: Improving Health Literacy and Access to Reliable Health Information in Rural Oxford County Maine. J Consum Health Internet 14(4): 325–340.
  34. Susic J (2009) NIHSeniorHealth classes for senior citizens at a public library in Louisiana. J Consum Health Internet 13(4): 417–419.
  35. Xie B (2011) Older adults, e-health literacy, and Collaborative Learning: An experimental study. J Am Soc Inf Sci Technol 62(5): 933–946.
  36. Kim P, Eng TR, Deering MJ, Maxfield A (1999) Published criteria for evaluating health related web sites: review. BMJ 318(7184): 647–649.
  37. Masse LC, de Niet JE (2012) Sources of validity evidence needed with self-report measures of physical activity. J Phys Act Health 9(Suppl 1): S44–55.
  38. Cook TD, Campbell DT (1979) Quasi-experimental design and analysis issues for field settings. Chicago: Rand McNally College Publishing Company.
  39. Prince SA, Adamo KB, Hamel ME, Hardt J, Gorber SC, et al. (2008) A comparison of direct versus self-report measures for assessing physical activity in adults: a systematic review. Int J Behav Nutr Phys Act 5(56). doi:10.1186/1479-5868-5-56
  40. Johnes G (2006) Didacticism and Educational Outcomes. Edu Res Rev 1(2): 23–28.
  41. Eberlein T, Kampmeier J, Minderhout V, Moog RS, Platt T, et al. (2008) Pedagogies of engagement in science. Biochem Mol Biol Educ 36(4): 262–273.
  42. Sandholtz JH, Ringstaff C, Dwyer D (1997) Teaching with technology: Creating student-centred classrooms. New York: Teachers College Press.
  43. Bergeson SC, Dean JD (2006) A systems approach to patient-centered care. J Am Med Assoc 296: 2848–2851.
  44. McNutt RA (2004) Shared medical decision making. J Am Med Assoc 292: 2516–2518.
  45. Miles MB, Saxl ER, Lieberman A (1988) What Skills do Educational “Change Agents” Need? An Empirical View. Curriculum Inquiry 18(2): 157–193.
  46. Norman CD, Skinner HA (2006) eHEALS: The eHealth Literacy Scale. J Med Internet Res 8(4): e27.