
Evaluation of an online suicide prevention program to improve suicide literacy and to reduce suicide stigma: A mixed methods study

  • Mareike Dreier ,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Visualization, Writing – original draft, Writing – review & editing

    Affiliation Department of Medical Psychology, Center for Psychosocial Medicine, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

  • Julia Ludwig,

    Roles Conceptualization, Writing – review & editing

    Affiliation Institute of Medical Sociology, Center for Psychosocial Medicine, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

  • Martin Härter,

    Roles Conceptualization, Funding acquisition, Resources, Supervision, Writing – review & editing

    Affiliation Department of Medical Psychology, Center for Psychosocial Medicine, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

  • Olaf von dem Knesebeck,

    Roles Conceptualization, Funding acquisition, Resources, Supervision, Writing – review & editing

    Affiliation Institute of Medical Sociology, Center for Psychosocial Medicine, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

  • Farhad Rezvani,

    Roles Formal analysis, Methodology, Writing – review & editing

    Affiliation Department of Medical Psychology, Center for Psychosocial Medicine, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

  • Johanna Baumgardt,

    Roles Conceptualization, Investigation, Methodology, Writing – review & editing

    Affiliation Department of Psychiatry and Psychotherapy, Center for Psychosocial Medicine, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

  • Nadine Janis Pohontsch,

    Roles Formal analysis, Methodology, Supervision, Writing – review & editing

    Affiliation Department of General Practice and Primary Care, Center for Psychosocial Medicine, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

  • Thomas Bock,

    Roles Conceptualization, Funding acquisition, Resources, Supervision, Writing – review & editing

    Affiliation Department of Psychiatry and Psychotherapy, Center for Psychosocial Medicine, University Medical Center Hamburg-Eppendorf, Hamburg, Germany

  • Sarah Liebherz

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Writing – review & editing

    Affiliation Department of Medical Psychology, Center for Psychosocial Medicine, University Medical Center Hamburg-Eppendorf, Hamburg, Germany


Abstract

Low-threshold e-health approaches to reducing suicide stigma in prevention are scarce. We developed an online program containing video reports on lived experience of suicide and evidence-based information on suicidality and evaluated it using a mixed methods design. We examined pre-post changes among program completers (n = 268) in suicide literacy, suicide stigma (self-stigma and perceived stigma), and self-efficacy expectation of being able to seek support in psychologically difficult situations using linear mixed models. To examine reported changes and helpful program elements 12–26 weeks after program completion, we conducted a content analysis of transcripts of telephone interviews (n = 16). Compared to baseline, program completers showed greater suicide literacy (Cohen’s d = .74, p < .001), higher self-efficacy expectations to seek support (d = .09, p < .01), and lower self-stigma (subscales glorification/normalization: d = -.13, p = .04; isolation/depression: d = -.14, p = .04; stigma: d = -.10, p = .07; n = 168). We found no significant differences in perceived suicide stigma. We identified lived experience reports, the possibility of sharing one’s own narrative on stigma and suicidality, and information on support as helpful elements. The current online program can increase suicide literacy and self-efficacy expectations to seek support and reduce self-stigma. We recommend a larger randomized controlled trial with longer follow-up to confirm these findings.


Introduction

Worldwide, 703,000 people die by suicide every year [1]. Effective treatment options for underlying mental disorders and specifically for suicidality exist [2]. Help-seeking can prevent the aggravation of psychological problems and may even be vital to the survival of a suicidal person [3]. Nevertheless, many people affected by suicidality do not seek help [4]. Insufficient information on mental conditions and support offers, negative attitudes towards help-seeking, and the stigma associated with suicidality, i.e., internalized, anticipated, experienced, or perceived suicide stigma, may decrease help-seeking [5–12].

Suicide stigma and the attached taboo are obstacles in suicide prevention. Raising awareness is important to make progress in preventing suicides [13, 14]. In general, public stigma can be reduced by education [15] and contact between members of a stigmatizing and stigmatized group [16]. Self-stigma in persons with a mental condition can be reduced by education and psychoeducation, therapeutic approaches, empowering, and self-help [17, 18].

Mental health literacy, defined as the “knowledge and beliefs about mental disorders which aid their recognition, management or prevention”, enables individuals to seek appropriate help [19, 20]. Rates of help-seeking among people with suicidal ideation are low [21]. Suicidal ideation may not be perceived by the individual as a mental health problem [22]. A lack of knowledge about a mental health condition (e.g., how to recognize or prevent a mental disorder) may also contribute to stigma towards the affected group [10]. Increased literacy may lead to less stigma and more openness to seek treatment for mental disorders [23]. In a German population sample, suicide literacy was negatively associated with stigmatizing attitudes toward suicidal persons, so disseminating knowledge may help to reduce suicide stigma [24].

A person’s self-efficacy expectations, i.e., the judgement of one’s capability, partly explains the change or maintenance of certain behaviors [25]. In suicide prevention, communicating suicidality and seeking support can be an important behavior for individuals in a suicidal crisis and can ultimately be life-saving [2, 14]. Specific aspects of self-efficacy expectancy, such as confidence in being able to confide in others and being able to seek support in psychologically difficult situations, could be modifiable through a brief program [26, 27].

Online interventions are a low-threshold opportunity to reach many persons who cannot access care in traditional ways. A meta-analysis of internet-based programs, which included 16 studies, showed that online interventions can help to reduce suicidal ideation [28]. A systematic literature review found only seven published studies on online self-help interventions aimed at reducing self-stigma in persons with mental health problems, one of which targeted suicide attempt-related personal stigma [18]. Although mental illness stigma and suicide stigma overlap, there is some evidence of differences between these two stigma types [29]. To our knowledge, only a limited number of evaluated self-help online programs aim at reducing suicide stigma [30, 31] or improving suicide literacy [32]. Furthermore, the involvement of persons with a lived experience in suicide prevention interventions is scarce [33]. Therefore, we developed the German online suicide prevention program 8 Lives–Lived experience reports and facts on suicide, involving persons with a lived experience of suicide [34]. The aim of this article is to present the evaluation of this newly developed program by answering the following research questions:

  1. To what extent do suicide literacy, self-stigma and perceived suicide stigma, and self-efficacy expectation of being able to seek support in psychologically difficult situations of the participants change after completing the program compared to baseline?
  2. How do the participants evaluate the program and aspects of the program (e.g., overall satisfaction and helpful program elements) at completion and 12 to 26 weeks after program completion?
  3. What kind of changes, including adverse events or undesired side effects, do participants report 12 to 26 weeks after program completion?


Materials and methods

We used a mixed methods sequential explanatory design [35] to evaluate the online program. As described in our study protocol [27], the quantitative part included an online pre-post survey with participants of the online suicide prevention program to assess possible changes over the course of the program in terms of the main goals and satisfaction with the program. The qualitative part comprised follow-up telephone interviews with a subsample of online program completers to understand possible changes and participants’ evaluation of the program in more depth.

Ethics approval and consent to participate

The Ethics Committee of the Hamburg Medical Chamber approved this study on March 9, 2018 (process number: PV5750). All participants were informed in writing about the voluntary nature of their participation, about data protection, and about the possibility of terminating their participation in the online program or the telephone interview at any time. Participants were informed of the goals of the program, that it was not a crisis intervention program, and that it was not a substitute for on-site personal care. Referrals were made to regional and telephone support services. Participants in the online program provided consent by checking an online tick box. Participants in the telephone interviews provided written informed consent to participate. All participants provided informed consent for publication of the results in anonymized form.

Content, development process, and rationale

The content and development process of the unguided online program are described elsewhere [34]. The program was intended to be low-threshold and usable anonymously. An outline of the content of the program’s eight chapters can be found in S1 Table. The program development closely involved ten persons with lived experiences of suicide (“lived experience team”). The program is based on the Australian digital intervention The Ripple Effect [30, 36].

A distinctive consideration in suicide prevention is that suicide stigma, besides its many negative effects such as reduced help-seeking and self-deprecation, may also act as a protective factor preventing some individuals from taking their lives [37]. We therefore paid special attention not to normalize suicidality or suicides within the developed program, and we implemented permanent access to information about support services. In this regard, we also refer to the so-called “Papageno effect”: appropriate media coverage of suicides can prevent them by refraining from monocausal explanations and detailed descriptions of the circumstances and instead pointing out constructive ways of coping with crisis situations and professional support options [38], especially when the message is delivered by a person with personal experience [39]. In the program development, we drew on empowerment and recovery storytelling approaches by using video reports on lived experience of suicide. We also incorporated elements from cognitive behavioral therapy that participants could use optionally. However, the program does not have a therapeutic focus; we classify it as suicide prevention through education and awareness.

Eligibility criteria for participants

Criteria for participating in the online program were:

(1) at least 18 years of age,

(2) internet access and the ability to understand German language,

(3a) affected by suicidal ideation or a suicide attempt, or

(3b) affected as a close person, i.e., loss of a person by suicide or caring for a suicidal person, or

(3c) interested in the topic of suicidality in general.

Participants were assigned to one of five program variants, depending on the self-reported type of affectedness that currently affected them the most. The content, e.g., texts, videos of persons with a lived experience of suicide, and worksheets, varied accordingly. Age was the only fixed exclusion criterion. Language skills were not assessed. As the program was neither a therapeutic nor a crisis intervention program, the participants’ current level of suicidality was not assessed.


Recruitment

The program was available free of charge on the subdomain from December 19, 2019, to August 31, 2020. Study participants were recruited by various means, including teasers on the e-mental-health portal, e-mail appeals to multipliers (e.g., distribution lists of university clinics and self-help organizations) to spread the link to the study website, and references to the study on social media. Search engine optimization was performed. In addition to the online recruitment, posters, postcards, and notices were distributed in supermarkets, medical practices, and psychiatric facilities in two German cities (Hamburg and Berlin). A press release by the University Medical Center Hamburg-Eppendorf provided information on the study.

Delivery method and setting

Participants used their own devices to access the browser-based online program (e.g., smartphones, tablets). Chapters were unlocked successively. Chapters 1 and 2 contained the baseline assessment, chapter 8 contained the post assessment (see also S1 Table). When a participant completed all eight chapters, the participant could access all chapters again as well as the library, which contained all video reports, worksheets, and a gallery of “digital postcard messages” from other participants.

Exposure quantity, duration, and time span

We informed the participants that the program would take about 1.5 to 5 hours; they could divide this time freely, log out occasionally, and continue working at the chapter that was last saved. There was a note suggesting a break at the end of chapters 3–6. Web analytics (e.g., time spent on the eight chapters) were collected using the open-source web analytics tool Matomo; see S2 Table for results.

Activities to increase adherence

The use of financial incentives in health research is widely debated [40]. Given the sensitivity of the topic of suicidality and suicide, we decided that it was more appropriate not to use reminder emails, financial incentives, or other compensations to encourage individuals to participate in our study, or to encourage participants to continue using the program if they dropped out.


Quantitative approach: Pre-post comparison without control group.

The primary outcomes were suicide literacy and perceived suicide stigma, assessed online at baseline and post-intervention using the following scales:

Literacy of Suicide Scale–Short Form (LOSS-SF).

We used the translated German version of the LOSS-SF [24, 41] to assess suicide literacy. The scale includes twelve true/false statements concerning suicidality, covering the domains signs, risk factors, causes/nature, and treatment/prevention. Correctly answered statements were coded with a score of 1; incorrect or “I don’t know” responses were coded with 0. By summing the item scores, a total score can be calculated (0–12).
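The scoring rule above is simple to operationalize. As a minimal sketch in Python (rather than the SPSS/R tooling used in the study), with a hypothetical answer key — the real LOSS-SF key is not reproduced here:

```python
# Sum-score sketch for a LOSS-SF-style scale: a correct answer scores 1,
# an incorrect or "I don't know" response scores 0; the total ranges 0-12.

def score_loss_sf(responses, key):
    """Number of statements answered correctly ("I don't know" is None here)."""
    return sum(1 for response, correct in zip(responses, key) if response == correct)

# Hypothetical 12-item key (True/False per statement; illustrative only).
key = [False, True, False, True, True, False, False, True, False, True, True, False]
answers = [False, True, True, True, True, False, None, True, False, True, False, False]
print(score_loss_sf(answers, key))  # 9 of 12 statements answered correctly
```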

Stigma of Suicide Scale–Short Form (SOSS-SF) adapted: Perceived suicide stigma.

We used a translated and adapted German version of the SOSS-SF [42, 43] to assess perceived suicide stigma. Perceived suicide stigma is defined as a person’s belief about the general public’s attitude (stereotypes) towards a person who dies by suicide. Consistent with the evaluation of The Ripple Effect [30], we used the introductory statement “In general, other people think that a person who takes his or her own life is …”. The sixteen following descriptors were used as in the original: eight items assessing stigma (e.g., “irresponsible”) and four items each assessing isolation/depression (e.g., “lonely”) and normalization/glorification (e.g., “brave”). Participants’ agreement was measured on a 5-point Likert scale (strongly disagree to strongly agree). A principal component analysis with varimax rotation, based on an eigenvalue greater than 1, yielded a three-component solution that explained 57.7% of the variance. Scores were calculated for these three subscales. The internal consistency (Cronbach’s alpha) in our sample at baseline (N = 802) was α = .88 for the stigma subscale, α = .82 for isolation/depression, and α = .67 for normalization/glorification.
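The internal consistencies reported here follow the standard Cronbach's alpha formula, which compares the summed item variances with the variance of the sum score. A minimal sketch of that computation in Python with NumPy (the study itself used SPSS/R):

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_respondents x k_items) matrix of item scores."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_variances = x.var(axis=0, ddof=1)      # sample variance per item
    total_variance = x.sum(axis=1).var(ddof=1)  # variance of the sum score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Sanity checks on toy data: perfectly correlated items give alpha = 1,
# while items that cancel each other out give alpha = 0.
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))          # 1.0
print(cronbach_alpha([[0, 1], [1, 0], [1, 1], [0, 0]]))  # 0.0
```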

The secondary outcomes were self-stigma and self-efficacy expectations of being able to seek support measured pre- and post-completion of the online program:

Stigma of Suicide Scale–Short Form (SOSS-SF) adapted: Self-stigma.

To assess negative attitudes towards oneself because of one’s own suicidality, as a part of self-stigma, we used the same 16 descriptors as in the SOSS-SF but changed the introductory statement to “Because I had thoughts of taking my life, I feel …” [30]. Response categories were the same as in the perceived stigma scale. A principal component analysis with varimax rotation yielded a three-component solution that explained 55.5% of the variance, with the item “vengeful” loading substantially on the factor normalization/glorification (.38) in addition to the factor stigma (.25); we nevertheless assigned it to the stigma subscale. The internal consistency in our sample at baseline (N = 507) was Cronbach’s α = .81 for the subscale self-stigma, α = .84 for isolation/depression, and α = .78 for normalization/glorification. In contrast to our study protocol [27], we considered self-stigma a secondary outcome, as only participants in program variants 1 (suicidal ideation) and 2 (suicide attempt) answered this instrument.

Self-efficacy expectations of being able to seek support in psychologically difficult situations (SWEP-6 and SWEP-7).

A scale to measure self-efficacy expectations of being able to seek support in psychologically difficult situations, based on Bandura [25, 44], was newly developed for this study [26]. Using six (SWEP-6) or seven (SWEP-7) German items on an interval scale from 0 (“I do not feel confident at all”) to 10 (“I feel completely confident”), participants were asked to indicate the extent to which they feel confident to seek support, e.g., “I can seek professional support (e.g., physician, psychotherapist) when I need it.”. The additional SWEP-7 item “I feel confident that I can talk to someone about my suicidal thoughts.” was only administered in program variants 1 and 2. Cronbach’s α of the SWEP-6 scale in our sample (N = 802) was .89.

Distress thermometer.

As an additional measure we used the distress thermometer [45] in the pre- and post-assessment to capture current distress. Using a thermometer scaled from 0–10, participants were asked to indicate how much distress they felt during the past week, including today (0 = no distress, 10 = extreme distress).

Satisfaction with the program and helpful elements.

As an additional measure, we assessed satisfaction with the program after completion using 5-point Likert scales (strongly disagree to strongly agree, or not at all helpful to very helpful). In total, participants answered 21 items in the following domains: a) knowledge about suicidality and stigmatization (e.g., “By participating in the online program, I know the risk factors and protective factors of suicidality better.”); b) skills (e.g., “By participating in the online program, I can talk better to others about my lived experience of suicide.”); c) helpful elements of the program (e.g., “Please rate how helpful you found these elements of 8 Lives: Lived experience video reports”); d) satisfaction with the length of the program; e) sharing one’s own experiences (“This was the first time I shared my attitudes about suicide and/or lived experiences of suicide. Yes/No”); f) recommendation of the program (“Would you recommend the online program to others? Yes/No”).

Qualitative approach: Follow-up telephone interviews

At the end of the post-assessment, study participants who completed the online program could optionally leave their contact details in an online input form. These interested persons were contacted by e-mail approximately 6–12 weeks after completion of the online program. They received further written study information about the telephone interview and were asked to sign an informed consent form to participate in the interview. Twelve to 26 weeks after completing the online program, we conducted semi-structured follow-up telephone interviews, i.e., open evaluation interviews, with this subsample of completers to explore a) reasons for participating in the program, b) experiences with using the program, c) changes after participation in the program, d) evaluation of the program, and e) appropriate consideration of suicide stigma experiences in the program. We developed a semi-structured guide in a process that involved (1) collecting questions, (2) reviewing the questions with regard to prior knowledge and openness, (3) sorting the remaining questions and keywords, and (4) subsuming and bundling the questions, including finding a narrative prompt that is as simple as possible [46]. The guide can be found in S7 Table. The one-on-one telephone interviews were conducted by MD. Interviews were digitally audio-recorded and then transcribed for further analysis according to predefined transcription rules [47]. Outcomes are the developed coding tree (see S8 Table) and participants’ quotes. In S4 Table, we present further information based on the COREQ checklist [48].

Sample size for quantitative and qualitative approach

A conservative power calculation was carried out following the study on the digital intervention The Ripple Effect [27, 36]. As described in our study protocol, we planned a pre-post design without a control group; a sample size of 241 completers was necessary to identify an effect size of d = .20 with a power of .80 and a significance level of α = .05. In the a priori power analysis, we adjusted alpha for the two primary endpoints (SOSS-SF and LOSS-SF) [27]. Due to the exploratory design of the study and our aim to provide an online program that is accessible and available to all interested parties, we did not conduct a randomized controlled design. Primary and secondary outcomes were tested two-sided.
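The reported target of 241 completers is consistent with a two-sided paired t-test at d = .20, power .80, and a Bonferroni-split alpha of .05/2 = .025 for the two primary endpoints. A sketch of such a calculation via the noncentral t-distribution, in Python with SciPy (the original calculation may have used different software, so the exact n can differ by a participant or two):

```python
from math import sqrt
from scipy import stats

def paired_ttest_power(n, d, alpha):
    """Power of a two-sided paired (one-sample) t-test at effect size d."""
    df = n - 1
    nc = d * sqrt(n)  # noncentrality parameter under the alternative
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    return (1 - stats.nct.cdf(t_crit, df, nc)) + stats.nct.cdf(-t_crit, df, nc)

def required_n(d=0.20, power=0.80, alpha=0.05 / 2):
    """Smallest n reaching the target power; alpha split over two endpoints."""
    n = 5
    while paired_ttest_power(n, d, alpha) < power:
        n += 1
    return n

print(required_n())  # close to the 241 completers targeted in the protocol
```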

In terms of the qualitative part, we planned to conduct at least twelve interviews [49]. We used maximum variation sampling with respect to different program variants, genders, and age groups to collect data from as many different perspectives as possible [50].

Quantitative approach: Statistical methods

We calculated arithmetic means and standard deviations for primary and secondary outcomes at baseline for all participants. We performed chi-square tests, two-sided Fisher’s exact tests, and t-tests for independent samples to explore differences between completers and non-completers. This included comparisons regarding age, gender, years of education, size of residence, online program variant, and baseline data on primary, secondary, and additional outcomes. Our hypothesis was undirected; we tested whether there were differences at baseline. We used descriptive statistics to assess satisfaction with the online program at post-assessment. To explore pre-post differences in distress, we performed the t-test for dependent samples.
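The dependent-samples t-test on distress can be sketched as follows, in Python with SciPy on simulated toy data (the study itself used SPSS/R and real participant data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)

# Toy pre/post distress-thermometer ratings (0-10) for 50 hypothetical
# completers, with a slight average decrease built into the simulation.
pre = rng.integers(2, 11, size=50).astype(float)
post = np.clip(pre - rng.normal(loc=0.3, scale=1.0, size=50), 0, 10)

t_stat, p_value = stats.ttest_rel(pre, post)  # t-test for dependent samples
print(f"mean change = {(post - pre).mean():.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```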

Different from the study protocol, we did not use the t-test for dependent samples for primary and secondary outcomes. Instead, we used linear mixed models to calculate the difference in estimated marginal means (EMMs) from pre- to post-intervention for participants with complete data sets. Covariates included age, gender, education, size of residence, chosen variant of the online program (own affliction with suicidality), and distress since these factors can be associated with stigmatizing attitudes as well as suicide literacy and self-efficacy expectations [24, 51]. To check the robustness of the results, we also conducted a sensitivity analysis with the full data set by handling missing data with maximum likelihood estimation on all available data. For the four primary outcomes (LOSS-SF scale; perceived SOSS-SF subscales stigma, isolation/depression and normalization/glorification), we applied Bonferroni adjustment in the main analysis (completer only) to limit alpha error inflation due to multiple testing. Thus, differences on these outcomes were tested at an adjusted significance level of p ≤ .0125. For all other analyses, results with p ≤ .05 were considered statistically significant. Effect sizes for pre- to post-program changes (Cohen’s d) were calculated by dividing the estimated marginal mean differences by the standard deviation at baseline [52].
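The effect size convention above is d = (EMM pre-post difference) / (baseline SD) [52]. As a back-of-envelope illustration (the baseline SD of 2.5 used below is inferred from the reported LOSS-SF numbers, not stated in the text):

```python
def cohens_d(emm_difference, sd_baseline):
    """Pre-post effect size: estimated marginal mean difference over baseline SD."""
    return emm_difference / sd_baseline

# The reported LOSS-SF change of 1.85 items with d = .74 implies a baseline SD
# of about 1.85 / 0.74 = 2.5 (an inferred, illustrative value).
print(round(cohens_d(1.85, 2.5), 2))  # 0.74
```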

Finally, in the main analysis, we exploratively compared subgroups on primary and secondary outcomes. We predefined the following subgroups: Variant of online program (1–5), gender, age groups (e.g., 18–29 years), education (lower/higher), size of residence (e.g., city), and distress (lower/higher). First, we identified differences between subgroups by inserting an interaction term. Relevant differences were defined based on the p value (p < .05) of the interaction term. Second, we calculated the EMM pre-post difference within each subgroup. Due to the exploratory nature of the subgroup analyses, we did not adjust for alpha error inflation. For results, see S6 Table.

Participants had to answer every item to progress in the program, i.e., for quantitative data there were no missing values unless a participant dropped out. Analyses were performed with the statistics software IBM SPSS 27 and R version 4.2.1.

Qualitative approach and research paradigm

We analyzed the transcribed telephone interviews according to structuring qualitative content analysis [47], following a realist paradigm [53]. The analysis was performed in seven steps considering the research questions: (1) marking important text passages, (2) developing main themes, (3) coding the entire material, (4) compiling all text passages coded with the same main theme, (5) inductively determining subcodes from the material, (6) coding the complete material with the differentiated coding tree, and (7) further analyses. The first draft of the coding tree was developed by MD and was discussed and iteratively modified during the research process (JB, MD). Two researchers (MD, JB) independently coded main themes and subcodes for all transcripts using MAXQDA 18 and 20 (VERBI). To increase intersubjective reproducibility and comprehensibility, we presented and discussed the qualitative research twice with an interdisciplinary group of researchers in a qualitative methods seminar led by NP.

Integration of quantitative and qualitative findings

The integration of the quantitative and qualitative data (carried out by MD) is shown in a joint display, including descriptions of how the qualitative findings may help explain the quantitative findings [54]. A convergence assessment was made to display the findings obtained from each component [55]. There are four options: agreement, partial agreement, silence, or dissonance between the quantitative and qualitative results. Here, silence means that a result emerges only from the qualitative or only from the quantitative data.


Results

Participant flow

The sample size (N = 268) was achieved in 8.5 months and was slightly higher than the targeted sample size (N = 241). The dropout rate between baseline and post-assessment was 66.6% (Fig 1). Program variant 3 (loss by suicide) showed the highest dropout (73.1%) and program variant 5 (interested in the topic) the lowest (60.2%).

Fig 1. Flow chart.

1LOSS-SF: Literacy of Suicide Scale, 2SOSS-SF: Stigma of Suicide Scale, 3SWEP: Self-efficacy expectations of being able to seek support in psychologically difficult situations, 4Distress thermometer, 5Self-developed instrument.

Recruitment setting

The study completers stated that their attention was drawn to the study via the e-mental-health portal (n = 79, 29.5%), through search engines (n = 50, 18.7%), recommendations from friends, family members or acquaintances (n = 48, 17.9%), references on social media (n = 32, 11.9%), another website dealing with suicide (n = 28; 10.4%), references from support groups (n = 4, 1.5%), newspapers (n = 4, 1.5%), and through other means (n = 23, 8.6%).

Periods between pre-, post- and follow-up-assessment

The time between pre- and post-assessment varied between participants depending on how long it took them to complete the online program. Between the end of the baseline survey and the end of the post-survey, participants (N = 268) took a median of 2.5 hours (M = 102.2 hours, SD = 378.8; min: 0.3 hours, max: 3,669 hours). 70.1% of participants completed the program and the pre-post assessment within one day (Fig 2). Follow-up telephone interviews (n = 16) were conducted a mean of 13.3 weeks (SD = 6.9; range: 3.6–31.1 weeks; median: 12.6 weeks) after the post-assessment.

Fig 2. Period between pre- and post-assessment by proportion of completers (N = 268).

Baseline data

As shown in Table 1, mainly women (n = 647, 78.2%), persons with a higher education level (n = 594, 71.8%), and persons living in a larger city (n = 417, 50.4%) participated in the online program. Most of the participants reported having suicidal ideation (n = 337, 40.7%). There were no significant differences between participants who completed the study and non-completers regarding sociodemographic variables or distribution across the program variants. In the baseline assessment, we found significant differences between completers and non-completers in distress (non-completers showed higher distress), in two subscales of self-stigma (non-completers showed higher scores in stigma and in isolation/depression), and in one subscale of perceived stigma (non-completers showed higher scores in the normalization/glorification subscale).

Table 1. Baseline data for total sample and compared between completers and non-completers.

Quantitative approach: Pre-post-comparison

Table 2 shows the estimated marginal means (EMM) differences in primary and secondary outcomes in two data sets using linear mixed models: 1) subset of participants who completed the post-program survey (completer only; main analysis); and 2) full data set with missing data handled using maximum likelihood estimation (sensitivity analysis).

Table 2. Pre-post comparison in primary and secondary outcomes.

Primary and secondary outcomes.

Participants of the online program showed a moderate to large, significant increase in suicide literacy at post-assessment (p < .001); on average, participants answered 1.85 more LOSS-SF items correctly compared to baseline. The largest increases in literacy between baseline and post-assessment were shown for the items “A suicidal person will always be suicidal and entertain thoughts of suicide (False)”, “There is a strong relationship between alcoholism and suicide (True)”, and “People who talk about suicide rarely kill themselves (False)”. For perceived stigma (SOSS-SF), participants showed no significant pre-post differences on any of the three subscales (stigma: p = .62, isolation/depression: p = .69, normalization/glorification: p = .07).

For self-stigma (SOSS-SF), participants showed a small significant decrease compared to baseline in the subscales isolation/depression (p = .04; d = -.14) and normalization/glorification (p = .04; d = -.13). For self-efficacy expectations of being able to seek support (SWEP), participants showed a small significant increase (p < .01; d = .09) after completing the online program.

Additional measures.

Distress decreased slightly by .28 points between pre- and post-assessment (95% CI: -.43 to -.13; t = 3.6) in completers (pre-assessment: M = 6.37, SD = 2.72; post-assessment: M = 6.09, SD = 2.75).

Satisfaction with the program at post-assessment.

240 of the 268 completers (89.6%) would recommend the online program to others. Of all completers, 180 (67.2%) found the length of the program just right, 72 (26.9%) too long, and 16 (6.0%) too short. 136 completers (50.7%) stated that, within the online program, they shared their attitudes about suicide and/or lived experiences of suicide for the first time. Participants wrote a total of 212 digital postcard messages, i.e., personal narratives on suicidality and stigma.

Fig 3 shows participants’ feedback on the online program at post-assessment.

Fig 3. Feedback and helpful elements of the online program at post-assessment (N = 268).

Self-developed items on a 5-point Likert scale at post-assessment (N = 268). The five response categories (fully agree, rather agree, neutral, rather disagree, fully disagree) are grouped into three categories in the figure. *n = 168.

Qualitative approach: Follow-up telephone interviews

44 participants (16.4% of the completers) agreed to be contacted for an additional telephone interview for further evaluation of the program. To reach a maximum variation sample of at least 12 participants from the 44 potential interviewees, we contacted the first 30 by mail. Of those contacted, 13 (43.3%) did not respond and one person (3.3%) cancelled because she was no longer interested in an interview. We conducted interviews with 16 participants between June and September 2020, a response rate of 53.3% among those contacted. As shown in Table 3, the sample shows good variation in characteristics concerning gender and program variant. The median program completion time of the interviewed participants was 3 days (M = 6.75 days, SD = 10, range 0–40 days) and thus considerably longer than the median of all completers (Fig 2).

Table 3. Demographic data of participants, program variant, and duration of follow-up telephone interviews.

Qualitative approach: Main results

Tables 4 and 5 present the results of the qualitative analysis for the subcodes of evaluation and achievement of program objectives. Descriptions of the corresponding subcodes and example quotes from participants illustrate the results. The entire coding tree, including all identified codes (motivations for participating in the program, access to the program, prior knowledge or experience regarding suicide or suicidality and stigma, experiences during program use, feedback and ideas for improvement, and potential mechanisms of action), is provided in S8 Table.

Table 4. Identified subcodes of main code “achievement of program objectives” and example quotes from the 16 follow-up telephone interviews 12–26 weeks after program completion.

Table 5. Identified subcodes of main code “overall evaluation” and example quotes from the 16 follow-up telephone interviews 12–26 weeks after program completion.

Interviewees reported an increase in their suicide literacy but only subtle changes in suicide stigma and in self-efficacy expectations of being able to seek support in psychologically difficult situations (see Table 4). Participants also reported other changes after program participation, e.g., changes in actual or intended action.

Overall, interview participants expressed satisfaction with the program and evaluated it positively (see Table 5). Lived experience video reports were highlighted as a particularly helpful element.

No adverse events during or after completing the online program were reported in the telephone interviews. However, some interviewees reported that the online program was sometimes exhausting; in particular, some video reports were experienced as emotionally distressing. One participant reported a deterioration in mood for several days, although she did not attribute this causally to the program.

Three of the sixteen interviewed participants with suicidal ideation stated that they sought help (clinic, outpatient clinic, counselling) after program completion, and two clearly attributed this to the program. Two interviewed participants who had survived a suicide attempt stated that the program helped them to better deal with and understand the suicide attempt.


Table 6, a joint display, shows that the quantitative and qualitative results are mainly in agreement.

Table 6. Joint display including a convergence assessment of quantitative and qualitative evaluation results of the online suicide prevention program 8 lives.


This study provided initial evidence that an online suicide prevention program can enhance suicide literacy among participants, reduce self-stigma, and promote self-efficacy expectations of being able to seek support in psychologically difficult situations. However, only minor changes were observed in the latter two outcomes. We found no significant differences between pre- and post-assessment in perceived suicide stigma. In follow-up telephone interviews, participants reported that video reports of persons with lived experience of suicide were particularly memorable. Interviewees emphasized the openness and authenticity of the video reports, as well as their demonstration of how to talk about suicidality and their conveying of hope. The program prepared participants for communication about suicidality. The encouragement of self-reflection through the interactive elements, e.g., the possibility to anonymously share one's own experience of suicide, was considered helpful because it involved participants more actively. This is in line with a recent review which found that digital interventions can improve help-seeking for mental health problems, especially when they promote active participation by sharing one's own narrative [56]. Participants in our study also emphasized the importance of being able to control the degree of confrontation with the topic of suicidality.

We found no substantial change in perceived stigma between pre- and post-assessment. One reason could be that participants' beliefs about the attitudes of the general public are stable, or at least do not change in response to a short intervention. We cannot rule out that some participants may even have become more aware of suicide stigma in the course of the program, as we found a small increase in some subgroups. In principle, an increase in perceived suicide stigma through engagement with the issue is conceivable [30], so no change after an online suicide prevention program may also be a desirable effect, particularly in combination with a decrease in self-stigma. In the sensitivity analysis, we found a small reduction in the normalization/glorification subscale of perceived stigma, which is in line with our intention that the program should neither normalize nor glorify suicidality and suicide.

Results compared to other studies

Evaluation results of the online program The Ripple Effect, on which the 8 lives program is based, showed no significant changes in suicide literacy and stigma outcomes, except for an unexpected increase on the perceived stigma glorification/normalization subscale at post-assessment [30]. In our sample, there was no significant difference in this outcome; descriptively, we found a decrease at post-assessment in the overall sample and also in the subgroup analysis for participants affected by suicidality. The difference in evaluation results could be due to the different target samples (male farmers vs. a broad target group) and to the different program content. In terms of suicide literacy, participants of The Ripple Effect showed higher baseline values (M = 9.82, SD = .22) than our sample, so that a ceiling effect cannot be ruled out. Compared to a representative population survey in Germany [24] (M = 7.00, SD = 2.14), participants in our sample showed slightly higher suicide literacy at baseline (M = 7.76, SD = 2.5); nevertheless, suicide literacy still increased at post-assessment, corresponding with other results from online psychoeducational programs [32].

An experimental study of a video showing a person's individual recovery story after a suicide attempt found minimal effects on treatment-seeking attitudes [57], which is in line with our qualitative findings. A study of a peer-led face-to-face intervention targeting suicide-specific self-stigma among suicide attempt survivors showed promising effects [58]. That study also pointed out that disclosure of one's suicide attempt can be associated with various risks. The online program 8 lives addressed the advantages and disadvantages of disclosing experiences of suicidality or suicide; it can therefore only be compared to a limited extent with an intervention developed specifically for disclosure. Nevertheless, our study suggests that an online program may prepare users for informed decision-making on disclosing lived experience of suicide, and thus may encourage help-seeking. However, an online program is ultimately no substitute for a face-to-face encounter [59].

Possible side effects

The dropout rate was 66.6%, higher than expected. Internet-based psychological interventions often show high dropout rates, especially unguided interventions [60]. The unguided online program presented here is not a therapeutic intervention and had a low threshold for participation, but at the same time it was broad in scope and comparatively complex. The lived experience video reports were described in some telephone interviews as emotionally distressing, which is consistent with prior findings [61]. The online program 8 lives mentioned no concrete suicide methods or related details. Nevertheless, engagement with the topic of suicidality might have been too intense, especially for participants who reported higher levels of distress.

We decided not to exclude persons with a higher symptom burden from the program per se. We transparently informed participants about the program aims, that the program was neither therapeutic nor suitable in an acute crisis, and where to find local support services. Nevertheless, persons with a high distress level at baseline participated in the program. Based on the baseline comparison and reports in the interviews, it is conceivable that some of these participants dropped out of the program because the confrontation was too demanding or the program did not meet their needs. Interview participants with higher emotional distress reported that the program did not cause them lasting distress; no severe adverse events were reported.

Strengths and limitations

A strength of our study was its mixed methods approach, combining statistical trends with the personal experiences of program participants [35]. This approach allowed us to generate hypotheses about what may have contributed to dropout and about potential mechanisms of action. We were also able to qualitatively capture changes at follow-up, both desired and undesired, which were not determined in advance.

Our study had several limitations. Firstly, the study design without a control group did not allow any causal conclusions. Secondly, we could not verify data based on participants' online self-reports. Thirdly, we could not determine whether participants actually watched the videos or read the texts. However, the web analysis showed longer times spent in the program, and given the increase in literacy at post-assessment it seems implausible that participants did not interact with the material. Fourthly, the time period between pre- and post-assessment varied considerably. For about 70% of the participants, the post-assessment took place within 24 hours of the pre-assessment. A longer follow-up would be desirable to see whether literacy effects persist. Measurement on a single day has advantages (e.g., less fluctuation due to changing states) but also disadvantages (e.g., program content may not yet have been fully processed, so that no changes are reported, or memory effects may intervene). Further, of all participants who started the program, only a third completed it, i.e., the dropout rate was high. Although we accounted for dropouts methodologically, the pre-post results are only generalizable to a limited extent to those who dropped out. Other effects on participants who dropped out cannot be ruled out based on the available data. In the qualitative evaluation, we were able to cover a broad range of statements, but we had no possibility to interview participants who dropped out of the program. For a better dropout analysis, a systematic survey of reasons for dropping out, as well as of participants' needs regarding the program, should have been conducted among all who dropped out (e.g., via e-mail and interviews). Finally, the quantitative evaluation captured perceived suicide stigma referring to persons who had taken their lives, and self-stigma of participants who had suicidal ideation or had survived a suicide attempt.
We did not measure other stigma types (public, anticipated, and experienced stigma, or stigma by association). When interpreting the quantitative results, it should also be considered that the SOSS-SF [42] mainly asks about stereotypes, while emotional and behavioral components are missing [62]. We did not assess other relevant stigma outcomes [63–66], as the questionnaire would otherwise have been too long relative to the overall length of the program.


Reducing stigma is complicated, especially in suicide prevention, but necessary so that persons can get the support they need [14]. An online program may prepare users for informed decision-making on disclosure. Encounters with people with lived experience are essential in stigma reduction [67], but possible emotional distress for program participants must be considered. To address the feedback that lived experience reports can be distressing, we formulated a more detailed, yet as concise as possible, advisory note.

For future online suicide prevention programs, we recommend stronger tailoring, as this could increase program satisfaction and may decrease emotional distress. For example, a program could take into account the desired degree of confrontation (e.g., only text reports, or only video reports of persons with similar kinds of experiences) in combination with current distress (e.g., adapting the sequence of modules, pointing out offers of help in case of high distress, or introducing safety plans). It is also conceivable to consider gender, age, whether a psychiatric diagnosis is present and the type of diagnosis (e.g., information and reports of persons with the same diagnosis), the time passed since suicidal ideation or a suicide attempt, the time passed since the loss of a person to suicide, the kind of relationship to the person who died (e.g., video reports of persons with the same kind of relationship), and the current level of suicide literacy (e.g., the amount of basic literacy information needed). However, this would mean a larger development effort and require considerably more resources than were feasible in our pilot project.

The emotional activation in this online suicide prevention program (e.g., experiencing sadness, guilt, or shame) could lead to a need for exchange with others (online or face-to-face). The development of initiatives that combine online suicide prevention programs, including antistigma elements, with the opportunity to meet a group consisting of experts by experience, their relatives, and clinicians (trialogue), or with on-site therapy groups, would be desirable. Such an approach would combine the advantages of the online world (high reach, low threshold, own pace, anonymity) with those of the offline world (actual encounters with peers, direct exchange, support services). An online suicide prevention program planned at the national level could serve as a starting point for smaller local antistigma initiatives.

Online interventions developed to reduce suicidal ideation [28] could consider integrating an add-on module on suicide stigma (e.g., if a person is interested in the topic and has reached sufficient stabilization) without normalizing suicidality. Such online interventions would require more supervision and guidance by a therapist.


Our results suggest that an increase in suicide literacy and in self-efficacy expectations of being able to seek support, as well as a decrease in self-stigma, can be achieved through an online program that shows video reports of persons with lived experience of suicide emphasizing hope and recovery. However, the effects on self-stigma and self-efficacy expectations were only small. Lived experience video reports were an essential element of the online suicide prevention program, as were the interactive elements (e.g., sharing one's own experience of suicide). It should be noted that video reports from persons with lived experience of suicide, and also the general engagement with the topic, can be emotionally demanding. We therefore advise that in future programs participants should be able to control the intensity of confrontation, e.g., through stronger tailoring.

Supporting information

S1 Table. Content of the online program and variants.


S4 Table. Consolidated criteria for reporting qualitative studies (COREQ).


S5 Table. Good Reporting of A Mixed Methods Study (GRAMMS) checklist.


S7 Table. Semi-structured guide for telephone interviews.



We would like to thank all participants of the online suicide prevention program. We would like to thank the lived experience team consisting of members of Irre menschlich Hamburg e.V.: Andre, Birgit, David, Gebke, Jenny, Ralf, Regina, among others. Finally, we would like to thank Prof. Susan Brumby and Dr. Alison Kennedy for their collaboration and the permission to use The Ripple Effect as a model for our program.


1. World Health Organization. Suicide worldwide in 2019: global health estimates. Geneva. 2021.
2. Zalsman G, Hawton K, Wasserman D, van Heeringen K, Arensman E, Sarchiapone M, et al. Suicide prevention strategies revisited: 10-year systematic review. The Lancet Psychiatry. 2016;3(7):646–59. pmid:27289303
3. Reynders A, Kerkhof AJ, Molenberghs G, Van Audenhove C. Attitudes and stigma in relation to help-seeking intentions for psychological problems in low and high suicide rate regions. Soc Psychiatry Psychiatr Epidemiol. 2014;49(2):231–9. pmid:23896893
4. Bruffaerts R, Demyttenaere K, Hwang I, Chiu WT, Sampson N, Kessler RC, et al. Treatment of suicidal people around the world. Br J Psychiatry. 2011;199(1):64–70. pmid:21263012
5. Reynders A, Kerkhof AJ, Molenberghs G, Van Audenhove C. Help-seeking, stigma and attitudes of people with and without a suicidal past. A comparison between a low and a high suicide rate country. J Affect Disord. 2015;178:5–11. pmid:25770477
6. Hogge I, Blankenship P. Self-concealment and suicidality: Mediating roles of unmet interpersonal needs and attitudes toward help-seeking. Journal of Clinical Psychology. 2020;76(10):1893–903. pmid:32399988
7. Angermeyer MC, van der Auwera S, Carta MG, Schomerus G. Public attitudes towards psychiatry and psychiatric treatment at the beginning of the 21st century: a systematic review and meta-analysis of population surveys. World Psychiatry. 2017;16(1):50–61. pmid:28127931
8. Calear AL, Batterham PJ, Christensen H. Predictors of help-seeking for suicidal ideation in the community: risks and opportunities for public suicide prevention campaigns. Psychiatry Research. 2014;219(3):525–30. pmid:25048756
9. Clement S, Schauman O, Graham T, Maggioni F, Evans-Lacko S, Bezborodovs N, et al. What is the impact of mental health-related stigma on help-seeking? A systematic review of quantitative and qualitative studies. Psychol Med. 2015;45(1):11–27. pmid:24569086
10. Corrigan PW. How stigma interferes with mental health care. American Psychologist. 2004;59(7):614–25. pmid:15491256
11. Oexle N, Ajdacic-Gross V, Kilian R, Muller M, Rodgers S, Xu Z, et al. Mental illness stigma, secrecy and suicidal ideation. Epidemiol Psychiatr Sci. 2017;26(1):53–60. pmid:26606884
12. Han J, Batterham PJ, Calear AL, Randall R. Factors Influencing Professional Help-Seeking for Suicidality. Crisis. 2018;39(3):175–96. pmid:29052431
13. World Health Organization. Fact Sheet on Suicide. 2021.
14. World Health Organization. Preventing suicide: A global imperative. Geneva: World Health Organization; 2014.
15. Jorm AF. Mental health literacy: empowering the community to take action for better mental health. American Psychologist. 2012;67(3):231–43. pmid:22040221
16. Pettigrew TF, Tropp LR. A meta-analytic test of intergroup contact theory. J Pers Soc Psychol. 2006;90(5):751–83. pmid:16737372
17. Alonso M, Guillén AI, Muñoz M. Interventions to Reduce Internalized Stigma in individuals with Mental Illness: A Systematic Review. The Spanish Journal of Psychology. 2019;22:E27. pmid:31084665
18. Mills H, Mulfinger N, Raeder S, Rüsch N, Clements H, Scior K. Self-help interventions to reduce self-stigma in people with mental health problems: A systematic literature review. Psychiatry Research. 2020;284:112702. pmid:31839418
19. Jorm AF, Korten AE, Jacomb PA, Christensen H, Rodgers B, Pollitt P. “Mental health literacy”: a survey of the public’s ability to recognise mental disorders and their beliefs about the effectiveness of treatment. Medical Journal of Australia. 1997;166(4):182–6. pmid:9066546
20. Waldmann T, Staiger T, Oexle N, Rüsch N. Mental health literacy and help-seeking among unemployed people with mental health problems. Journal of Mental Health. 2020;29(3):270–6. pmid:30862221
21. Michelmore L, Hindley P. Help-Seeking for Suicidal Thoughts and Self-Harm in Young People: A Systematic Review. Suicide and Life-Threatening Behavior. 2012;42(5):507–24. pmid:22889130
22. Batterham P, Calear A, Christensen H. Correlates of suicide stigma and suicide literacy in the community. Suicide and Life-Threatening Behavior. 2013;43(4):406–17. pmid:23556504
23. Jung H, von Sternberg K, Davis K. The impact of mental health literacy, stigma, and social support on attitudes toward mental health help-seeking. International Journal of Mental Health Promotion. 2017;19(5):252–67.
24. Ludwig J, Dreier M, Liebherz S, Härter M, von dem Knesebeck O. Suicide literacy and suicide stigma–results of a population survey from Germany. Journal of Mental Health. 2021:1–7. pmid:33522335
25. Bandura A. Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review. 1977;84(2):191–215. pmid:847061
26. Dreier M, Ludwig J, Baumgardt J, Bock T, von dem Knesebeck O, Härter M, et al. Entwicklung und psychometrische Überprüfung eines Kurzfragebogens zur Selbstwirksamkeitserwartung im Umgang mit psychisch belastenden Situationen (SWEP). Psychiatr Prax. 2022 (EFirst).
27. Dreier M, Ludwig J, Härter M, von dem Knesebeck O, Baumgardt J, Bock T, et al. Development and evaluation of e-mental health interventions to reduce stigmatization of suicidality–a study protocol. BMC Psychiatry. 2019;19(1):152. pmid:31101103
28. Torok M, Han J, Baker S, Werner-Seidler A, Wong I, Larsen ME, et al. Suicide prevention using self-guided digital interventions: a systematic review and meta-analysis of randomised controlled trials. The Lancet Digital Health. 2020;2(1):e25–e36. pmid:33328037
29. Sheehan L, Dubke R, Corrigan PW. The specificity of public stigma: A comparison of suicide and depression-related stigma. Psychiatry Res. 2017;256:40–5. pmid:28622573
30. Kennedy A, Brumby S, Versace V, Brumby-Rendell T. The ripple effect: a digital intervention to reduce suicide stigma among farming men. BMC Public Health. 2020;20(1):813. pmid:32471501
31. Taylor-Rodgers E, Batterham PJ. Evaluation of an online psychoeducation intervention to promote mental health help seeking attitudes and intentions among young adults: randomised controlled trial. J Affect Disord. 2014;168:65–71. pmid:25038293
32. Han J, Batterham PJ, Calear AL, Wu Y, Xue J, van Spijker BAJ. Development and pilot evaluation of an online psychoeducational program for suicide prevention among university students: A randomised controlled trial. Internet Interventions. 2018;12:111–20. pmid:30135775
33. Watling D, Preece M, Hawgood J, Bloomfield S, Kõlves K. Developing an Intervention for Suicide Prevention: A Rapid Review of Lived Experience Involvement. Archives of Suicide Research. 2020:1–16. pmid:33073734
34. Dreier M, Baumgardt J, Bock T, Härter M, The 8 Lives Team, Liebherz S. Development of an online suicide prevention program involving people with lived experience: ideas and challenges. Research Involvement and Engagement. 2021;7:60. pmid:34496972
35. Creswell J, Plano Clark V. Designing and Conducting Mixed Methods Research. 3rd ed. Sage; 2017.
36. Kennedy AJ, Versace VL, Brumby SA. Research protocol for a digital intervention to reduce stigma among males with a personal experience of suicide in the Australian farming community. BMC Public Health. 2016;16(1):1204. pmid:27899094
37. Chen JA, Courtwright A, Wu KC. The Role of Stigma and Denormalization in Suicide-Prevention Laws in East Asia: A Sociocultural, Historical, and Ethical Perspective. Harv Rev Psychiatry. 2017;25(5):229–40. pmid:28696950
38. Niederkrotenthaler T, Voracek M, Herberth A, Till B, Strauss M, Etzersdorfer E, et al. Role of media reports in completed and prevented suicide: Werther v. Papageno effects. Br J Psychiatry. 2010;197(3):234–43. pmid:20807970
39. Niederkrotenthaler T, Till B. Effects of suicide awareness materials on individuals with recent suicidal ideation or attempt: online randomised controlled trial. The British Journal of Psychiatry. 2020;217(6):693–700. pmid:31843026
40. Manriquez Roa T, Biller-Andorno N. Financial incentives for participants in health research: when are they ethical? Swiss Med Wkly. 2022;152:w30166. pmid:35315269
41. Calear AL, Batterham PJ, Trias A, Christensen H. The Literacy of Suicide Scale. Crisis. 2021. pmid:34128704
42. Batterham PJ, Calear AL, Christensen H. The Stigma of Suicide Scale. Psychometric properties and correlates of the stigma of suicide. Crisis. 2013;34(1):13–21. pmid:22846447
43. Ludwig J, Liebherz S, Dreier M, Härter M, von dem Knesebeck O. Die Stigma of Suicide Scale: psychometrische Überprüfung der deutschen Kurzversion (SOSS-SF-D). Psychiatr Prax. 2020;47(08):433–9.
44. Bandura A. Guide for constructing self-efficacy scales. In: Pajares F, Urdan T, editors. Self-efficacy beliefs of adolescents. Greenwich: Information Age Publishing; 2006. p. 307–37.
45. Mehnert A, Müller D, Lehmann C, Koch U. Die deutsche Version des NCCN Distress-Thermometers: Empirische Prüfung eines Screening-Instruments zur Erfassung psychosozialer Belastung bei Krebspatienten. Zeitschrift für Psychiatrie, Psychologie und Psychotherapie. 2006;54(3):213–23.
46. Helfferich C. Die Qualität qualitativer Daten. Wiesbaden: VS Verlag für Sozialwissenschaften; 2011.
47. Kuckartz U. Qualitative Inhaltsanalyse: Methoden, Praxis, Computerunterstützung. 2018.
48. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. International Journal for Quality in Health Care. 2007;19(6):349–57. pmid:17872937
49. Guest G, Bunce A, Johnson L. How Many Interviews Are Enough? Field Methods. 2016;18(1):59–82.
50. Suri H. Purposeful Sampling in Qualitative Research Synthesis. Qualitative Research Journal. 2011;11(2):63–75.
51. Scocco P, Toffol E, Preti A, Team TSP. Psychological Distress Increases Perceived Stigma Toward Attempted Suicide Among Those With a History of Past Attempted Suicide. The Journal of Nervous and Mental Disease. 2016;204(3):194–202. pmid:26751731
52. Cohen J. Statistical Power Analysis for the Behavioral Sciences. Routledge; 1988.
53. Potter J, Wetherell M. Discourse and social psychology: beyond attitudes and behaviour. London: Sage; 1987.
54. Creswell JW. A concise introduction to mixed methods research. SAGE Publications; 2015.
55. O’Cathain A, Murphy E, Nicholl J. Three techniques for integrating data in mixed methods studies. BMJ. 2010;341:c4587. pmid:20851841
56. Evans-Lacko S, Hahn JS, Peter L-J, Schomerus G. The impact of digital interventions on help-seeking behaviour for mental health problems: a systematic literature review. Current Opinion in Psychiatry. 2022;35(3):207–18. pmid:35579875
57. Tucker RP, Haydel R, Zielinski M, Niederkrotenthaler T. Storytelling of suicide attempt recovery and its relationship with mental health treatment-seeking attitudes and behaviors: An experimental study. Journal of American College Health. 2022;70(3):801–9. pmid:32529929
58. Sheehan L, Oexle N, Bushman M, Glover L, Lewy S, Armas SA, et al. To share or not to share? Evaluation of a strategic disclosure program for suicide attempt survivors. Death Studies. 2022:1–8.
59. McGill K, Hackney S, Skehan J. Information needs of people after a suicide attempt: A thematic analysis. Patient Education and Counseling. 2019;102(6):1119–24. pmid:30679002
60. Torous J, Lipschitz J, Ng M, Firth J. Dropout rates in clinical trials of smartphone apps for depressive symptoms: A systematic review and meta-analysis. Journal of Affective Disorders. 2020;263:413–9. pmid:31969272
61. Bottomley JS, Abrutyn S, Smigelsky MA, Neimeyer RA. Mental Health Symptomatology and Exposure to Non-Fatal Suicidal Behavior: Factors That Predict Vulnerability and Resilience Among College Students. Archives of Suicide Research. 2018;22(4):596–614.
62. Corrigan PW, Watson AC. Understanding the impact of stigma on people with mental illness. World Psychiatry. 2002;1(1):16–20. pmid:16946807
63. Rüsch N, Corrigan PW, Wassel A, Michaels P, Olschewski M, Wilkniss S, et al. A stress-coping model of mental illness stigma: I. Predictors of cognitive stress appraisal. Schizophrenia Research. 2009;110(1–3):59–64. pmid:19269140
64. Corrigan PW, Sheehan L, Al-Khouja MA, Stigma of Suicide Research Team. Making Sense of the Public Stigma of Suicide. Crisis. 2017;38(5):351–9. pmid:28337924
65. Scocco P, Castriotta C, Toffol E, Preti A. Stigma of Suicide Attempt (STOSA) scale and Stigma of Suicide and Suicide Survivor (STOSASS) scale: two new assessment tools. Psychiatry Res. 2012;200(2–3):872–8. pmid:22819276
66. Corrigan PW, Sheehan L, Al-Khouja MA, Lewy S, Major DR, Mead J, et al. Insight into the Stigma of Suicide Loss Survivors: Factor Analyses of Family Stereotypes, Prejudices, and Discriminations. Arch Suicide Res. 2016:1–10.
67. Thornicroft G, Sunkel C, Alikhon Aliev A, Baker S, Brohan E, el Chammay R, et al. The Lancet Commission on ending stigma and discrimination in mental health. The Lancet. 2022. pmid:36223799