
Perceptions of drinking water: Understanding the role of individualized water quality data in Detroit, Michigan

  • Alyssa Schubert,

    Roles Conceptualization, Data curation, Formal analysis, Investigation, Methodology, Project administration, Writing – original draft

    Affiliation Civil and Environmental Engineering, University of Michigan, Ann Arbor, Michigan, United States of America

  • Jacob Harrison,

    Roles Investigation, Project administration

    Affiliation Civil and Environmental Engineering, University of Michigan, Ann Arbor, Michigan, United States of America

  • Linda Kent-Buchanan,

    Roles Investigation, Project administration

    Affiliation Detroit Community Researchers, Detroit, Michigan, United States of America

  • Victor Bonds,

    Roles Investigation, Project administration

    Affiliation Detroit Community Researchers, Detroit, Michigan, United States of America

  • Sara Hughes,

    Roles Conceptualization, Methodology, Supervision, Writing – review & editing

    Affiliation School for Environment and Sustainability, University of Michigan, Ann Arbor, Michigan, United States of America

  • Shawn P. McElmurry,

    Roles Conceptualization, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Writing – review & editing

    Affiliation Civil and Environmental Engineering, Wayne State University, Detroit, Michigan, United States of America

  • Matthew Seeger,

    Roles Methodology, Writing – review & editing

    Affiliation Department of Communication, Wayne State University, Detroit, Michigan, United States of America

  • Nancy G. Love

    Roles Conceptualization, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Writing – review & editing

    nglove@umich.edu

    Affiliation Civil and Environmental Engineering, University of Michigan, Ann Arbor, Michigan, United States of America

Abstract

Understanding water users’ perceptions of drinking water quality and of the water service provider is important for effective communication with users. Traditionally, the primary means through which water users receive information about drinking water is the annual Consumer Confidence Report, which summarizes water quality information at the water system scale rather than at the point of use. In this study, we recruited 24 water users from different homes in Detroit, Michigan to assess the effect of access to individualized data on perceptions related to their drinking water quality and service provider. Each participant had a water quality sensor node, which measured five different water quality parameters, temporarily installed in their home for four weeks. Entry interviews were completed at the time of sensor node installation. After four weeks, water quality reports summarizing the individual water quality data collected by the sensor nodes were prepared and shared with participants, after which the exit interviews were completed. We found that access to individualized water quality data positively affected participants’ perceptions of drinking water quality and safety; for example, 92% of participants rated the safety of water at the faucet as at least ‘Somewhat Safe’ in the exit interview compared to 46% in the entry interview. However, participants’ perceptions of the water service provider did not change significantly in response to this information (p > 0.05). Half of the study participants expressed interest in more frequent monitoring and communication, including actionable data that would allow them to make more informed decisions about how to better manage their water quality at home. We saw evidence of long-term changes in response to access to individualized information, with 50% of participants reporting changes in behavior related to drinking water use. We conclude that access to localized water quality data provides actionable information that Detroit, Michigan water users value.

Introduction

Community water systems in the United States are required by federal regulation to release an annual Consumer Confidence Report (CCR). The CCR is typically distributed annually with a water bill, either physically or electronically, and includes system-wide information about the source water, finished drinking water quality, and any risks or exposures associated with regulated drinking water contaminants. As one of the few mandated pieces of risk communication for water service providers, the CCR, in theory, provides the opportunity for meaningful communication between providers and water users. However, studies of the CCR find that, despite being the primary method through which water users receive information about their drinking water, it is not generally effective [14]. The lack of information specific to the receiving household or neighborhood does not allow water users to determine specific risks to themselves and their families. Additionally, while important and required by regulations, CCR documents are often written in an inaccessible manner with confusing and highly technical language [1]. Because CCRs contain summarized information, they adopt and perpetuate the assumption that water quality is homogeneous across the entire water system. CCRs do not clarify that the information provided, at least for some contaminants, does not necessarily reflect the conditions of drinking water at all individual taps. For instance, although a water system may be in regulatory compliance with the Lead and Copper Rule, and report this in the CCR, the actual levels of lead in drinking water at individual taps could be quite different.

Given the limited reach and content of CCRs, more individualized and readily accessible drinking water information may improve water user decision-making and self-efficacy around drinking water risks [3,5]. Individual self-efficacy is “the belief in one’s ability to accomplish a task successfully,” or one’s belief in oneself [6]. One tool for bolstering self-efficacy is data transparency. Frameworks for data transparency, which is characterized by accessibility of the data, have been explored in healthcare [7–11] and risk communication [12,13]. These studies highlight the important role of data transparency in building public trust, communicating risk effectively, and improving decision-making. While data transparency principles, such as information substantiality and relevancy, organizational accountability, and public participation, are needed for effective science communication [8,14], they are in tension with the need to protect privacy, secure sensitive information, and manage limited resources. With appropriate practices in place, data transparency and accessibility can be effective in supporting public decision-making and self-efficacy. One tool for data transparency is the dashboard, which combines written and visualized data, is publicly available, and often offers frequent updates at multiple scales (e.g., city or neighborhood). A recent study examining the effects of data transparency through COVID-19 online dashboards, for example, found that the public made decisions based on the information available on the dashboard and that public trust increased [15].

The effects of individualized (household-scale) drinking water information on individuals’ perceptions of risk and self-efficacy have not yet been examined. Understanding these effects can help quantify the cost/benefit tradeoffs related to data transparency. For example, households already have access to responsive technologies such as the smart thermostat. Comparable access to drinking water information may assist households in making more informed decisions to optimize water quality, such as electing to use water filters or flushing water after long periods of stagnation. At the same time, there are challenges associated with expanding access to individualized water quality information, including sensor development, building the trust and technical understanding required to deploy sensor technology in water users’ homes, and the resources required to distribute the sensors widely and equitably.

To advance new approaches for drinking water quality data access, and to examine the effect of access to individualized water quality data on perceptions of drinking water, this paper aims to address the following questions using a case study in Detroit, Michigan. First, how do water users in Detroit perceive drinking water and their water service provider? What explains variation in these perceptions? Second, does access to individualized water quality data shift perceptions of drinking water and the water service provider? Third, does access to individualized water quality data shift behaviors around drinking water? Fourth, how do drinking water perceptions in Detroit, Michigan compare to national perceptions? To answer these questions, we partnered with a neighborhood-scale community organization in Detroit to install water quality sensor nodes in residents’ homes and gathered information before and after the installation about their drinking water risk perceptions and behaviors.

Case study: Detroit, Michigan

The city of Detroit holds important characteristics related to drinking water that may influence residents’ perceptions of drinking water quality and their drinking water provider. First, Detroit has received national attention due to the city’s financial hardships and subsequent use of water shutoffs as a means to encourage residents to pay water and sewer bills [16,17]. Once an epicenter of the automotive manufacturing industry, the city has experienced population decline since the 1950s, resulting in an oversized, aging water system for which the cost of maintenance is now distributed across fewer residents [17]. Oversized and aging water systems suffer from water quality deterioration due to a higher prevalence of pipe breaks and corrosion as well as increased water age, which is associated with changes in water quality parameters such as disinfectant decay and microbiological growth [18,19]. Following the city’s 2013 declaration of bankruptcy, water costs increased by 119%; as of 2021, nearly 50% of the city’s residents pay more than 3% of their household income for drinking water [17,20,21]. For context, the U.S. EPA uses 2% and 4.5% of median household income (MHI) as benchmarks of affordability for drinking water and combined drinking water and sewer bills, respectively [22]. Although MHI has been criticized as an incomplete or inadequate measurement of household affordability [22,23], we include it here because it is currently more universally applied than alternative measures. Second, in Detroit, nearly 10% of the city’s population experiences plumbing poverty through a combination of above-average rates of poverty, high housing costs, and a lack of complete plumbing [20]. Plumbing poverty is a concept introduced by Deitz and Meehan to describe the intersectional relationship between infrastructure, space, and social inequality [24,25]. A lack of complete plumbing can manifest in several forms, including lack of hot or cold running water, a faucet in a sink, or a bathtub or shower. Third, because of the community’s experience with water-related hardships, Detroit has a highly engaged community of water activists and non-profits (e.g., People’s Water Board) [26]. Consequently, persistent water hardships not only erode public trust, but also keep water issues highly visible. These characteristics provide useful context for understanding perceptions of drinking water in an urban legacy city.

Materials and methods

A community-partnered approach

The study site is a community in Northwest Detroit that spans about one square mile and was selected for two reasons. First, based on water quality models, the distribution system in this area is subject to variable water age with pockets of both low and high water age. Water age is the length of time water remains in pipes prior to reaching the tap, and high water age can adversely affect water quality [27]. Second, the governance model in our case study neighborhood is unique. The community has an organization dedicated to improving the quality of life in the neighborhood; the organization includes a Board of Trustees composed of community residents and stakeholders. As part of a larger project called Water and Health Infrastructure, Resilience, and Learning (WHIRL), we approached the Board with a proposal to conduct research. Upon reviewing our draft interview questions and Human Subjects protocol, the Board approved our entry into the community in May 2021. From June to October 2021, we attended the neighborhood’s farmers market and hosted an ‘Ask a Water Scientist’ booth to interact with the community and introduce ourselves and our study. We also attended three monthly neighborhood virtual meetings in which we presented our proposed study. Finally, after spending six months in the community, we hired two local community members who worked as community researchers in December 2021 [28–30]. The community members were trained in research methods and ethics, integrated into the research team, and are co-authors of all outcome reports, including this paper. The community researchers served as study participants’ first point of contact and proved critical to building a trusting relationship between the participants and the research team. They helped to recruit participants, facilitated an environment to discuss drinking water quality, and participated in data collection and analysis.

Participant recruitment and data collection

Twenty-four individuals (one per selected household) in the neighborhood participated in the study between March and November 2022. Participants were recruited at the farmers market, on neighborhood calls, or via the community liaisons during community events (e.g., church, other neighborhood meetings). Participants were required to have a space in their home where a water quality sensor node could be installed near an electrical outlet. Each participant was provided three $40 gift cards for their participation (one at the time of sensor node installation, one at the halfway point of around two weeks, and one at the time of sensor node removal). Water quality sensor nodes were used to collect the water quality data, which were conveyed to an online server through Wi-Fi or cellular data [31]. One sensor node per home was installed in a location with access to a sink or drain (e.g., under a sink or near a laundry machine) to monitor water quality in the premise plumbing. The sensor nodes measured temperature, pH, pressure, electrical conductivity, and oxidation-reduction potential (ORP) every five minutes. None of these water quality parameters are regulated under the National Primary Drinking Water Regulations (NPDWR) by the EPA, although pH is considered a secondary contaminant with an aesthetic impact. However, all of the measured parameters have implications for water quality, and pH, electrical conductivity, and temperature are commonly reported in CCRs. Water temperature can serve as an indicator of microbiological activity and can also influence a water user’s willingness to drink the water, and pH can be used to understand the likelihood of pipe corrosion and indicate the presence and effectiveness of a disinfectant [27]. Water users may be familiar with drinking water pH through bottled water marketing (e.g., alkaline water). Water pressure can indicate the likelihood of pipe leaks or bursts, which can occur if the pressure is too low or too high. Electrical conductivity is an indicator of the total dissolved solids (TDS) and salinity of drinking water and can affect drinking water aesthetics. High TDS can affect the taste, sight, and smell of drinking water, and is a secondary contaminant regulated by the EPA [32]. Lastly, ORP is a measure of the aquatic environment’s oxidizing and reducing properties and is often used as an indicator of the presence of disinfectant [33]. The full water quality dataset is available in the Figshare Repository [34]. Sensor node information was supplemented with periodic grab-sample measurements of the same parameters as well as free chlorine residual. A YSI Professional Plus Multiparameter Instrument (Yellow Springs, Ohio, USA) was used to validate and supplement the online sensor measurements of pH, electrical conductivity, ORP, and temperature. To ensure accurate measurements, the instrument was calibrated each morning before use. A Hach DR900 Multiparameter Colorimeter (Loveland, Colorado, USA) was used with DPD (N,N-diethyl-p-phenylenediamine) powder pillows for the free chlorine grab samples. The Hach meter was calibrated with blanks before each use.
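To make the validation step concrete, the sketch below pairs each grab sample with the nearest-in-time sensor reading from the same home and computes the difference. It is a minimal illustration in base R, assuming hypothetical files (sensor_readings.csv, grab_samples.csv) and columns (home_id, timestamp, parameter, value); it is not the study’s actual data pipeline.

```r
# Minimal sketch (base R) of how grab-sample validation might be performed.
# File names and column names are hypothetical, not the study's actual format.
sensor <- read.csv("sensor_readings.csv")   # 5-minute sensor node readings
grabs  <- read.csv("grab_samples.csv")      # periodic YSI grab samples

sensor$timestamp <- as.POSIXct(sensor$timestamp)
grabs$timestamp  <- as.POSIXct(grabs$timestamp)

# For each grab sample, find the sensor reading from the same home and
# parameter that is closest in time, then compute the difference.
validate_one <- function(g) {
  s <- sensor[sensor$home_id == g$home_id & sensor$parameter == g$parameter, ]
  if (nrow(s) == 0) return(NA_real_)
  nearest <- s[which.min(abs(difftime(s$timestamp, g$timestamp, units = "mins"))), ]
  nearest$value - g$value
}

grabs$diff <- sapply(seq_len(nrow(grabs)), function(i) validate_one(grabs[i, ]))

# Summarize agreement per parameter (mean difference and its spread).
aggregate(diff ~ parameter, data = grabs, FUN = function(x) c(mean = mean(x), sd = sd(x)))
```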

The entry interview was facilitated by the community researcher while the sensor node was installed by an academic researcher. The sensor node remained in each home for four weeks. During this time, the community researchers were in regular contact with the participants, which included calling each participant at least once a week to ask about their water usage patterns. The research team visited each participant during the second week to collect additional water quality samples to validate the sensor node performance and measure free chlorine residual, which was not directly measured by the sensor node. The node was removed at the end of the fourth week. A water quality report was created and brought to the removal appointment, at which time the research team reviewed the report with the participant. While the academic researcher removed the sensor node, the community researcher completed an exit interview with the participant. We followed up with each participant six to twelve months after the sensor node was removed with a three-question interview to learn about any new or remaining questions participants had about their drinking water quality as well as changes in their drinking water habits.

Out of the 24 participants, 87% were over 45 years old, 71% were female, and 79% were Black or African American. Eight percent of respondents reported an annual income greater than $50,000; however, 63% preferred not to answer. Eight percent of respondents reported children living in the study home. Most respondents (83%) owned their study home. Approximately 83% of homes were built before 1986, the year the Safe Drinking Water Act disallowed the new installation of pipes and pipe fittings containing more than 8% lead and of solder and flux containing more than 0.2% lead [35]. All homes had complete plumbing. Data were not directly collected about the age of the pipes or whether participants’ pipes had recently been replaced; however, some participants volunteered this information in the open-ended response portion of the interview. Detailed descriptive demographics are available in S1 Table and a comparison to Detroit city-wide and national demographics is available in S2 Table [36].

Interview tool and water quality report design

The first draft of the interview tool comprised both structured (Likert scale) and open-ended questions drawn from Day et al. [37], the 2020 AWWA Public Perceptions of Tap Water Poll [38], and risk perception items from the Risk Information Seeking and Processing Model [39]. After the initial interview tool design, the study team met twice with the community researchers to review each question. The community researchers gave feedback on question clarity, accessibility, and phrasing. After these two iterations, the community researchers met with a member of the community Board to facilitate a ‘mock interview,’ both to practice administration and to receive further feedback on questions. Once the feedback was incorporated, both the entry and exit interview tools were finalized and underwent Institutional Review Board (IRB) review at the University of Michigan (HUM 00199905) and Wayne State University (IRB 21-07-3786). The final entry interview tool consisted of 28 questions designed to gauge perceptions of drinking water and the water service provider, drinking water use, and the sources from which participants received information about their drinking water, plus eight demographic questions. The questions about perceptions of drinking water addressed the quality and safety of drinking water. The terms ‘quality’ and ‘safe’ were not explicitly defined for the study participants; instead, participants responded to these questions based on their personal definitions of the terms. The questions about the water service provider were designed to gauge perceptions of the provider’s expertise and credibility. The final exit interview tool consisted of the same 28 questions plus six additional questions assessing the study’s impact on participant perceptions of their drinking water and water service provider. A complete list of questions is available in S3 Table. Most Likert scale questions were paired with an open-response question in which the participant could provide the justification or reasoning for selecting a particular Likert scale rating. The Likert scale ranged from 1 to 5, with 5 being the ‘highest’ possible response. For example, ‘Excellent’ or ‘Strongly Agree’ would be a 5, ‘Don’t Know’ or ‘No Opinion’ would be a 3, and ‘Poor’ or ‘Very Low’ would be a 1. For ten of the questions, Likert scale responses spanned the entire scale in both the entry and exit interviews. ‘Don’t Know/No Opinion’ responses were included in the response analysis because they can reveal important information about shifts in respondents’ awareness or knowledge about drinking water between the entry and exit interviews.
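As a concrete illustration of this 1-to-5 coding, the sketch below recodes text responses into numeric scores, with ‘Don’t Know/No Opinion’ mapped to the midpoint of 3 as described above. The label set and example responses are illustrative only (response options varied by question), and the code is a minimal base R sketch rather than the study’s actual coding script.

```r
# Minimal sketch of recoding Likert responses to the 1-5 scale described above.
# The label set is illustrative (response options varied by question);
# 'Don't Know/No Opinion' maps to the scale midpoint of 3.
likert_map <- c("Strongly Disagree"     = 1,
                "Somewhat Disagree"     = 2,
                "Don't Know/No Opinion" = 3,
                "Somewhat Agree"        = 4,
                "Strongly Agree"        = 5)

# Hypothetical responses for one question (not actual study data).
entry_responses <- c("Somewhat Agree", "Don't Know/No Opinion",
                     "Strongly Agree", "Strongly Disagree")
entry_scores <- unname(likert_map[entry_responses])
entry_scores  # 4 3 5 1
```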

The water quality report was written at an eighth-grade reading level and consisted of a cover sheet outlining the different types of data collected (e.g., continuous sensor node data, grab samples), a glossary of terms used in the report, and a description of each measured water quality parameter, each approximately one page long. Each parameter description included a definition of the parameter, the units in which it was measured, and a graph showing the daily average of the parameter over the time the sensor node was installed. Each parameter page also reported the average measured value over the installation period, placed in the context of the range of values observed in (1) typical drinking water [40] and (2) the 2021 Detroit Water and Sewerage Department Water Quality Report [41]. The report also provided context for how these parameters relate to and indicate drinking water quality. Lastly, the report contained a table summarizing the water quality information from the grab samples, which included free chlorine residual. The research team reviewed and discussed the water quality report with each participant in person. An example report is available in the Supporting Information under S2 Text with S4 through S8 Figs and S6 Table.
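For readers interested in how the per-parameter graphs and averages could be produced from the 5-minute sensor stream, the sketch below computes daily and overall averages and plots one parameter. It is a minimal base R illustration assuming the same hypothetical file and column names as the validation sketch above (sensor_readings.csv with timestamp, parameter, value); it is not the code used to generate the actual reports.

```r
# Minimal sketch of the daily averaging behind each parameter graph in the report.
# File and column names are hypothetical.
sensor <- read.csv("sensor_readings.csv")
sensor$date <- as.Date(sensor$timestamp)

# One row per day and parameter: the daily mean of the 5-minute readings.
daily <- aggregate(value ~ date + parameter, data = sensor, FUN = mean)

# Overall averages reported on each parameter page, for context against
# typical drinking water ranges and the 2021 DWSD report values.
overall <- aggregate(value ~ parameter, data = sensor,
                     FUN = function(x) round(mean(x), 2))

# Example plot for one parameter (e.g., pH) over the four-week installation.
ph <- subset(daily, parameter == "pH")
plot(ph$date, ph$value, type = "b", xlab = "Date", ylab = "pH",
     main = "Daily average pH during sensor node installation")
```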

Five of the questions included in the interviews were based on questions in the June 2020 Public Perceptions of Tap Water poll conducted by AWWA (Table 1). These questions were selected to investigate the similarities and differences between the opinions of a nationally representative poll sample and those of the Detroit community. We changed the phrasing of the questions and multiple-choice answers slightly for clarity or continuity within the interviews. The AWWA poll was conducted among a national sample of 2,200 adults and data were weighted to approximate a target sample of U.S. adults based on demographic information [38].

Table 1. Questions shared between the AWWA poll and WHIRL entry interview.

Some questions were worded differently in the WHIRL entry interview for clarity.

https://doi.org/10.1371/journal.pwat.0000188.t001

The follow-up survey consisted of three questions, all open-ended. Participants received a $25 gift card for their participation, in addition to the three $40 gift cards previously given. Twenty-two of the 24 original participants were available to complete the follow-up survey.

Data analysis

Each Likert scale question was paired with an open-ended question to allow participants to explain their rating. The responses to open-ended questions were coded according to recurring themes. Two members of the study team independently identified themes for each response, then reviewed them together, created a codebook, and re-coded all responses based on the codebook. The themes and their definitions are available in S4 Table. Paired Wilcoxon signed rank tests [42] were conducted in R to assess the statistical significance of differences between entry and exit interview Likert scale responses (α = 0.05, Benjamini-Hochberg p-value adjustment method); results are available in S5 Table. Spearman rank correlation coefficients were used to assess the relationship between responses to Likert scale questions.
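The sketch below illustrates these tests in base R. File names, data frames, and column names (entry_likert.csv, exit_likert.csv, q1, q15, and so on) are hypothetical stand-ins for the interview data, so this is an outline of the analysis described above rather than the study’s actual script.

```r
# Minimal sketch of the statistical comparisons described above, using base R.
# 'entry' and 'exit' are hypothetical data frames with one row per participant
# and one numeric column per Likert question (q1, q3, ...).
entry <- read.csv("entry_likert.csv")
exit  <- read.csv("exit_likert.csv")
questions <- intersect(names(entry), names(exit))

# Paired Wilcoxon signed rank test for each question (entry vs. exit),
# followed by Benjamini-Hochberg adjustment of the p-values (alpha = 0.05).
p_raw <- sapply(questions, function(q) {
  wilcox.test(entry[[q]], exit[[q]], paired = TRUE, exact = FALSE)$p.value
})
p_adj <- p.adjust(p_raw, method = "BH")
data.frame(question = questions, p_raw = p_raw, p_adjusted = p_adj)

# Spearman rank correlation between two Likert questions, e.g., perceived
# quality at the faucet (q1) and trust in the provider (q15).
cor.test(entry$q1, entry$q15, method = "spearman", exact = FALSE)
```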

Ethics

All work protocols were laid out in the IRB applications described above, and the protocols were declared “exempt” by the University of Michigan and Wayne State University. For all participants, we provided an informed consent form during the initial visit, reviewed it with them prior to beginning the study, and obtained verbal consent to continue. The informed consent form: (i) laid out the study objectives, risks, and benefits; (ii) provided the study team contact information; and (iii) emphasized that participants could leave the study at any time. Signatures were not required because the study was exempt.

Results and discussion

The water quality parameters measured during the study period were relatively consistent across all homes [42] and most were in the range reported in the Detroit Water and Sewerage Department’s 2021 Consumer Confidence Report [41]. In contrast, we found that chlorine residual levels were consistently low in the study area and we discussed with participants the importance of having sufficient chlorine to prevent microbial growth.

Perceptions related to drinking water and the water service provider in Detroit, MI in the entry interview

Perceptions of water quality.

Questions 1 through 12 were designed to gauge perceptions of risk related to drinking water. Mean scores on the Likert scale are in Table 2 under “Risk perceptions.” Questions 1 through 6 related specifically to the quality and safety of drinking water. Half of the respondents rated the quality of their drinking water at the faucet as ‘Good’ or ‘Excellent’ (Q1). 46% and 54% rated the safety of their drinking water at the faucet and coming from their pipes (Q3 and Q5, respectively) as at least ‘Somewhat Safe.’ There were positive relationships between responses to questions about the quality and safety of drinking water (Spearman rank coefficients range from 0.55 to 0.76, p < 0.001). However, nearly a third of participants were uncertain about water safety at either location, with 29% and 33% selecting ‘Don’t Know/No Opinion’ for the questions related to drinking water safety (Q3 and Q5, respectively). Questions 7 through 12 asked participants about the substances in their drinking water and the likelihood of getting sick from them. The most common response to these questions was ‘Don’t Know/No Opinion,’ with the exception of “The likelihood of people in my household getting sick from drinking water is:…,” for which the most frequent response was ‘Extremely Low’ (46%). Participants were particularly uncertain about whether there were substances in their drinking water that are serious and could cause harm: 79% chose ‘Don’t Know/No Opinion.’

Table 2. Descriptive statistics (mean and standard deviation) for entry and exit interview Likert scale questions.

https://doi.org/10.1371/journal.pwat.0000188.t002

Participants who rated the safety or quality of their drinking water with at least ‘Somewhat Safe’ or ‘Good’ most frequently referenced self-determination or water aesthetics in their explanatory responses. Self-determination was coded for responses such as “I assume the water is of good quality. I do not use the faucet water for drinking water” and “Because I use it.” In other words, self-determination was coded if another theme was not explicitly referenced and the response was centered around the participant’s actions (e.g., how they choose to use, or not use, their water). Water aesthetics was coded when responses referred to the taste, smell, or sight of drinking water. Two-thirds (67%) of participants referenced water aesthetics when asked to describe why they would rate the quality of water at their faucet a certain way (Q2). For example, participants responded with statements such as: “I think it’s safe because there is no smell, the water is clear,” “Water is cloudy, [it] has to run before it clears,” “Sometimes the water comes out a rusty color,” and “Sometimes it smells like bleach, and that makes me uncomfortable.” It is possible that the self-determination theme is underpinned by themes like water aesthetics; for example, a participant has decided to use the water based on the lack of concern about smell or taste, but this analysis does not further explore the undercurrents of such a response. Just under a third (29%) of respondents used self-determination as justification for their entry interview responses to questions asking about the quality and safety of their drinking water; for example, “I don’t have any problems with the water,” and “I have not been affected by the drinking water.”

While infrastructure and lead were not referenced as frequently as aesthetics and self-determination in explanations of drinking water quality perceptions, it is evident that the Detroit residents interviewed were thoughtful and concerned about the status of drinking water infrastructure and the potential presence of lead. Based on participant references to Flint, Michigan, this is likely due to the exemplification of the widely reported Flint Water Crisis, during which dangerously high levels of lead were detected in numerous residents’ drinking water, causing negative health impacts and a loss of community trust. Exemplification theory describes how exemplars, which are memorable and typically vivid words, sounds, or images, can be used to create meaning about complex situations or concepts [43]. Further, while trust is typically built slowly over time, it can be disrupted by a single, memorable event, such as the Flint Water Crisis [44]. We see references to infrastructure, lead, Flint, and other communities with water-related issues, such as Highland Park, Michigan, an enclave of Detroit, in the entry interview responses to questions about drinking water safety. The majority of infrastructure references were negative; for example, “Because the pipes in the city are old and may contain lead.” Two responses referenced pipe upgrades, such as “Pipes have been converted to copper.” The reference to the specific pipe material, not only the recent pipe replacement, suggests that these participants have some knowledge about the role that pipe material may play in maintaining drinking water quality. When asked to provide key examples of other events participants associated with drinking water, responses were almost binary: they either referenced a household event, such as watering plants or washing the car, or lead in drinking water. One participant wrote, “In the back of my mind I keep thinking about the Flint Water Crisis.”

Lack of information or uncertainty was a common theme used to justify responses of ‘Don’t Know/No Opinion’ when participants were asked about drinking water quality, specifically whether there are substances in their drinking water that can cause harm. Some 46% of respondents explained their Likert scale rating for this question by referencing a lack of information; for example, “I don’t know what’s in the water,” or “Lack of knowledge about what’s safe and unsafe.”

Perceptions of water provider.

Questions 13 through 18 and Question 28 were designed to capture participants’ perceptions of their water service provider’s trustworthiness and expertise. Mean scores on the Likert scale are presented in Table 2 under “Credibility and Expertise.” Likert scale responses were distributed mainly among three categories. In general, participants expressed either positive views or uncertainty about the water service provider. Participants selected ‘Somewhat Agree’ (32%), ‘Strongly Agree’ (23%), and ‘Don’t Know/No Opinion’ (36%) when asked if the information they received from the provider is accurate (Q14), with 9% selecting ‘Strongly Disagree.’ When asked if they trust the information they receive (Q15), 36% selected ‘Somewhat Agree,’ followed by ‘Strongly Agree’ (32%), ‘Don’t Know/No Opinion’ (23%), and ‘Strongly Disagree’ (9%). Most (61%) somewhat or strongly agreed that the service provider was knowledgeable (Q16). There were strong positive relationships between responses to questions about the water service provider’s trustworthiness and credibility (Spearman rank correlation coefficients range from 0.74 to 0.94, p < 0.001). There was one weak, statistically significant relationship between participants’ rating of drinking water quality at the faucet (Q1) and trust in the water service provider (Q15) (Spearman rank correlation coefficient = 0.44, p < 0.05). All other relationships between the water quality questions and the water service provider questions were not statistically significant. When asked if their water service provider cared about their health and that of their family (Q17), responses were clustered primarily around ‘Strongly Agree’ (33%) and ‘Don’t Know/No Opinion’ (38%).

Only one open-ended question (Q18) was included for the trustworthiness Likert scale questions. Question 18 asked, “In a sentence or two, please describe why you have rated the previous question [my water service provider cares about the health of me and people in my household] as…”. The most frequently referenced themes were positively referenced trust (21%), negatively referenced trust (13%), and lack of information (13%). Open-ended responses ranged broadly, from uncertainty and a lack of communication (“Because I don’t know what’s in the water,” “Never had the water tested,” “I don’t talk to the water company”) to positive interactions (“They will send someone out if I call with problems regarding my water”) and pride in the city (“It’s the City of Detroit, which cares about residents”). When asked if there was anything they would like to change about how they receive information about their water (Q28), ten participants (38%) requested more frequent monitoring or communication.

Effect of individualized water quality data on perceptions

Perceptions of water quality.

Overall, mean scores for perceptions of drinking water quality were greater in the exit interview compared to the entry interview (Table 2). S1–S3 Figs visualize the questions with a statistically significant difference between entry and exit interview responses. When asked to rate the quality of their drinking water at the faucet (Q1), more respondents (75%) selected ‘Good’ or ‘Excellent’ compared to the entry interview (51%). The remaining 25% of participants selected ‘Just Fair,’ a decrease from 38% in the entry interview. The difference in responses for Q1 is statistically significant (paired Wilcoxon signed rank test, α = 0.05, p = 0.02, z = -2.24); see S5 Table for all Wilcoxon scores. When asked about the safety of their drinking water, there is a clearer consensus than in the entry interview: 92% and 71% of participants chose at least ‘Somewhat Safe’ for drinking water at the faucet (Q3) and coming from pipes (Q5), increases of 46 and 17 percentage points from the entry interview, respectively. More specifically, 67% of participants selected a higher Likert scale rating in the exit interview than in the entry interview when asked to rate the safety of drinking water at the faucet, and only one participant out of 24 chose a lower rating. The difference in responses for Q3 is statistically significant (paired Wilcoxon signed rank test, α = 0.05, p < 0.001, z = -3.45). Importantly, the number of participants choosing ‘Don’t Know/No Opinion’ decreased across all questions about water safety and quality, suggesting that the individualized water quality information helped resolve some of the uncertainty around these concepts for participants. Participants often referenced an improved understanding of pH and ORP, and how these parameters relate to water quality (e.g., disinfectant levels) and safety (e.g., likelihood of microbiological growth). This trend holds for the questions asking participants whether there are substances in their drinking water and the likelihood of those substances causing harm. While nearly a third of all respondents still selected ‘Don’t Know/No Opinion’ for these questions, more participants chose responses on either side of this option. Positive relationships persist between ratings of drinking water quality and safety (Q1, Q3, and Q5) (Spearman rank correlation coefficients range from 0.54 to 0.62, p < 0.01).

In contrast to the entry interview, fewer participants referenced water aesthetics across all questions related to risk perceptions, except for the question asking them to rate the safety of water in their pipes, for which the frequency remained constant. Instead, collectively, more participants referred to self-determination or the study intervention; for example, “The water quality report justifies this rating.” Water aesthetics is a means for water users to monitor drinking water quality in the absence of more plentiful data. The shift to fewer aesthetics-related responses in the exit interview suggests that participants may de-prioritize aesthetics as a primary tool for determining water quality when more accessible information is available. This could be an important area of future research with a larger participant group across multiple water providers, for which a correlation analysis could be adequately powered. Nonetheless, it is important for water users to consider both aesthetics and other modes of data when making decisions about their drinking water; indeed, water user feedback related to odor, taste, and smell helps utilities address quality issues relating to aesthetics more quickly, such as a strong chlorine or sulfurous odor, water turbidity, or unusual taste [45].

Many responses about the safety of drinking water at the faucet and coming from pipes also referred to health status. For example, participants wrote, “I’m washing dishes with the water, and I haven’t gotten sick,” and “Because of the WHIRL water study report, and I have never gotten sick from it.” One potential reason for the increased number of references to sickness in the exit interview is the discussion the study team had with each participant, while reviewing the water quality report, about the relationship between free chlorine residual, microbial growth, and public health, prompted by the relatively low chlorine residual (average = 0.29 mg/L, standard deviation = 0.17 mg/L) found in participants’ homes. We talked in-depth about the consistently low free chlorine residual found in the area and discussed flushing as a potential measure to bring fresher water into the home after it has been stagnating overnight. Further, there was a significant difference between entry and exit interview Likert scale responses to Q11, “The likelihood of people in my household getting sick from drinking water is:…” (paired Wilcoxon signed rank test, α = 0.05, p = 0.02, z = -2.33), with more participants choosing ‘Low’ or ‘Extremely Low’ in the exit interview (84%) compared to the entry interview (59%). Graphs visualizing the questions with significantly different entry and exit interview responses are available in S1–S3 Figs.

During the exit interviews, fewer participants referenced infrastructure, Flint, or other exemplars relative to the entry interviews. Instead, participants tended to reference the water quality report provided and self-determination; yet references to infrastructure and exemplars did not subside completely, highlighting the power exemplars have in cementing public perception. We found that lead was a critical concern for many of our participants, and it often came up as a topic of conversation during our review of the water quality report. Concern for lead is likely an extension of exemplification from the Flint Water Crisis and demonstrates the participants’ accurate understanding of the health risks involved with lead exposure. Consequently, we offered lead testing to every participant after the study was complete, meaning that the results of the lead sampling did not influence the exit interview but could have influenced the results of the follow-up survey. The lead sampling protocol is available in the Supporting Information under S1 Text. The Lead and Copper Rule has a Maximum Contaminant Level Goal (MCLG) of 0 ppb [46]. Results ranged from non-detect (< 0.5 ppb) to 15 ppb.

Perceptions of water service provider.

Questions about the accuracy and trustworthiness (Q14 and Q15) of the information received from the water service provider continued to be grouped in three main categories, mirroring the entry interview. Roughly a third of respondents each chose ‘Strongly Agree,’ ‘Somewhat Agree,’ and ‘Don’t Know/No Opinion’ for these questions. When asked if the water service provider is very knowledgeable (Q16), there is a clearer consensus than in the entry interview, with 50% of participants choosing ‘Strongly Agree.’ When asked if the service provider cares about them, 29% of participants selected a higher Likert scale rating than in the entry interview, 46% selected the same rating, and 25% selected a lower rating. However, none of the questions about trustworthiness showed a significant change in responses between the entry and exit interviews, as determined by paired Wilcoxon signed rank tests (S5 Table, minimum p-value = 0.07). In contrast to the changes in Likert scale selections for questions around water quality, all of which showed clear trends in the positive direction, the differences between entry and exit interview Likert scale responses around trustworthiness are not as clear. There continue to be positive relationships between questions related to water service provider trustworthiness and credibility (Spearman rank correlation coefficients range from 0.46 to 0.78, p < 0.05), with one exception: there is no longer a statistically significant relationship between responses to “The information I receive from my water service provider is accurate” and “My water service provider cares about the health of me and people in my household.” There continues to be a weak positive relationship between responses to “How would you rate the quality of the water at your faucet?” and “I trust the information I receive from my water service provider” (Spearman rank correlation coefficient = 0.47, p < 0.05).

Two main trends were observed in the open-ended responses about the water service provider. First, more participants requested changes in how they would like to receive information about their tap water in the exit interview than in the entry interview. Half (50%) of respondents requested increased monitoring, greater information access and transparency, or more frequent communication in the exit interview, compared to 38% in the entry interview; for example, “More printed information,” “More up to date on pipe conditions,” “More information about the quality of water coming through pipes,” and “More frequent communications.” This shift suggests that participants valued the information in the water quality report and further recognized the need for improved communication and organizational accountability, a principle of data transparency, in the exit interview. Second, responses to the question, “My trust in my water service provider has not changed since the start of this study,” ranged from skepticism to confidence: “Because [the water service provider] does not have our best interests at heart” to “[I was] confident in them in the beginning and the study confirmed this confidence.” Quotes from the participants who indicated their trust was unchanged highlight the trust built between themselves and the study team, not the provider; for example: “Nothing has changed with my water service provider since this study,” “The water department has not shared any information,” and “This study is not going to change nothing with my water service provider.” Another participant wrote, “Before the study I didn’t trust [the water service provider]–now I have more trust only because I trust the researchers.” These responses suggest that the source of the water quality data (i.e., the water service provider versus a third party) may play a role in its effects.

Effect of individualized water quality data on behavior and drinking water habits

We followed up with each study participant between six and twelve months after the exit interview was completed to learn about potential behavior changes. Twenty-two of the 24 participants agreed to a follow-up interview. Half (50%, n = 11) of respondents reported that their drinking water habits had changed since the study. Out of all 22 respondents, seven (32%) reported flushing their cold water tap for up to 5 minutes prior to drinking from the faucet in the morning, five (23%) reported drinking more tap water, and two (9%) reported drinking less tap water. These two participants cited specific reasons: one was influenced by recent water contamination events in the news, such as the East Palestine, Ohio spill; the other learned that their drinking water had detectable levels of lead because of the free lead analysis we offered after the study was completed in response to many participants’ concerns about lead. Participants also reported a desire to learn more about actions they can take to measure water quality parameters themselves, or other actions related to maintaining drinking water quality. While expressed by only three participants (14%), this type of response is interesting given that no participants expressed this desire in the entry or exit interviews. This new interest in specific actions around managing water quality suggests that the individualized data were effective in bolstering the participants’ self-efficacy and sense of agency.

We also conducted interviews with our two community researchers at the end of the study to investigate what they had learned from being immersed in the research. Their reflections are valuable because they were in close contact with the 24 participants and were uniquely positioned within the study. One key takeaway the researchers expressed was a reduction in the amount of bottled water used for drinking: one researcher, who used to cook only with bottled water and regularly purchased gallons of water per week, reported now drinking from the tap and greatly reducing the amount of bottled water purchased. The second takeaway that the community researchers adopted was the practice of flushing the tap before drinking water in the morning to draw fresher water after it has stagnated overnight.

Comparative analysis to national poll

The comparative analysis between WHIRL respondents and the AWWA national poll highlighted similarities and differences in beliefs. First, AWWA respondents were more likely to rate their drinking water quality highly. The majority of AWWA poll respondents rated the quality of water at their faucet as ‘Excellent’ or ‘Good’ (79%). High-income earners, defined by AWWA as >$100,000 per year, more frequently chose ‘Excellent.’ In contrast, just half of WHIRL respondents rated the same in the entry interview (50%), and of this half, none had a reported income of more than $74,999. Similarly, 81% of AWWA poll respondents rated their water at the faucet as ‘Very Safe’ or ‘Somewhat Safe’ compared to 46% of WHIRL respondents. Population estimates for the United States and Detroit, Michigan report 13.6% and 77.9% Black or African American residents, respectively, and median incomes of $69,021 and $34,762, respectively [37]. Given the evidence suggesting that communities of color and low-income earners are more likely to view bottled water as safer than tap water [47–49] and have lived experiences [50,51] reinforcing distrust in tap water, as well as the recent water-related hardships experienced by Detroiters [16], the difference in quality rating is unsurprising.

Second, despite differences in the safety and quality ratings, AWWA and WHIRL respondents both reported relatively low faucet use. Around one-third of AWWA respondents reported drinking from the faucet several times a day (36%); another 24% reported never using the faucet. Similarly, 17% of WHIRL participants reported drinking unfiltered tap water most of the time and 38% reported doing so none of the time (Q19, entry interview). These results reinforce the notion that the relationship water users have with their drinking water is complex and influenced by a variety of factors, certainly more than just opinions about water quality and safety. Prior studies, including the AWWA poll, have shown that bottled water may be preferred over water from the faucet due to a belief that bottled water is safer, tastes better, or is more convenient [38,51].

Third, when asked about drinking water information sources, the most common source selected by AWWA respondents was the water utility company’s website (63%), followed by local government websites (26%) and internet search engines (26%). In contrast, WHIRL participants most often selected TV (50%), the water service provider (42%), and printed notices (33%). Further, over half of AWWA poll respondents (60%) reported no frequent communication with their utility, with just 4% reporting communication in the last year. A little over a third (38%) of WHIRL respondents reported no or infrequent communication, and 42% reported annual communication from their water service provider. Several WHIRL participants reported monthly communication from their provider in the specific form of a monthly water bill. It is unclear whether the AWWA respondents considered a bill to be communication from their utility worth reporting; only 11% reported communication within the last month.

This comparative analysis shows the similarities in drinking water habits, and the differences in perceptions of risk and communication preferences around drinking water, between a sample proportionally representative of the U.S. population (AWWA) and a majority Black, urban population in a legacy city with an oversized water distribution system. This comparison highlights the utility of knowing the audience and generating community-specific solutions; for example, while the results indicated similar tap water use between the two sample sets, the channels of communication through which they received drinking water information were quite different and could be an influential factor.

Conclusions and implications for practice

Water users are interested in and affected by individualized water quality data.

These results demonstrate initial evidence of effects of individualized, household-scale water quality data on perceptions of drinking water and the water service provider’s trustworthiness and expertise. Participant responses highlight an increase in positive perceptions of drinking water quality and safety as well as an increased interest in more frequent communication from the utility about drinking water quality data after receiving individualized drinking water quality information. This initial evidence points to the need for continued study of the effects of individualized water quality data at a larger scale, because this interest motivates a potential paradigm shift from conventional provider practices. Currently, water users’ primary access to water quality information is through the CCR, which summarizes data on a yearly basis and at the system-level. The outcomes from this study show that water users may benefit from more frequent communication with more localized data.

Individualized water quality data can be actionable.

The results of this study suggest that participants found the individualized water quality information to be actionable, i.e., presented in an accessible manner that allows for more informed decision-making [52]. For example, 100% of participants reported that they learned more about their drinking water quality from this study (Q29). Further, the results of the follow-up survey demonstrate the behavior changes that occurred after receiving the household-scale drinking water information. Of the 11 participants who reported specific behavior changes around drinking water use, 10 cited specific reasons related to the information they received as part of the study.

Because the data shared in this study are individualized, they allow participants to learn about risks specific to their household and therefore to make specific decisions. The highly summarized information in the CCR does not allow such specific decision-making and is thus not as actionable. For example, knowing that 90% of lead samples taken in the water system were below the action level is not actionable to a water user, because it does not provide accurate information about lead at their tap. Further, because the information in the CCR is highly summarized, it assumes that water quality is homogeneous across large water distribution systems. This is important given that the CCR is a primary, and often the only, method for communicating drinking water quality risk information to the public. Risk messages are generally used before a crisis occurs and are designed to assist individuals in understanding how to reduce their current exposure to risks, build mutual trust with the risk communicator, and increase self-efficacy [8,43,53]. While the Environmental Protection Agency has proposed new rules for the CCR, including a twice-yearly release, the information will still be available only at the system level [54]. The results here suggest that water quality data should be more localized, detailed, and contextualized, which increases the information’s relevancy and allows it to be used for more informed decision-making and greater health protection [13].
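To illustrate why a system-wide lead summary is not actionable at the tap, the toy calculation below uses made-up lead values (not study data): the hypothetical system’s 90th percentile sits below the 15 ppb action level even though individual taps exceed it. The numbers are invented for illustration, and R’s default quantile estimator is used here rather than the Lead and Copper Rule’s exact compliance calculation.

```r
# Hypothetical first-draw lead results (ppb) from 15 taps; not study data.
lead_ppb <- c(0.5, 0.5, 1, 1, 2, 2, 3, 4, 5, 6, 8, 10, 12, 16, 40)

p90 <- quantile(lead_ppb, probs = 0.90)  # R's default estimator, for illustration
p90                   # ~14.4 ppb: below the 15 ppb action level system-wide
sum(lead_ppb > 15)    # yet 2 of these 15 hypothetical taps exceed 15 ppb
```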

Public needs and perceptions are community-specific.

Public participation is another principle of data transparency that emphasizes the importance of understanding the audience’s needs and preferences. The results presented here demonstrate the utility of understanding community-specific concerns. This is highlighted in the difference between the risk perceived by respondents to the nationally representative AWWA poll and the participants interviewed in this study. Further, the comparative analysis between the AWWA and WHIRL data demonstrates the effects of income, race, and preferred sources of information on drinking water perceptions, which is supported by previous studies [37,51,55]. We also see exemplars referenced repeatedly as explanations for participant perceptions, including the Flint Water Crisis and other nearby communities experiencing water-related issues, such as Highland Park, MI.

Utilities have traditionally favored a passive, unidirectional approach to risk communication, with information flowing from the utility to users. Even in cases when a more proactive strategy is used, assumptions are often made about what information the water user needs [3,56]. For example, a study examining utility approaches to evaluating the effectiveness of CCRs found that fewer than 2% of utilities specifically asked water users about their concerns to guide the content of the CCR [3]. We found that feedback from the community researchers and other community members was especially valuable, not only for effective science communication but also for determining what interests the participants had around their drinking water. For example, lead was referenced so frequently in the entry interviews that we decided to offer lead sampling. Additionally, leaders at a Michigan drinking water treatment plant found that by listening to community concerns, they were able to develop targeted communication strategies to address water users’ concerns and create a sense of agency [57]. Understanding what information the community seeks offers the opportunity for more effective communication that meets community needs.

Lastly, we acknowledge that the goal of building upon and improving existing tools such as the CCR is usually to increase public trust, with the implicit assumption that the public needs to trust the water service provider more. This assumption is inherently based upon a knowledge-deficit model [58,59] of the public’s understanding of drinking water. As suggested by Wilson et al. [60], distrust in drinking water or the water service provider can be effective as a “barometer for overarching or targeted system failures…” which can “…serve as calls to engage other knowledges…and also for experts to do more…to earn trust rather than working to convince the publics to trust under the existing circumstances.” The work in this paper demonstrates the effect of individualized water quality data on water user perceptions and gives suggestions for improving water user-provider communications. However, it also highlights some of the specific perceptions held by Detroiters due to their lived experiences, including lack of water affordability and myriad water shut-offs, which are indicative of broader systemic issues that should not be treated as a fault of the water user that needs to be corrected. Rather, these systemic issues should be fully considered and addressed by water systems as part of the trust landscape.

Overall, this work underscores the importance of understanding water users’ perceptions, beliefs, and needs, particularly in oversized and aging water systems. Participant responses suggest that water quality data at a more individualized scale are useful and that more communication with the water service provider is desired. Further, participant responses demonstrate the importance of community-specific communication practices; as shown by the comparison with the AWWA poll, a one-size-fits-all communication approach may not fit the needs of a specific community. Thus, there are three broad implications of this work. First, water users are interested in and affected by individualized drinking water quality data. Second, individualized water quality data can be actionable, as demonstrated by the reported changes in water user behavior. Third, public needs and perceptions are community-specific, and communication practices can benefit from local input. In line with Sustainable Development Goal (SDG) 6, Target 6.b, the outcomes of this study support the development of methods to strengthen local participation in drinking water quality management. Furthermore, because access to quality drinking water is critical to healthy lives, this work also supports SDG 3.

Study limitations and considerations

This study has several limitations and considerations for future work. First, the small sample size (n = 24) limits the generalizability and statistical power of the study. While a larger sample size would be ideal, it was not feasible to build and install sensor nodes in hundreds of homes. Second, participant demographics are not representative of the United States as a whole, so the results cannot be applied broadly across all communities. At the same time, the participants in this study belong to a group of water users from legacy cities with aging or oversized infrastructure; this work therefore more fully characterizes the perceptions of risk and trustworthiness, and the communication needs, of similar communities. Third, the participant pool was not randomly selected; participants were recruited through community and community-researcher outreach. The pool is therefore composed of people who may have had preconceived notions about their drinking water quality and who were willing and able to have a sensor node installed in their home. Fourth, none of the measured water quality parameters provided information about contaminants regulated by the National Primary Drinking Water Regulations, including microbiological indicators of water quality. The technology to measure these contaminants, such as lead or total coliforms, in real time is still developing. While the measured parameters were contextualized in the water quality reports, the outcomes of this study could differ if regulated chemical and microbiological water quality parameters were measured. Further, the study team did not explicitly define ‘quality’ and ‘safe’ for the participants. ‘Safe’ can evoke a range of perceptions and definitions based on individuals’ beliefs and experiences, and these may differ from the viewpoint of a water service provider. Future work should consider how these terms are defined in the context of drinking water. Fifth, the format (a written, printed report) and channel (the research team) through which the water quality information was shared with participants may have affected the study outcomes [61,62]. While out of the scope of this study, these effects should be carefully considered and examined in future research. Lastly, there are challenges associated with scaling this work up, such as the time and resources required to build and install sensor nodes across an entire distribution system. These limitations should be considered when designing future studies.
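To make the sample-size limitation concrete, the sketch below runs the kind of paired entry/exit comparison reported in S5 Table on simulated Likert responses; the scores, the random seed, and the size of the simulated shift are assumptions chosen for illustration, and the Wilcoxon signed-rank test is included only as the nonparametric alternative described in [42].

```python
import numpy as np
from scipy import stats

# Simulated paired entry/exit Likert responses (1 = lowest, 5 = highest) for n = 24
# participants; the values below are illustrative and are not the study data.
rng = np.random.default_rng(seed=1)
entry = rng.integers(1, 6, size=24)
exit_ = np.clip(entry + rng.integers(0, 3, size=24), 1, 5)  # modest positive shift

# Paired comparisons commonly applied to pre/post ordinal data:
t_stat, t_p = stats.ttest_rel(exit_, entry)   # paired t-test, as reported in S5 Table
w_stat, w_p = stats.wilcoxon(exit_, entry)    # nonparametric alternative, see [42]

print(f"Paired t-test p = {t_p:.3f}; Wilcoxon signed-rank p = {w_p:.3f}")
# With only 24 pairs, modest shifts in perception may not reach p < 0.05,
# which is the statistical power limitation noted above.
```

Pairing such tests with a prospective power analysis would help future studies choose a sample size large enough to detect smaller changes in perception.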

Given these limitations, future research should consider the development and feasibility of scalable, household-level technology for drinking water quality monitoring, including which parameters should be monitored, at what frequency they should be monitored, and how the data should be transformed and presented in line with the principles of science communication. The sensor nodes deployed in this work were first-generation units, and advancements in real-time sensing technology coupled with artificial intelligence are occurring rapidly. The results from this study suggest that investment in scalable, localized drinking water quality monitoring should continue. Future research should also consider how water service providers can leverage the water quality data they already collect at much higher resolution to provide more actionable information for water users. For example, some water service providers in the U.S. are currently piloting online dashboards that provide updated drinking water quality information on a quarterly basis. These pilots provide an opportunity to learn about water users’ opinions, needs, and perceptions of the dashboards, including the scale of the data and the frequency of updates. Together, these future research directions will further advance new approaches for providing access to drinking water quality data.
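As one hedged sketch of how already-collected, high-resolution data might be transformed for water users, the example below aggregates raw household sensor readings into daily summaries and flags values outside a guideline band; the column names, example values, and use of the EPA secondary (aesthetic) pH guideline of 6.5–8.5 are illustrative assumptions rather than the reporting format used in this study or by any utility dashboard.

```python
import pandas as pd

# Minimal sketch of turning raw household sensor readings into report-ready summaries.
# The timestamps and measured values below are invented for illustration; the study's
# sensor nodes logged pH, ORP, electrical conductivity, pressure, and temperature.
readings = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2022-06-01 00:00", "2022-06-01 08:00", "2022-06-01 16:00",
        "2022-06-02 00:00", "2022-06-02 08:00", "2022-06-02 16:00",
    ]),
    "pH": [7.2, 7.3, 7.1, 6.4, 7.2, 7.3],
    "temperature_C": [18.5, 19.0, 18.8, 20.1, 19.2, 18.9],
})

# Daily summary statistics that a household report or dashboard could display.
daily_summary = readings.set_index("timestamp").resample("D").agg(["median", "min", "max"])

# Flag individual readings outside the EPA secondary (aesthetic) pH guideline of 6.5-8.5.
out_of_band = readings[(readings["pH"] < 6.5) | (readings["pH"] > 8.5)]

print(daily_summary.round(2))
print(f"{len(out_of_band)} reading(s) outside the 6.5-8.5 pH guideline band")
```

Summaries like these, paired with plain-language context, are one way the localized, contextualized reporting discussed above could be automated.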

Supporting information

S1 Fig. Likert scale responses for Question 1.

https://doi.org/10.1371/journal.pwat.0000188.s001

(TIFF)

S2 Fig. Likert scale responses for Question 3.

https://doi.org/10.1371/journal.pwat.0000188.s002

(TIFF)

S3 Fig. Likert scale responses for Question 11.

https://doi.org/10.1371/journal.pwat.0000188.s003

(TIFF)

S4 Fig. Example time series graph for oxidation reduction potential.

https://doi.org/10.1371/journal.pwat.0000188.s004

(TIFF)

S5 Fig. Example time series graph for pH.

https://doi.org/10.1371/journal.pwat.0000188.s005

(TIFF)

S6 Fig. Example time series graph for electrical conductivity.

https://doi.org/10.1371/journal.pwat.0000188.s006

(TIFF)

S7 Fig. Example time series graph for pressure.

https://doi.org/10.1371/journal.pwat.0000188.s007

(TIFF)

S8 Fig. Example time series graph for temperature.

https://doi.org/10.1371/journal.pwat.0000188.s008

(TIFF)

S1 Table. Demographic statistics of study participants.

https://doi.org/10.1371/journal.pwat.0000188.s009

(XLSX)

S2 Table. Comparison of demographics between study participants, the city of Detroit, MI, and the United States, collected from the United States Census Bureau.

https://doi.org/10.1371/journal.pwat.0000188.s010

(XLSX)

S3 Table. Complete list of questions in entry and exit interviews.

https://doi.org/10.1371/journal.pwat.0000188.s011

(XLSX)

S5 Table. T-test scores for Likert scale questions.

https://doi.org/10.1371/journal.pwat.0000188.s013

(XLSX)

S6 Table. Example table comparing participant and DWSD report values.

https://doi.org/10.1371/journal.pwat.0000188.s014

(XLSX)

Acknowledgments

This work was developed as part of the Water and Health Infrastructure Resilience and Learning (WHIRL) project. We are grateful for the time and feedback provided by other WHIRL collaborators, the community partner organization, our study participants, the community researchers, and the community reviewers, all of whom enhanced this work. We thank Dr. Yanna Lambrinidou, Adjunct Assistant Professor, Department of Science, Technology, and Society, Virginia Tech, and cofounder of the Campaign for Lead Free Water, for her insightful comments.

References

1. Nicholas W, Vedachalam S. Poor accessibility of water utilities’ consumer confidence reports. Util Policy. 2021;72: 101272. https://doi.org/10.1016/j.jup.2021.101272.
2. Trax JR, Snyder HD. A three-state pilot project to determine the effectiveness of a consumer confidence report template. Rural Water Research and Education Foundation. 1998, Duncan, Oklahoma.
3. Meyer-Emerick N. Are we answering the right questions? Improving CCR communication. Journal AWWA. 2004;96(8): 104–11.
4. Blette V. Drinking water public right-to-know requirements in the United States. J Water Health. 2008;6(SUPPL. 1): 43–51. pmid:18401128
5. Weisner ML, Root TL, Harris MS, Mitsova D, Liu W. The complexities of trust between urban water utilities and the public. Sustain Water Resour Manag. 2020;6(3): 1–12.
6. Hayden J. Introduction to health behavior theory. Jones & Bartlett Learning; 2022.
7. Anhalt-Depies C, Stenglein JL, Zuckerberg B, Townsend PA, Rissman AR. Tradeoffs and tools for data quality, privacy, transparency, and trust in citizen science. Biol Conserv. 2019 Oct;238: 108195.
8. Lee Y, Li JYQ. The role of communication transparency and organizational trust in publics’ perceptions, attitudes and social distancing behaviour: A case study of the COVID-19 outbreak. Journal of Contingencies and Crisis Management. 2021;29(4): 368–84.
9. Matheus R, Janssen M, Maheshwari D. Data science empowering the public: Data-driven dashboards for transparent and accountable decision-making in smart cities. Gov Inf Q. 2020;37(3): 101284.
10. Cooper M. Sharing Data and Results with Study Participants: Report on a Survey of Cultural Anthropologists. Journal of Empirical Research on Human Research Ethics. 2008;3(4): 19–34. pmid:19385754
11. Wang KH, Marenco L, Madera JE, Aminawung JA, Wang EA, Cheung KH. Using a community-engaged health informatics approach to develop a web analytics research platform for sharing data with community stakeholders. In: AMIA Annu Symp Proc 2017. p. 1715–23. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5977588/. pmid:29854242
12. Rawlins B. Give the emperor a mirror: toward developing a stakeholder measurement of organizational transparency. Journal of Public Relations Research. 2008;21(1): 71–99.
13. Ihlen Ø, Just SN, Kjeldsen JE, Mølster R, Offerdal TS, Rasmussen J, et al. Transparency beyond information disclosure: strategies of the Scandinavian public health authorities during the COVID-19 pandemic. J Risk Res. 2022;25(10): 1176–89.
14. Kearns F. Getting to the Heart of Science Communication. Island Press; 2021.
15. Li VQT, Yarime M. Increasing resilience via the use of personal data: Lessons from COVID-19 dashboards on data governance for the public good. Data Policy. 2021;3. https://doi.org/10.1017/dap.2021.27.
16. Recchie A, Recchie J, Powell john a., Lyons L, Hardaway P, Ake W. Water Equity and Security in Detroit’s Water and Sewer District [Internet]. Berkeley; 2019. Available from: https://escholarship.org/uc/item/2tv006jd.
17. Gaber N, Silva A, Lewis-Patrick M, Kutil E, Taylor D, Bouier R. Water insecurity and psychosocial distress: case study of the Detroit water shutoffs. J Public Health (Bangkok). 2021 Dec 10;43(4): 839–45. pmid:32930795
18. Klise KA, Murray R, Haxton T. An overview of the water network tool for resilience (WNTR). In: 1st International WDSA / CCWI 2018 Joint Conference. Ontario; 2018. p. 1–8.
19. Allen M, Clark R, Cotruvo JA, Grigg N. Drinking Water and Public Health in an Era of Aging Distribution Infrastructure. Public Works Management and Policy. 2018 Oct 1;23(4): 301–9.
20. Hughes S, Maloney K, Kaczmarek A, Newberry H. Addressing links between poverty, housing, water access and affordability in Detroit [Internet]. Ann Arbor; 2021. Available from: https://detroit.umich.edu/news-stories/addressing-links-between-poverty-housing-water-access-and-affordability-in-detroit/.
21. City of Detroit. Detroit Sustainability Action Agenda [Internet]. 2019. Available from: https://detroitmi.gov/government/mayors-office/office-sustainability/sustainability-action-agenda.
22. U.S. Conference of Mayors, American Water Works Association, Water Environment Federation. Affordability Assessment Tool for Federal Water Mandates [Internet]. 2013. Available from: http://www.awwa.org/Portals/0/files/resources/water utility management/affordability/AffordabilityAssessmentTool.pdf.
23. Teodoro MP. Water and sewer affordability in the United States. AWWA Water Sci. 2019;1(2): e1129.
24. Deitz S, Meehan K. Plumbing Poverty: Mapping Hot Spots of Racial and Geographic Inequality in U.S. Household Water Insecurity. Ann Am Assoc Geogr. 2019;109(4): 1092–109.
25. Meehan K, Jurjevich JR, Chun NMJW, Sherrill J. Geographies of insecure water access and the housing–water nexus in US cities. Proc Natl Acad Sci U S A. 2020;117(46): 28700–7. pmid:33139547
26. People’s Water Board Coalition. People’s Water Board Coalition [Internet]. 2022 [cited 2023 Jun 7]. Available from: https://peopleswaterboard.org.
27. United States Environmental Protection Agency. Effects of Water Age on Distribution System Water Quality [Internet]. Washington, D.C.; 2002. Available from: http://www.epa.gov/safewater/disinfection/tcr/regulation_revisions.html.
28. Hardy LJ, Hughes A, Hulen E, Figueroa A, Evans C, Begay RC. Hiring the experts: best practices for community-engaged research. Qualitative Research. 2016;16(5): 592–600. pmid:27833454
29. Sobeck J, Agius E. Organizational capacity building: Addressing a research and practice gap. Eval Program Plann. 2007;30(3): 237–46. pmid:17689329
30. Sobeck J, Smith-Darden J, Hicks M, Kernsmith P, Kilgore PE, Treemore-Spears L, et al. Stress, coping, resilience, and trust during the Flint Water Crisis. Behavioral Medicine. 2020;46(3–4): 202–16. pmid:32787730
31. Martinez Paz EF, Tobias M, Escobar E, Raskin L, Roberts EFS, Wigginton KR, et al. Wireless Sensors for Measuring Drinking Water Quality in Building Plumbing: Deployments and Insights from Continuous and Intermittent Water Supply Systems. ACS ES&T Engineering. 2022;2(3): 423–33.
32. Banna MH, Imran S, Francisque A, Najjaran H, Sadiq R, Rodriguez M, et al. Online drinking water quality monitoring: Review on available and emerging technologies. Crit Rev Environ Sci Technol. 2014;44(12): 1370–421.
33. James CN, Copeland RC, Lytle DA. Relationships between Oxidation-Reduction Potential, Oxidant, and pH in Drinking Water. In: AWWA Water Quality Technology Conference. United States Environmental Protection Agency; 2004. p. 1–13. Available from: https://cfpub.epa.gov/si/si_public_record_Report.cfm?Lab=NRMRL&dirEntryId=113473.
34. Schubert A, Harrison J, Kent-Buchanan L, Bonds V, McElmurry SP, Love NG. A point-of-use drinking water quality dataset from fieldwork in Detroit, Michigan [Internet]. figshare. 2024 [cited 2024 Feb 11].
35. United States Environmental Protection Agency. Lead Ban: Preventing the Use of Lead in Public Water Systems and Plumbing Used for Drinking Water [Internet]. 1986. Available from: https://nepis.epa.gov/Exe/ZyNET.exe/10003GWO.TXT?ZyActionD=ZyDocument&Client=EPA&Index=1986+Thru+1990&Docs=&Query=&Time=&EndTime=&SearchMethod=1&TocRestrict=n&Toc=&TocEntry=&QField=&QFieldYear=&QFieldMonth=&QFieldDay=&IntQFieldOp=0&ExtQFieldOp=0&XmlQuery.
36. United States Census Bureau. United States Census Quick Facts [Internet]. 2022 [cited 2023 May 31]. Available from: https://www.census.gov/quickfacts/fact/table/US,detroitcitymichigan,MI/PST045222.
37. Day A, Islam K, O’Shay S, Taylor K, McElmurry S, Seeger M. Consumer Response to Boil Water Notifications During Winter Storm Uri. J Am Water Works Assoc. 2022;114(5): 26–33.
38. American Water Works Association. Public Perceptions of Tap Water [Internet]. 2020. Available from: https://www.awwa.org/Portals/0/AWWA/Communications/23001PDFEdits-1.pdf.
39. Yang ZJ, Aloe AM, Feeley TH. Risk Information Seeking and Processing Model: A Meta-Analysis. Journal of Communication. 2014;64(1): 20–41.
40. Crittenden JC, Trussell RR, Hand DW, Howe KJ, Tchobanoglous G. Water Treatment: Principles and Design. 3rd ed. Wiley; 2012.
41. Detroit Water and Sewerage Department. Detroit Water Quality Report [Internet]. 2021. Available from: https://issuu.com/dwsd-publicaffairs/docs/pdf_2018_water_quality_report?fr=sMDVjZTU4NTEyMg.
42. Whitley E, Ball J. Statistics Review 6: Nonparametric methods. Crit Care. 2002;6(6): 509–13. pmid:12493072
43. Sellnow-Richmond D, George A, Sellnow D. An IDEA Model Analysis of Instructional Risk Communication in the Time of Ebola. Journal of International Crisis and Risk Communication Research. 2018;1(1): 135–66.
44. Slovic P. Perceived risk, trust and democracy. The Perception of Risk. 1993;13(6): 316–26.
45. Francisque A, Rodriguez MJ, Sadiq R, Miranda LF, Proulx F. Reconciling “actual” risk with “perceived” risk for distributed water quality: A QFD-based approach. Journal of Water Supply: Research and Technology—AQUA. 2011;60(6): 321–42.
46. United States Environmental Protection Agency. Code of Federal Regulations [Internet]. The United States; 1991. Available from: https://www.govinfo.gov/content/pkg/CFR-2021-title40-vol25/pdf/CFR-2021-title40-vol25-part141.pdf.
47. Pierce G, Gonzalez S. Mistrust at the tap? Factors contributing to public drinking water (mis)perception across US households. Water Policy. 2017 Feb 1;19(1): 1–12. Available from: https://iwaponline.com/wp/article/19/1/1/20521/Mistrust-at-the-tap-Factors-contributing-to-public.
48. Felton R. Should We Break Our Bottled Water Habit? Consumer Reports [Internet]. 2019;1–12. Available from: https://www.consumerreports.org/epa/should-we-break-our-bottled-water-habit-a5667672175/.
49. Jensen O, Chindarkar N. Sustaining Reforms in Water Service Delivery: the Role of Service Quality, Salience, Trust and Financial Viability. Water Resources Management. 2019;33(3): 975–92.
50. Scherzer T, Barker JC, Pollick H, Weintraub JA. Water consumption beliefs and practices in a rural Latino community: Implications for fluoridation. J Public Health Dent. 2010;70(4): 337–43. pmid:20735717
51. Gorelick MH, Gould L, Nimmer M, Wagner D, Heath M, Bashir H, et al. Perceptions about water and increased use of bottled water in minority children. Arch Pediatr Adolesc Med. 2011;165(10): 928–32. pmid:21646572
52. Mostafiz RB, Rohli RV, Friedland CJ, Lee YC. Actionable Information in Flood Risk Communications and the Potential for New Web-Based Tools for Long-Term Planning for Individuals and Community. Front Earth Sci (Lausanne). 2022;10(February).
53. Reynolds B, Seeger M. Crisis and Emergency Risk Communication as an Integrative Model. J Health Commun. 2005;10(1): 43–55. pmid:15764443
54. USEPA. Consumer Confidence Report Rule Revisions [Internet]. 2023 [cited 2023 Jun 7]. Available from: https://www.epa.gov/ccr/consumer-confidence-report-rule-revisions.
55. Parag Y, Timmons Roberts J. A battle against the bottles: Building, claiming, and regaining tap-water trustworthiness. Soc Nat Resour. 2009;22(7): 625–36.
56. American Water Works Association. Trending in an instant: A risk communication guide for water utilities [Internet]. 2019. Available from: https://www.awwa.org/Portals/0/AWWA/Communications/TrendinginanInstantFinal.pdf.
57. Steglitz B, Goetz MK. “Quality Water Matters” Resonates Amid Uncertainty About Emerging Contaminants. J Am Water Works Assoc. 2020;112(7): 52–8.
58. O’Sullivan JJ, Bradford RA, Bonaiuto M, De Dominicis S, Rotko P, Aaltonen J, et al. Enhancing flood resilience through improved risk communications. Natural Hazards and Earth System Science. 2012;12(7): 2271–82.
59. Demeritt D, Nobert S. Models of best practice in flood risk communication and management. Environmental Hazards. 2014 Oct 1;13(4): 313–28.
60. Wilson NJ, Montoya T, Lambrinidou Y, Harris LM, Pauli BJ, McGregor D, et al. From “trust” to “trustworthiness”: Retheorizing dynamics of trust, distrust, and water security in North America. Environ Plan E Nat Space. 2022;6(1): 42–68.
61. Griffin RJ, Dunwoody S. The relation of communication to risk judgment and preventive behavior related to lead in tap water. Health Commun. 2000;12(1): 81–107. pmid:10938908
62. Kasperson RE, Webler T, Ram B, Sutton J. The social amplification of risk framework: New perspectives. Risk Analysis. 2022;42(7): 1367–80. pmid:35861634