
The development of a safe opioid use agreement for surgical care using a modified Delphi method

  • Cassandra B. Iroz,

    Roles Formal analysis, Project administration, Writing – original draft

    Affiliation Northwestern Quality Improvement, Research, & Education in Surgery (NQUIRES), Department of Surgery, Northwestern University Feinberg School of Medicine, Chicago, IL, United States of America

  • Willemijn L. A. Schäfer,

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Writing – original draft

    Affiliation Northwestern Quality Improvement, Research, & Education in Surgery (NQUIRES), Department of Surgery, Northwestern University Feinberg School of Medicine, Chicago, IL, United States of America

  • Julie K. Johnson,

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Supervision, Writing – original draft

    Affiliation Northwestern Quality Improvement, Research, & Education in Surgery (NQUIRES), Department of Surgery, Northwestern University Feinberg School of Medicine, Chicago, IL, United States of America

  • Meagan S. Ager,

    Roles Data curation, Investigation, Project administration, Writing – review & editing

    Affiliation Mathematica Policy Research, Chicago, IL, United States of America

  • Reiping Huang,

    Roles Formal analysis, Methodology, Writing – original draft

    Affiliation Northwestern Quality Improvement, Research, & Education in Surgery (NQUIRES), Department of Surgery, Northwestern University Feinberg School of Medicine, Chicago, IL, United States of America

  • Salva N. Balbale,

    Roles Conceptualization, Formal analysis, Writing – original draft

    Affiliations Northwestern Quality Improvement, Research, & Education in Surgery (NQUIRES), Department of Surgery, Northwestern University Feinberg School of Medicine, Chicago, IL, United States of America, Division of Gastroenterology and Hepatology, Department of Medicine, Northwestern University, Chicago, IL, United States of America, Center of Innovation for Complex Chronic Healthcare, Health Services Research & Development, Edward Hines, Jr. VA Hospital, Hines, IL, United States of America

  • Jonah J. Stulberg,

    Roles Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Supervision, Writing – original draft

    Affiliation Department of Surgery, University of Texas McGovern Medical School, Houston, TX, United States of America

  • on behalf of the Opioid Agreement Delphi Group

    Membership of the Opioid Agreement Delphi Group is provided in Acknowledgments.



Abstract

Background

Opioids prescribed to treat postsurgical pain have contributed to the ongoing opioid epidemic. While opioid prescribing practices have improved, most patients do not use all their pills and do not safely dispose of leftovers, which creates a risk for unsafe use and diversion. We aimed to generate consensus on the content of a “safe opioid use agreement” for the perioperative setting to improve patients’ safe use, storage, and disposal of opioids.


Methods

We conducted a modified three-round Delphi study with clinicians across surgical specialties, quality improvement (QI) experts, and patients. In Round 1, participants completed a survey rating the importance and comprehensibility of 10 items on a 5-point Likert scale and provided comments. In Round 2, a sub-sample of participants attended a focus group to discuss items with the lowest agreement. In Round 3, the survey was repeated with the updated items. Quantitative values from the Likert scale and qualitative responses were summarized.


Results

Thirty-six experts (26 clinicians, seven patients/patient advocates, and three QI experts) participated in the study. In Round 1, >75% of respondents rated at least four out of five on the importance of nine items and on the comprehensibility of six items. In Round 2, participants provided feedback on the comprehensibility, formatting, importance, and purpose of the agreement, including a desire for more specificity and patient education. In Round 3, >75% of respondents rated at least four out of five for comprehensibility and importance of all 10 updated items. The final agreement included seven items on safe use, two items on safe storage, and one item on safe disposal.


Conclusions

The expert panel reached consensus on the importance and comprehensibility of the content for an opioid use agreement and identified additional patient education needs. The agreement should be used as a tool to supplement rather than replace existing, tailored education.


Introduction

Postoperative prescription opioid use has contributed to the ongoing opioid epidemic. A substantial portion of opioid prescriptions in the United States is written for acute pain management after surgery [1, 2], and these prescriptions can still lead to chronic use [3–5]. On average, 70% to 90% of opioid pills prescribed following surgery remain unused after the initial pain episode [6–11]. Over 70% of surgery patients do not dispose of their unused opioids [9, 12], increasing the risk for non-medical use [13–15], adverse drug events, and use of other illicit drugs [13, 16]. Therefore, safe disposal is key to preventing adverse events and diversion [17–20].

Engaging patients and clinicians is critical to address the opioid epidemic. While various initiatives have successfully reduced opioid prescribing [21–25], there is an ongoing need for clinicians to better engage patients in safe opioid use practices [26]. Opioid use agreements, also known as opioid contracts [27], are one tool to engage patients, are well-established in primary care [28–32], and have been shown to increase patients’ participation in their care [33] and reduce opioid use [28, 31]. To date, studies on these agreements in pain clinics and primary care have shown reductions of 7% to 23% in opioid misuse [34–37], although specific effects on disposal have not been measured [38]. However, currently available opioid use agreements have shown some shortcomings in their design. A national survey of providers showed that most agreements were written above recommended reading levels, highlighting the need for a more patient-centric approach [39].

Use of agreements in acute pain management offers a logical extension of current practices from chronic pain management. However, agreements have not been studied in the perioperative setting. Opioid use agreements have primarily been used in populations with long-term opioid use to treat chronic pain, whereas opioids prescribed following surgery are intended to treat acute short-term pain [27]. Additionally, the patient-clinician relationship can differ between primary care where there are often long-term relationships versus surgery, which is characterized by more episodic, acute care. It is therefore important that we understand what is needed from patient, provider, and quality improvement (QI) perspectives in a surgery-specific context.

We therefore aimed to develop and generate consensus on the content of a safe opioid use agreement to improve safe use, storage, and disposal of opioids prescribed after surgery.


Materials and methods

Study design

We conducted a modified three-round Delphi study to generate consensus on the content of a safe opioid use agreement among a diverse stakeholder panel including patients, surgeons, nurses, pharmacists, and QI experts. The Delphi method is a reliable way to gather consensus from a group of experts [40]. It is a flexible approach, and while there are no universal guidelines, there are recommendations for best practices [41, 42]. In a Delphi study, experts rate items then reevaluate their ratings until consensus is reached. The process for our study took place from July to October 2020.


Expert panel

We selected experts for our panel using purposeful and snowball sampling techniques, ensuring representation of each stakeholder group and including participants with relevant clinical expertise. This was done by reaching out to our research team’s professional network and targeting individuals who were interested in, or had participated in, opioid reduction initiatives. We invited surgeons and inpatient and outpatient nurses from a variety of specialties (orthopedics, urology, trauma, gynecology, surgical oncology, and general surgery), pharmacists, QI experts, and patients, all from a single healthcare system. We also asked participants to suggest additional stakeholders to include, particularly within the nursing groups. All clinician participants worked with adult patients at a large, private, urban academic medical center in the United States. Patient representatives were members of our healthcare system’s Patient Family Advisory Council and were asked to participate and provide their perspective on how a patient might interpret and perceive the agreement.

Rounds of the Delphi study

We followed a three-round modified Delphi approach (Fig 1). First, our research team developed a draft agreement including ten items based on existing opioid use agreements used in primary care [43–47]. Items included in the draft agreement were chosen through discussions within the research team and were based on publicly available opioid agreements and our previous research on surgical opioid reduction within our healthcare system. For example, our previous work showed that communicating and setting expectations about pain relief as well as discussing safe disposal of leftover opioids were important areas for improvement, so they were included in the draft agreement [48, 49]. We developed a survey to rate the importance and comprehensibility of each item of the draft agreement. The survey included free text fields for participants to add explanations on the importance and/or comprehensibility. The survey also asked if there were any topics that were not covered in the draft agreement that they believed should be included.

Fig 1. Outline of the Delphi study.

The modified Delphi study occurred in three rounds, starting with a survey, followed by a focus group, and ending with a final survey.

Pilot testing.

Pilot testing is recommended before starting a Delphi study [41, 50]. We pilot tested the survey with individuals in our personal networks of various ages and areas of expertise (n = 7) and, based on their feedback, made changes to the survey’s comprehensibility, the agreement’s content, and its length. We also revised the response categories to reflect validated five-point Likert scales, ranking comprehensibility as “Very poor”, “Poor”, “Fair”, “Good”, or “Very good”, and importance as “Not important”, “Slightly important”, “Moderately important”, “Important”, or “Very important”. Finally, we added a brief introduction to the agreement, reworded several items, and changed the item order.

Round 1.

In Round 1, the expert panel received the survey (S1 File) via email and completed it in Qualtrics software (Qualtrics, Provo, UT). The survey was open for 1.5 months and participants were sent one reminder. We then distributed feedback reports to the participants, including a summary of the survey results compared to their individual answers so they could reflect on their responses and revise their opinions for the following rounds (an example feedback report is available in S2 File).

Round 2.

In Round 2, we invited 15 members of the panel to participate in a focus group. Focus group participants were selected to ensure representation of the different stakeholder groups and surgical specialties. The group was limited to 15 participants to allow for interactive conversations. The focus group lasted 90 minutes, was conducted virtually, and was audio recorded and transcribed for analysis. Participants had the opportunity to provide input through the chat function. There were four moderators who were members of the research team and ensured 1) discussion of comments typed in the chat function, 2) an opportunity for all participants and participant groups to talk and state their opinions, and 3) strict time management. The focus group was semi-structured, and the presentation slides served as the moderator guide. The moderators presented background information on surgical opioid prescribing rates and research on the importance of patient education. The participants were then shown the quantitative and qualitative responses from the Round 1 survey and asked to discuss all items where less than 85% of respondents rated at least four out of five on the Likert scale (i.e., “Good” or “Very good” for comprehensibility and “Important” or “Very important” for importance). While greater than 75% was defined as acceptable a priori, for the focus group we chose to discuss items with less than 85% agreement because agreement on the Round 1 survey was high and we wanted feedback on how to improve the lower-scoring items. The moderators then asked the participants open-ended questions to elicit their opinions on the importance and/or comprehensibility of each of the selected items. The moderators specifically called on participants from all stakeholder groups to ensure that all groups were represented.
We also used the polling function to measure agreement during the focus group, and results were summarized along with the qualitative comments and shared with participants before the Round 3 survey. The focus group concluded with a discussion on the purpose of the agreement, length, introductory text, and additional topics to cover in the agreement.

Following the focus group, the research team used the transcript to update the agreement. This included creating a document summarizing the areas where additional patient education was needed to support effectiveness of the agreement (S3 File). The language of the agreement was then reviewed and edited by a Patient Education Specialist, who was independent from the research team, to reflect a sixth to eighth grade reading level.
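The sixth- to eighth-grade reading-level target can be approximated with an automated readability formula. The sketch below uses the Flesch-Kincaid grade-level formula with a crude syllable-counting heuristic; it is illustrative only (the study relied on a Patient Education Specialist, not an automated tool), and the example sentence is hypothetical.

```python
import re

def count_syllables(word):
    """Crude syllable heuristic: count vowel groups, discount a silent final 'e'."""
    word = word.lower()
    groups = re.findall(r"[aeiouy]+", word)
    n = len(groups)
    if word.endswith("e") and n > 1 and not word.endswith(("le", "ee")):
        n -= 1  # treat trailing 'e' as silent (e.g., "take", "medicine")
    return max(n, 1)

def fk_grade(text):
    """Flesch-Kincaid grade level: 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59

# A short, plainly worded item scores well below an eighth-grade level:
print(round(fk_grade("I will take my medicine as directed."), 1))
```

Production-grade readability checks would use a validated implementation (and often several formulas in combination) rather than this simplified heuristic.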

Round 3.

In Round 3, we repeated the survey from Round 1 with the revised items. The survey was sent to the entire 36-member panel, along with a feedback report that included detailed explanations of changes made. The survey was open for one month and participants were sent up to one reminder. In the final agreement, we included items for which more than 75% of participants rated at least four out of five on the Likert scale for importance and comprehensibility. There is no standard definition of consensus for Delphi studies with recommendations ranging from 51% to 100%, but 75% is a common benchmark [41, 42, 51].

Data analysis

Survey responses were exported from Qualtrics (Qualtrics, Provo, UT) to Microsoft Excel (Microsoft Corporation, Version 2304) and quantitative values were summarized using descriptive statistics. Qualitative responses from the free text fields in the Round 1 and Round 3 surveys as well as the transcript of the focus group from Round 2 were summarized using a simple thematic analysis approach and lean coding [52, 53]. First, two researchers (CBI & WLS) reviewed all qualitative comments provided on the Round 1 and Round 3 surveys and the transcript from the focus group. Second, the two researchers discussed the qualitative data to develop codes inductively, which were applied to the data through group discussion. For example, quotes relating to the desired reading level of the items were labeled as “reading level.” Third, they created a table grouping the broad codes to identify overarching themes [52]. For example, the “reading level” code was assigned to the theme “comprehensibility.” Finally, themes and example quotes were discussed with two additional team members (JKJ & SNB) and refined for clarity.
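The quantitative consensus check described above reduces to a simple per-item calculation: the share of respondents rating the item at least four out of five, compared against the pre-specified >75% threshold. A minimal sketch follows; the item names and Likert ratings are hypothetical illustrations, not study data.

```python
CONSENSUS_THRESHOLD = 0.75  # a priori definition of consensus (>75% rating >= 4/5)

def consensus_share(ratings):
    """Fraction of respondents rating an item at least 4 on the 5-point Likert scale."""
    return sum(r >= 4 for r in ratings) / len(ratings)

def meets_consensus(ratings, threshold=CONSENSUS_THRESHOLD):
    """True if the share of >= 4 ratings strictly exceeds the threshold."""
    return consensus_share(ratings) > threshold

# Hypothetical ratings (1-5) from ten respondents for two items:
item_ratings = {
    "Item A (safe use)": [5, 4, 4, 5, 3, 4, 5, 4, 4, 5],  # 9/10 rate >= 4
    "Item B (storage)":  [3, 4, 2, 5, 3, 4, 3, 4, 3, 4],  # 5/10 rate >= 4
}

for item, ratings in item_ratings.items():
    share = consensus_share(ratings)
    print(f"{item}: {share:.0%} rated >= 4 -> consensus: {meets_consensus(ratings)}")
```

The same function with an 85% threshold would reproduce the stricter cutoff used to select items for the Round 2 focus group discussion.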


Ethics statement

Northwestern University’s Institutional Review Board reviewed the study and determined that it did not qualify as human subjects research (STU00212619). All participants were informed about the purpose of the research, procedures, potential risks and benefits, and that participation was fully voluntary and could be stopped at any time. Each expressed their consent to participate in writing for the survey and again verbally for the focus group.


Results

Our panel consisted of 36 experts including nine pharmacists, seven nurses, seven surgeons, seven patients/patient advocates, three QI experts, one nurse practitioner, one anesthesiologist, and one emergency medicine physician. Thirteen experts participated in all three rounds (Table 1).

Table 1. Participants for each round of the Delphi study.

Round 1 results

In the Round 1 survey, more than 75% of respondents rated nine items as “Important” or “Very important” and six items as “Good” or “Very good” for comprehensibility (Table 2). We summarized the qualitative comments into ten themes (Table 3). Qualitative feedback from participants revealed a desire for greater specificity; issues with comprehensibility, including concerns about medical terminology such as “respiratory depression” and complexity of wording; concerns about the effectiveness of the agreement in practice; questions on the purpose of the agreement; concerns about Item 10, related to the Prescription Monitoring Program (PMP); and a desire for language on shared responsibility. Minor updates (i.e., correcting spelling and grammar) were made to the agreement prior to the Round 2 focus group.

Table 2. Percentage of participants who rated at least four out of five for Round 1 and Round 3.

Green cells represent items that met the threshold for consensus (>75% of respondents rating at least four out of five) and red cells represent items that did not meet the threshold for consensus.

Table 3. Summary of qualitative feedback.

Themes are presented with example quotations from each round.

A total of 19 participants (53%) responded “Yes” to the question asking if there were additional topics that were not covered that they believe should be included. Participants reported the need for additional patient education, including information on alternative pain management strategies, information about how to safely discontinue opioid use, and additional risks of opioids including sedation, interaction with alcohol, and the risk for addiction. Further, participants wanted more specificity on how patients should communicate with their healthcare provider, and the types of information that patients should share, such as notifying their provider about over-the-counter medications. Another suggestion was to include an item about patients not receiving opioid prescriptions from other providers. Finally, one participant was concerned that the agreement would make patients hesitant to take their prescribed opioids.

Round 2 results

Fifteen members of the expert panel were invited to and participated in the focus group (Table 1). Themes from the focus group are summarized in Table 3. During the focus group, the panel discussed how to improve the comprehensibility for Items 2, 3, and 9. For Item 5 on opioid interactions, the group discussed the wording and layout as well as which medications were important to include in the list with potential interactions. In both Rounds 1 and 2, some participants suggested adding more detail to various items. Through discussion in the focus group, it was decided that by keeping the items more general, the agreement could be more easily adapted to different surgical specialties and practices. The participants identified additional education needs for Items 2, 3, 5, 7, and 9. Examples of such education included when, how, and with whom to communicate about pain, individualized examples of potential drug interactions, and information about how to safely store and dispose of opioids. The focus group also included discussion on the potential effectiveness of the agreement, purpose of the agreement, and questions about the Prescription Monitoring Program.

The Patient Education Specialist reviewed the agreement after the focus group, made some wording adjustments (such as changing “medication” to “medicine”), and suggested we include generic as well as brand names for the medications listed in Item 5.

Round 3 results

For the Round 3 survey, 29 of the original 36 experts (80.6%) responded (Table 1). All ten items met the final threshold for inclusion, with greater than 75% of respondents rating each item as “Important” or “Very important” and “Good” or “Very good” for comprehensibility (Table 2). There were far fewer qualitative comments on the Round 3 survey than on the Round 1 survey. The remaining comments reflected a continued desire for specificity, comments on the improved comprehensibility along with additional slight modifications, a minor formatting suggestion, and comments on the purpose of the agreement (Table 3). Minor edits were made based on these comments, and the final safe opioid use agreement is available in Fig 2. The final agreement included seven items on safe use, two items on safe storage, and one item on safe disposal.

Fig 2. The safe opioid use agreement.

The Safe Opioid Use Agreement for the perioperative setting, as developed by the Delphi study participants, included ten statements and an introduction describing the purpose of the agreement.


Discussion

Through a modified Delphi study, a diverse expert panel of clinicians, QI experts, and patients reached consensus on the content of a safe opioid use agreement for the perioperative setting. Ten items on the safe use, storage, and disposal of opioids prescribed for postoperative pain management were included and refined throughout the three rounds.

Our modified Delphi study was conducted in three rounds, with a focus group in the second round to engage our expert panel in a live exchange of ideas and gather opinions from different stakeholder groups. While the traditional Delphi approach relies on participants never interacting with one another [54, 55], including a focus group is a common modification, especially in healthcare quality studies [56]. One of the benefits of the Delphi approach is that the “grassroots involvement” can be highly motivating for participants [54].

In a few instances, the percentage of participants who rated at least four out of five for comprehensibility or importance decreased from Round 1 to Round 3 (Table 2). It is important to note that the sample size was larger in Round 1 (n = 36) than in Round 3 (n = 29). Despite these decreases, more than 75% of participants still rated at least four out of five, which met our a priori definition of consensus. We also found no comments in the qualitative feedback from Round 3 that indicated any new concerns.

A common theme in the qualitative comments was a desire for more specificity or more detail on the included items. Perhaps the most important lesson learned from the Delphi study was the need for the research team to be clear that the purpose of the agreement was not to replace existing patient education on safe use, storage, and disposal of prescription opioids, but to act as a behavioral modification tool to improve compliance with safe opioid practice and to supplement thorough pain management education. Despite this consensus during the focus group, comments from the survey in Round 3 indicated that participants might not have understood this goal and wanted to continue to add details to the agreement to ensure all the information patients need was included in the document. This clarity on the purpose of the agreement needs to be a central message during the implementation of the agreement. There is a risk that healthcare providers will forego personalized patient education during the clinical visit or remove existing patient education tools because of the presence of the agreement, which would be antithetical to the intended use.

Due to the recurring theme of a desire for more specificity or more detail, we outlined the additional patient education needs suggested by the expert panel (S3 File). The purpose of this document is to cue the person administering the agreement to additional education they might need to provide to ensure the patient is able to understand and comply with the item in the agreement. Clinical practice guidelines recommend “patient and family-centered, individually tailored education” [57]. Clinicians will continue to need to have individualized, patient-centered discussions on opioid safety and this format can help guide that discussion. In the next phase of implementation, we will interview clinicians about this document and other needs for opioid patient education materials.

Lack of disposal of unused prescription opioids is a significant problem and creates risk for diversion and misuse. Despite efforts to reduce opioid prescribing [21–25], there remains a need to focus on patient engagement in safe opioid use. The number of pills used for a given pain episode can vary greatly [9], meaning there will likely always be a risk of unused pills, potential for diversion, and need for safe disposal. Item 9, which was specifically about disposal, was rated as “Important” or “Very important” by 90% of the participants, underscoring the necessity to discuss safe disposal with patients. Additionally, Item 8, which relates to the issue of diversion, was rated “Important” or “Very important” by 97% of the participants.

Reducing opioid use in surgery requires multicomponent efforts to be successful [24]. The safe opioid use agreement, developed through the Delphi study, can serve as one piece of that multifaceted strategy. As we found in this study, the agreement cannot stand alone, but should be used as one component of a broader strategy to improve prescribing and patient education on pain management strategies. Additionally, the agreement might be especially helpful for patients who are at higher risk of opioid misuse, including those who use opioids before surgery [58].

Other studies have found limited effectiveness of opioid agreements in primary and chronic care settings [29, 38, 59]. Limitations of these previous studies include inconsistent use of the agreement, sampling bias, lack of a consistent definition of opioid misuse or abuse, and insufficient description or sharing of the actual text of the agreement. Approaches to reducing opioid addiction have been categorized as primary (preventing new cases of opioid addiction), secondary (identifying early cases of opioid addiction), and tertiary (ensuring access to effective addiction treatment) [60]. Previous studies have mostly focused on secondary prevention, working with populations who chronically use opioids, whereas this study was intended to develop an agreement for primary prevention, reducing the risks for patients prescribed opioids for acute pain management. To increase the likelihood our agreement would be effective, we followed a robust Delphi approach to develop content that was relevant to the patient population and easy to understand. We engaged a diverse panel of key stakeholders from various professions and specialties to generate consensus on the content of the agreement and had the language reviewed by a Patient Education Specialist. The effectiveness of our agreement is still unknown. The next step for our research team is to study the effectiveness of the agreement at improving safe use, storage, and disposal of opioids prescribed for postoperative pain management.

Strengths and limitations

We noted a few limitations of our study. First, the agreement was developed only in English, limiting its use in diverse patient populations. Second, we only sought experts from one healthcare system, which perhaps limits the applicability in other settings. This focus on our own healthcare system was purposeful, as we wanted to develop, implement, and test the agreement locally before disseminating to other settings. Reliability and validity of results from Delphi studies have been questioned, but including experts in the field of study strengthens the validity [41, 50, 61]. It is possible, and perhaps likely, that a similar expert panel with different participants would have developed a slightly different final agreement. Third, not all 36 members of the expert panel participated in the focus group. This was intentional to allow for more interactive discussion. However, one limitation with this approach is that the rest of the panel did not hear what was discussed in the focus group, and responses in Round 3 continued to request more detail, a topic discussed in the focus group. Fourth, while we provided feedback reports, we cannot guarantee that participants reviewed them and considered their peers’ feedback in their Round 3 responses. Finally, the nature of hierarchical relationships in medicine between the various stakeholder groups might mean that some individuals did not speak freely in the focus group (e.g., patients might have resisted disagreeing with physicians). However, given the purpose of the study (i.e., consensus building) it was important to have all stakeholder groups in one focus group, and we believe we mitigated this bias through the moderation process.

There are some notable strengths in our approach. First, we involved a diverse group of stakeholders with different perspectives, expertise, and roles. The inclusion of pharmacists, nurses, surgeons, QI experts, and physicians in other specialties (i.e., anesthesiology and emergency medicine) provided a well-rounded clinical perspective. Including patients and patient advocates was also essential to understand the impact this agreement might have on patients and how they understand the agreement. Second, we pilot tested our survey, which is recommended as a best practice [41, 50]. Third, we maintained a high response rate throughout the entire study, which is notable given that attrition is a common issue with Delphi studies [62].


Conclusion

Through engaging a diverse expert stakeholder panel of physicians, nurses, pharmacists, QI experts, and patients, we were able to develop a safe opioid use agreement tailored to the surgical setting. The focus group further engaged the panel to support implementation of the agreement in the next steps of our study. The agreement cannot, and is not intended to, stand alone as the only aspect of patient education on safe opioid use after surgery and instead is intended to be used as a behavior change mechanism supplemented with additional, individualized patient education. Future research will test the effectiveness of the agreement at improving safe use, storage, and disposal of opioids prescribed to manage postoperative pain.

Supporting information

S1 File. Opioid agreement Delphi questionnaire.

Participants completed the Round 1 and Round 3 surveys virtually through a secure online platform.


S2 File. Example feedback report to participants.

After each round, participants received an individualized feedback report that compared their responses to the summary of responses from the Delphi panel.


S3 File. Education document.

A document was drafted to address the additional patient education needs identified by the participants.



Acknowledgments

We would like to thank the Opioid Agreement Delphi Group for their time, effort, insight and experience: Amanda Vlcek, MHA, LCSW, Anne Bobb, Anne Stey, MD, MSc, Betsy Ross, RN, Christine Schilling, RN, Cindy Barnard, PhD, MBA, MSJS, Colleen Garrity, RN, CAPA, Daniel Kufer, David Bentrem, MD, David Kalainov, MD, MBA, Elizabeth Shepard, PharmD, BCPS, BCCCP, Gregory Auffenberg, MD, Hector Nunez, Howard Kim, MD, MS, Jeff Gardner, Jennifer Carvajal, APRN, CNP, Jill Baker, Joseph Posluszny, MD, Joshua Halpern, MD, MS, Karen Burnett, Kelly Blanchard, RN, Kevin Bajer, PharmD, BCPS, Kristen March, BS, PharmD, BCPS, Laura Batz Townsend, Laura Lane, PharmD, Linda Louis, RN, Magdy Milad, MD, MS, Michelle Starzyk, RN, Mileta Kemeza, PharmD, Morgan Aleshire, BSN, RN, CMSRN, Paloma Toledo, MD, MPH, Rachael Freeman, PharmD, BCPS, Rachel Joseph, PharmD, BCPS, Rebecca Jett, PharmD, Sterling Elliott, PharmD, BCMTMS, and Susan Jacobson. At the time of the study, all participants of the Opioid Agreement Delphi Group were affiliated with Northwestern Medicine. The group was led by Sterling Elliott, PharmD, BCMTMS.

