Advancing implementation science in community settings: the implementation strategies applied in communities (ISAC) compilation

Abstract

Background

Implementation strategies have predominantly been operationalized and studied in clinical settings. Implementation strategies are also needed to improve evidence-based intervention (EBI) integration in community settings, but there is a lack of systematic characterization of their use, which limits generalizability of findings. The goals of this study were to determine which implementation strategies are most used to deliver primary prevention EBIs in community settings, develop a compilation and pragmatic strategy selection process with accompanying guidance tools, and understand practitioners’ preferences for dissemination.

Methods

Purposive and snowball sampling was used to recruit community setting researchers and practitioners delivering primary prevention EBIs (nutrition, physical activity, tobacco prevention) in community settings: education, social services, city planning and transportation, workplaces, recreation/sport, faith-based, and other public health organizations. Semi-structured interviews were conducted using a guide based on the reach, effectiveness, adoption, implementation, maintenance (RE-AIM) framework. Participants were asked to describe barriers experienced and strategies used to overcome them within each RE-AIM dimension. Practitioners were also asked about preferred dissemination strategies, prompted by Diffusion of Innovations theory concepts of sources (who provides information) and channels (how information is provided). A rapid deductive approach was used to analyze findings with a coding matrix aligned with the interview guide.

Results

Researchers (n = 10) and practitioners (n = 8) across all targeted settings and intervention outcomes completed interviews. Interviewees shared unique implementation strategies (N = 40) which were used to overcome barriers related to multiple RE-AIM dimensions, most commonly implementation (n = 29) and adoption (n = 27). Most frequently mentioned implementation strategies were conduct pragmatic evaluation (n = 31), provide training (n = 26), change adaptable program components (n = 26), and leverage funding sources (n = 21). Webinars (n = 6) and listservs/newsletters (n = 5) were the most mentioned dissemination channels; national public health organizations (n = 13) were the most mentioned sources.

Conclusions

Results reflect commonly used implementation strategies in community settings (e.g., training, technical assistance) and add novel strategies not reflected in current taxonomies. Dissemination preferences suggest the need to involve broad-reaching public health organizations. The resultant compilation (Implementation Strategies Applied in Communities) and strategy selection process provide resources to assist researchers and practitioners in applying strategies and improving EBI delivery in community settings.

Background

Translating evidence-based interventions (EBIs) into real-world settings is challenging: only an estimated 14% of original research is delivered as intended, and this process takes 15–17 years on average [1,2,3]. As a result, many EBIs are not reaching community members who could benefit from them. These issues are well recognized in implementation science [4,5,6]. However, implementation science methods, models, and measures are predominantly grounded and applied in clinical settings [7,8,9,10,11]. While translating EBIs into practice is challenging in clinical settings, community (non-clinical) settings face distinct challenges that may require specific methods, models, and measures [12,13,14].

Research to practice challenges also hinder achieving health equity (i.e., opportunities for all to achieve optimal health) [15,16,17,18,19]. Barriers to integrating EBIs can be exacerbated in settings with lower resources, which often reach community members facing health disparities [13, 16, 20]. For example, adoption may be affected by competing demands and burdensome programs; implementation may be impacted by EBIs that are considered a poor fit or by a lack of implementation support; reach may be lower due to participation barriers (e.g., work schedules or limited transportation) and limited culturally appropriate promotion; and maintenance can be difficult due to lower resources and capacity among community settings [21]. When differences between settings are not addressed, health outcomes may improve in higher-resource but not lower-resource settings, leading to inequalities generated by EBIs [22,23,24].

Taken together, to improve overall public health impact in community settings and avoid exacerbating health disparities, there is a need to improve reach, adoption, implementation, and maintenance of EBIs – especially in settings with lower resources. The field of implementation science offers opportunities for researchers and practitioners to improve implementation outcomes including adoption, implementation, and maintenance through use of implementation strategies, defined as active methods or techniques to move research to practice [25]. Researchers have developed a strong foundation of implementation strategies, with multiple compilations to choose from, such as the Expert Recommendations for Implementing Change (ERIC) compilation [10], Effective Practice and Organization of Care [26], and Leeman et al.’s classes of implementation strategies [27]. While these compilations have made valuable contributions to the field, challenges remain: most compilations originated in clinical healthcare settings and use clinical terminology [10, 26, 27]; others have been adapted for single settings and services (e.g., youth behavioral health) [28]. Overall, existing compilations are not easily applied to diverse community settings [12].

Community settings are organizations that deliver public health EBIs but have missions beyond health care [13]. They include education, social services, city planning and transportation, workplaces, recreation/sport, faith-based [13], and other non-clinical public health settings (e.g., the national Cooperative Extension System and non-clinical services provided by public health departments) [12, 13]. Community settings deliver programs in settings where people live, learn, work, and play [29, 30]. In the field of chronic disease prevention, EBIs delivered in these settings are typically primary prevention interventions [13], directed at modifiable risk factors (e.g., dietary quality, physical activity, tobacco use) [13, 31]. These EBIs are increasingly delivered at the policy, systems, or environmental levels (vs. individual or interpersonal levels) [32, 33] to address social determinants of health [23, 29, 34].

Considering the diverse community settings in which EBIs are implemented and the multi-level, population-focused nature of the EBIs, it may be more difficult to integrate EBIs in community settings than in clinical settings. While determinants at many levels affect EBI integration, the inner setting [35] in particular differs between clinical and community settings. First, community settings often have missions, cultures, capacity, and resources that are not focused on public health [13]. For example, the primary focus may be on youth development, education, or profit (e.g., retail settings) [36,37,38]. Second, community settings often do not have consistent resources or funding to support public health EBIs [13, 35]. Third, community settings may lack access to health-related data [13], and evaluating intervention effectiveness is often a challenge [39,40,41]. Fourth, the success of interventions in community settings may rely on recruiting the priority population to participate (e.g., in health promotion programs) [39, 42], whereas clinical setting interventions reach existing patient populations. Thus, relevant implementation strategies are needed to support community setting researchers and practitioners in overcoming these barriers.

While implementation strategies are needed to improve EBI integration in community settings, there is a lack of systematic characterization of their use. This may result from perceived limited applicability, a lack of relevant language, and/or dissemination challenges for existing compilations [12,13,14]. When practitioners and researchers in community settings use implementation strategies, they often do not describe them with terminology that matches available compilations [12, 14]. Among implementation strategies used in community settings that are specified by name, the scope of use appears limited [12]. Practitioners and researchers in community settings often report using “training and hoping” [43] and other implementation strategies addressing individual-level barriers, such as educational meetings and materials or outreach visits, instead of using the full spectrum of available implementation strategies to address barriers at organizational as well as individual levels [14].

These challenges have slowed the advancement of implementation science in community settings and resulted in a lack of information on which implementation strategies are used in community settings and their impact on EBI delivery [12, 14]. To address these gaps, previous work introduced community setting researchers and practitioners to implementation strategies by providing terminology and examples [12]. This included language tailored from the ERIC compilation and provided relevant community setting examples [12]. For instance, “patients” was changed to “participants” or “priority population” and “clinical innovation” was changed to “evidence-based intervention” [12]. However, questions remain related to which implementation strategies are relevant in community settings and which may not have been captured through tailoring a clinical setting compilation.

In addition, there is a need to provide guidance to researchers and practitioners in community settings on selecting relevant implementation strategies to overcome barriers and capitalize on facilitators. Existing implementation strategy selection methods (e.g., concept mapping, group model building, conjoint analysis, implementation mapping, and causal pathway development) can be complex and costly, and are limited in the extent to which they are used in community settings [44,45,46,47]. A simplified approach is needed for community settings to engage in the implementation strategy selection process without dedicated staffing or time [48]. For instance, a pragmatic process with a guidance tool for community settings could be a useful starting point for considering implementation determinants or outcomes and matching implementation strategies [49]. To increase practitioners’ use of a pragmatic compilation and guidance tool, there is a need to understand how to disseminate this work in open-access, community-friendly methods [50]. Given these considerations, the goals of this study were to (1) determine which implementation strategies are commonly used in community settings delivering primary prevention EBIs, and which implementation determinants and outcomes they are used to address, (2) develop a compilation of Implementation Strategies Applied in Communities (ISAC) and a pragmatic strategy selection process with accompanying guidance tools based on this information, and (3) understand practitioners’ preferences for learning about the compilation.

Methods

Study design

The study employed an emic (i.e., focusing on the “insider’s perspectives”), qualitative design following an adapted freelisting approach [51, 52]. That is, instead of beginning with and adapting an existing implementation strategy compilation, the goal was to understand implementation barriers and strategies used to overcome them in participants’ own words. This research was deemed exempt by the University of Nebraska Medical Center IRB, #0257-23-EX.

Participants and recruitment

Purposive and snowball sampling [53, 54] were used to recruit public health researchers and practitioners operating across seven types of community settings: education, social services, city planning and transportation, workplaces, recreation/sport, faith-based, and other non-clinical public health settings [13]. Purposive sampling was used to identify researchers and practitioners through the lead author’s (LB) previous experience in community-based implementation science and by engaging with the Community Participation Action Group of the Consortium for Cancer Implementation Science (CCIS) [55, 56]. Snowball sampling was conducted by asking researcher interviewees for recommendations of community partners to interview.

To engage with prospective participants, up to three recruitment emails (Supplemental File 1) were sent including the purpose of the research and a link to a screening survey (Supplemental Files 2 and 3) regarding participation eligibility. To be eligible, researchers had to have experience conducting community-engaged research to improve the adoption, implementation, or maintenance of an EBI in physical activity, nutrition, or tobacco prevention (primary prevention for chronic diseases) [57,58,59] in non-clinical settings. For practitioners, the eligibility requirement was centered on experience managing or coordinating these types of EBIs. The screening survey also queried data pertaining to (1) which of the noted settings respondents conducted research/worked in, (2) years of research/work experience, and (3) examples of up to three EBIs respondents had implemented, including the intervention level (individual or interpersonal and/or policy, systems, or environmental) [32, 60] and primary outcomes (physical activity, nutrition, or tobacco use). Researchers also self-reported their expertise level in implementation science (beginner, intermediate, advanced) [61]; practitioners were not asked this question.

Recruitment emails were sent to 23 researchers and 37 practitioners (n = 60). Of these potential participants, 10 researchers and eight practitioners (n = 18) completed the screening survey and participated in an interview. Those who did not participate either did not complete the screening survey (n = 33) (two provided reasons, noting time and employer constraints), could not be reached due to undeliverable email addresses (n = 6), or completed the screening survey but were ineligible to participate (n = 2) because they answered “no” to the eligibility question (“Through your community-engaged research, have you worked to improve the adoption, implementation, or maintenance of an evidence-based intervention aimed at improving physical activity, nutrition, or tobacco patterns/practices?”). Three of the 60 potential interviewees requested a colleague be recruited in their place; two of these contacts subsequently completed interviews.

Data collection

Semi-structured interviews were conducted and recorded using Zoom from May to August 2023. Interviews lasted an average of 42 min (range of 25–58 min). All interviewers were female with a PhD or master’s degree and experienced in qualitative research and the RE-AIM (reach, effectiveness, adoption, implementation, maintenance) framework [62]. At the time of the study, LB was a Research Scientist, WC was a Project Manager, HL was a Medical Instructor, MW was a Postdoctoral Fellow, and EP was a Graduate Research Assistant. Participants received a $25 gift card as compensation.

Researcher and practitioner interview guides (Supplemental Files 4 and 5) were based on the RE-AIM framework [62]. Participants were first asked to describe the EBIs they have implemented, then were guided through each RE-AIM dimension and asked what barriers occurred during associated research or practice approaches and what strategies were used to overcome barriers. Practitioner interviewees were also asked about preferred dissemination methods for the resultant implementation strategy compilation following Diffusion of Innovations theory [63], regarding sources (who should provide the information) and channels (how should the information be provided). A knowledgeable public health and practice expert was engaged to pilot test and refine interview language before data collection.

Analysis

Microsoft Excel was used for descriptive statistics including characteristics of the participants and the EBIs captured from the screening survey. A rapid deductive analysis approach was used to analyze semi-structured interview findings [64,65,66]. To facilitate this process, a template was developed for structured note-taking according to the interview guide, including RE-AIM dimensions, barriers addressed, descriptions of the implementation strategies used to overcome barriers, and a brief descriptive implementation strategy “code” [67] to aid in comparisons across interviews. Another template was developed to capture practitioners’ dissemination preferences and descriptions, which included dissemination strategies and dissemination strategy codes. Two researchers independently listened to one interview and noted findings using the template to test and refine prior to moving forward with data analysis. Audit trails were maintained for all steps of the analysis process [68].

Interviewers completed the rapid analysis template during or after each interview to ensure all information relevant to the analysis was captured. Interviewers also recommended strategy language by completing the implementation strategy code and dissemination strategy code columns. Then, a second researcher (LB, WC, HL, or GMM) listened to interview recordings and made revisions or additions to the interview notes to ensure agreement and finalize the data. Next, two researchers (LB and WC) performed content analysis [67, 69] by collapsing and sorting these “codes” to condense similarly worded implementation strategies (e.g., “train implementers” and “provide training to implementers”) and to categorize dissemination strategies by recommended sources or channels. One researcher (LB) reviewed the collapsed list of strategies and wrote definitions based on the noted descriptions and another (BH) reviewed strategy names and definitions for clarity and suggested revisions, leading to a final list.

To determine when data saturation was reached, the research team adapted the “new information threshold” method, which assesses saturation during analysis by calculating the point at which little to no new information is acquired from additional data [70]. This is accomplished by calculating the amount of information present in a set of initial data (the “base”) and in sets of subsequently acquired data (each a “run”). The new information threshold is calculated by dividing the amount of new information in each run by the amount of information in the base. The first step in this process is to prospectively determine the base size, run length, and new information threshold. Base sizes of four to six data collection events are recommended, as most information is generated early in the qualitative research process, and run lengths of two to three events are recommended. For this study, the research team selected a higher base size (six interviews: three researchers and three practitioners) and run length (three interviews) because the population was diverse in terms of roles and settings. The new information threshold was set at ≤ 5%. The number of new implementation strategies identified in each run was divided by the number of implementation strategies present in the base until the ratio was ≤ 5%. Saturation was reached after 15 interviews, and the remaining scheduled interviews were completed.
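The new-information-threshold calculation described above can be illustrated with a short sketch. The strategy names and counts here are hypothetical, chosen only to show the arithmetic; they are not the study’s data.

```python
def new_information_ratio(base_strategies, run_strategies):
    """Proportion of strategies in a run that are new relative to the base."""
    base = set(base_strategies)
    new = set(run_strategies) - base  # strategies not already in the base
    return len(new) / len(base)

# Hypothetical example: 20 unique strategies identified in the 6-interview
# base, and one new strategy surfacing in a 3-interview run.
base = {f"strategy_{i}" for i in range(20)}       # strategy_0 .. strategy_19
run = {"strategy_3", "strategy_7", "strategy_20"}  # strategy_20 is new

ratio = new_information_ratio(base, run)
saturated = ratio <= 0.05  # threshold of <= 5%, as in the study
print(ratio, saturated)    # 0.05 True
```

In this hypothetical run, one new strategy against a base of 20 gives a ratio of exactly 5%, so the threshold is met and saturation would be declared.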

Next, additional steps were taken to compare the resulting ISAC compilation strategies with ERIC implementation strategies to determine similarities. One researcher (LB) reviewed the final list of ISAC implementation strategies, searched for ISAC implementation strategy terms and synonyms in the ERIC compilation (strategy names and definitions), and reviewed all remaining ERIC strategies. ERIC strategies with similar content (but different language) were linked to the most relevant ISAC strategy. A second researcher (BH) reviewed both compilations and the list of similar strategies, noted areas of disagreement, and reconciled through discussion and agreement with LB.

Finally, data were analyzed to develop two guidance tools for selecting implementation strategies. Through the CCIS Action Group, researchers and practitioners expressed a preference for tools based on two commonly used frameworks: RE-AIM [62] and the Consolidated Framework for Implementation Research (CFIR) [35]. The pragmatic RE-AIM guidance tool was designed to indicate which RE-AIM dimension (implementation outcome) each implementation strategy was used to address. It was developed by calculating the frequency of RE-AIM dimensions mentioned for each final implementation strategy. RE-AIM dimensions were indicated in the tool as relevant to each implementation strategy if they were mentioned at least two times, or mentioned once for implementation strategies with a frequency of ≤ 3 mentions.

To ensure broad applicability across determinant frameworks, the second tool was developed as a guidance tool aligning with CFIR and other determinant frameworks (e.g., Practical, Robust Implementation and Sustainability Model [71]; Integrated-Promoting Action on Research Implementation in Health Services [72, 73]; and Exploration, Preparation, Implementation, Sustainment [74]). It was designed to provide guidance on which determinant each implementation strategy was used to overcome. Additional coding was completed to assess which CFIR domain (individuals, innovation, inner setting, outer setting, and implementation process) described each barrier resolved by an implementation strategy. A research assistant with experience supporting implementation science investigations in community settings reviewed the final list and applied codes. A second researcher (WC) reviewed these codes and indicated inconsistencies; the lead researcher (LB) reviewed and reconciled. The guidance tool was developed by calculating the frequency of determinant framework domains that were resolved by each final implementation strategy. CFIR domains were indicated in the tool as relevant to each implementation strategy if they were mentioned at least two times, or mentioned once for implementation strategies with a frequency of ≤ 3 mentions.
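Assuming the inclusion rule is the same for both guidance tools (a dimension or domain is flagged if mentioned at least twice, or mentioned once for strategies with ≤ 3 total mentions), it can be sketched as a simple predicate; the function name and example values below are illustrative, not taken from the study’s tables.

```python
def include_in_tool(domain_mentions: int, total_strategy_mentions: int) -> bool:
    """Flag a RE-AIM dimension or CFIR domain as relevant to a strategy.

    Included if mentioned at least twice, or mentioned once when the
    strategy itself was mentioned <= 3 times across all interviews.
    """
    if domain_mentions >= 2:
        return True
    return domain_mentions == 1 and total_strategy_mentions <= 3

# Illustrative checks of the rule's three cases.
print(include_in_tool(2, 31))  # True: two mentions for a common strategy
print(include_in_tool(1, 3))   # True: one mention of a rarely cited strategy
print(include_in_tool(1, 10))  # False: one mention is too weak a signal here
```

The second branch keeps rarely mentioned strategies from being excluded from the tools solely because they had fewer opportunities to be linked to a dimension or domain.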

Results

Interviewee and EBI characteristics

Ten researchers (56%) and eight practitioners (44%) across all seven targeted community settings and three intervention outcomes completed interviews (n = 18). Researchers had on average 9.4 (SD ± 6.40) years of experience, and half rated themselves as intermediate-level implementation scientists (n = 5, 50%). Practitioners had 15.8 (SD ± 5.97) years of experience on average. Based on information from organizational websites, participants were located in all United States census regions, with most (n = 6 researchers, 60%, and n = 4 practitioners, 50%) in the southern region.

On the screening survey question prompting respondents to list up to three EBIs delivered, researchers listed an average of 2.0 (SD ± 0.83) and practitioners listed an average of 2.8 (SD ± 0.66). All interviewees, including one practitioner interviewee who was suggested by a researcher colleague, listed unique EBIs which were subsequently discussed in interviews. Practitioners reported delivering EBIs across both levels (individual or interpersonal; policy, systems, or environmental levels) more frequently than researchers; both researchers and practitioners often focused on multiple public health outcomes. Half (n = 9, 50%) of the interviewees responded that they conduct community-engaged research in (researchers) or work in (practitioners) multiple settings, for a total of 19 research and 35 practice settings. See Table 1 for characteristics of interviewees and EBIs.

Table 1 Researcher and practitioner interviewees’ demographic and evidence-based intervention characteristics

Implementation strategies applied in communities

Across all interviews, implementation strategies were mentioned a total of 432 times, with an average of 24 mentions per interviewee. Following the rapid deductive analysis approach, a total of 40 unique ISAC strategies remained (Table 2). ISAC strategies reported the most across interviewees were conduct pragmatic evaluation (n = 31), provide training (n = 26), change adaptable program components (n = 26), leverage funding sources (n = 21), structure grant requirements (n = 19), and tailor recruitment strategies (n = 19). See Table 2 for the full list of implementation strategies, definitions, and examples. Twenty-four ISAC strategies (60%) were similar to 37 of the 73 ERIC implementation strategies (51%); see Table 3.

Table 2 Implementation Strategies Applied in Communities (ISAC), with definitions and examples
Table 3 Comparison of implementation strategies in the ISAC compilation to the ERIC compilation

Of the 40 ISAC strategies, most (n = 26, 65%) were used to overcome barriers related to multiple RE-AIM dimensions, including implementation (n = 29, 73%), adoption (n = 27, 67%), reach (n = 14, 35%), maintenance (n = 13, 33%), and effectiveness (n = 7, 18%) (Table 4). Implementation barriers were commonly described in relation to achieving fidelity (n = 22), with fewer focused on cost (n = 7). As few ISAC strategies were described regarding assessing participants’ long-term outcomes (i.e., the individual-level maintenance dimension), individual-level and organizational-level maintenance barriers were reported together. See Table 4.

Table 4 Implementation outcomes (RE-AIM dimensions) influenced by each ISAC strategy

Most of the ISAC strategies (n = 28, 70%) were used to overcome barriers related to multiple CFIR domains. Barriers were classified as occurring at the level of the outer setting (n = 27, 68%), inner setting (n = 21, 53%), implementation process (n = 21, 53%), innovation (n = 12, 30%), and individuals (n = 11, 28%). See Table 5 for barriers overcome by determinant level.

Table 5 Implementation determinants influenced by each ISAC strategy

ISAC dissemination strategies

Across practitioner interviews, preferred strategies for disseminating ISAC strategies were mentioned a total of 53 times, with an average of seven mentions per interviewee. After collapsing and sorting similar dissemination strategies (e.g., names of specific organizations were collapsed into “Local public health organization” or “National public health organization”), there were 20 unique dissemination source (n = 7) and channel (n = 13) strategies. National public health organizations in the United States, such as professional associations, accreditors, and non-profit organizations (n = 13), and peer learning networks (n = 3) were the most mentioned sources for disseminating ISAC strategies. Webinars (n = 6), listservs/newsletters (n = 5), and conferences (n = 3) were the most mentioned dissemination channels.

Discussion

An emic approach to understanding implementation strategies used by researchers and practitioners in community settings was used to develop ISAC. Given the unique challenges and contextual differences between clinical and community settings, an understanding of common strategies used by community researchers and practitioners was needed [12]. One question in implementation science is how many compilations of implementation strategies are necessary – whether one broad framework should apply to all settings (and be tailored for specific projects) or whether multiple setting-specific compilations are needed. We agree that a proliferation of theories, frameworks, and models perpetuates challenges, including duplicated efforts and slowed progress. However, we argue that settings with shared language and similar barriers need compilations with familiar terminology and setting-specific implementation strategies. As community-based implementation scientists, we attempted to use ERIC implementation strategies with community partners, but the terminology and strategies did not resonate. Our previous work providing simple terminology changes [12] proved inefficient, leading to the conceptualization of the ISAC compilation.

This study uncovered unique implementation strategies (n = 16) not included in the ERIC compilation, especially related to developing pragmatic, adaptable interventions and evaluation methods (“Design pragmatic programs”, “Develop adaptable programs”, “Reassess inclusion criteria”, “Conduct pragmatic evaluation”) and using engaged processes to meet the needs of community partners (“Engage potential partners”, “Build partner relationships”, “Meet community partners’ needs”). Further, 36 ERIC strategies were not reflected in the ISAC compilation, reinforcing the utility of a separate implementation strategy compilation specific to community settings, while emphasizing some overlap between how EBI adoption, implementation, and maintenance can be supported in community or clinical settings. Given this, we recommend researchers choose a compilation of best fit, continue to cite ERIC as seminal work on implementation strategies, and integrate both ISAC (or other compilations [27, 28]) and ERIC as appropriate. For example, the Food is Medicine (FIM) movement in the United States is a public health and clinical approach to improving food security, nutrition, and health among Americans with low income that occurs between community settings (e.g., food banks and other community-based organizations) and healthcare spaces [75]. Future work that integrates ERIC and ISAC may lead to suitable implementation strategies to advance FIM innovations that are fit to context.

In addition to selecting implementation strategies that fit context, selecting appropriate strategies relevant to implementation determinants is also needed. The results of this study reveal that most of the barriers interviewees addressed through implementation strategies were at the outer or inner setting level. However, previously reported common implementation strategies in community settings are training, educational materials, and outreach visits [14], which are designed to overcome barriers at the individual level (e.g., knowledge, skills) versus the inner or outer setting level. This indicates that commonly used implementation strategies reported in the literature may not be appropriate to address practice barriers. The ISAC compilation and strategy selection process could lead to greater use of relevant implementation strategies to address common barriers.

Given the established need for pragmatic implementation strategy selection guidance, we developed a four-step process informed by Leeman et al.’s streamlined approach [76], recommendations for accelerating the contextual inquiry process [77, 78], and our experiences selecting implementation strategies in community settings [44]. First, as recommended in Leeman et al.’s approach and other recent guidance on shortening the contextual inquiry process [76,77,78], we recommend reviewing available information on EBI integration, including both peer-reviewed (e.g., systematic reviews) and grey literature (e.g., practice reports) to identify multi-context implementation barriers and facilitators that have previously been captured. Then, a decision can be made on whether further contextual inquiry is needed to determine barriers and facilitators to EBI integration in the unique implementation setting. If further contextual inquiry is deemed necessary (e.g., there is minimal supporting evidence) to better understand implementation determinants, this could be done through formal research (e.g., using CFIR or other determinant frameworks [35, 71,72,73,74]). Alternatively, the process could be more pragmatic; for example, a team could complete a “premortem” session to uncover potential implementation outcomes (RE-AIM dimensions) to be addressed by implementation strategies [79, 80].

Second, respecting practitioners’ preference for a strengths-based approach that includes both overcoming barriers and capitalizing on facilitators, we recommend working with partners to review practice materials and determine which implementation strategies are already in place – recognizing that they may not be referred to as implementation strategies (e.g., resources and supports) [81]. Third, in alignment with the contextual inquiry approach in step 1, we recommend using the guidance tools (Tables 4 and 5) to select implementation strategies either by implementation outcomes (RE-AIM dimensions) or by determinant framework levels. For example, a team may agree that strategies are primarily needed to influence adoption when introducing a new EBI. The team could use Table 4 to review strategies used to improve adoption, tailor potential strategies already in place, and come to consensus on additional strategies needed to integrate the EBI. As another example, a review of the literature or contextual inquiry may identify that barriers to implementing built environment approaches to physical activity are primarily at the levels of the inner setting (e.g., relative priority, available resources) and the innovation (e.g., cost, complexity) [34, 82, 83]. Using a similar process, teams could review the strategies indicated in Table 5 to alleviate inner setting and innovation barriers and select relevant strategies.

Fourth, we recommend that teams tailor the selected strategies with health equity considerations in mind. Ideally, strategies should be selected at multiple levels to avoid focusing only on improving individual skills or knowledge without considering existing systems and structures. This includes considering the implementation strategies’ costs and the organization’s available resources, history, and other contextual factors [15, 20]. Next, the strategies should be carefully tailored based on cultural considerations and partner input to promote equity and avoid exacerbating disparities (e.g., by considering whether a strategy will be sufficient to improve adoption or implementation in lower-resourced settings) [15, 20]. Finally, to promote “evolvability” (i.e., adaptations leading to sustainability and equitable impact) of EBIs and implementation strategies over time [20], teams are encouraged to consider from the outset how they will track modifications, such as through the Framework for Reporting Adaptations and Modifications-Expanded (FRAME) and the Framework for Reporting Adaptations and Modifications to Evidence-based Implementation Strategies (FRAME-IS) [84, 85].

The strategy selection process and ISAC compilation can be used by community setting researchers and practitioners to develop the evidence base on tailoring and testing ISAC strategies to determine what works to aid EBI adoption, implementation, and maintenance, for whom, and under what conditions. For this to occur, ISAC first needs to be disseminated to both researchers and practitioners. The results of this study inform how to disseminate ISAC to practitioners beyond peer-reviewed journal articles intended for research audiences. Trusted public health organizations will play an important role in sharing the compilation through webinars, listservs, and newsletters. Overall, these results align with similar work on practitioners’ dissemination preferences, although in prior investigations practitioners preferred email and listserv communications [86,87,88,89,90]; the prominence of webinars in this study may reflect increased virtual communication during and after the COVID-19 pandemic. Researchers generally value dissemination to practice audiences but are often unsure where to begin [89, 90]. The results presented here suggest where to best place effort to reach practice audiences. Future research should test dissemination through specific sources and channels to determine how best to increase adoption of ISAC and associated tools by practitioner audiences.

Moving forward, the practical application of ISAC in research and practice settings should build on a few important considerations. First, as highlighted by this study, many researchers and practitioners already use implementation strategies to help overcome EBI barriers at multiple levels. However, gaps have existed in terminology and in alignment with implementation science theories, models, and frameworks. ISAC may help to minimize this gap by providing standard terminology for implementation strategy application and RE-AIM outcomes that could eventually be compared across community settings and EBIs. Documenting strategies that are already in use may also help researchers and practitioners capture and report available assets or facilitators and advocate for funds to focus on overcoming barriers with more complex implementation strategies. Efforts to support community implementation researchers and practitioners in naming, defining, specifying (e.g., the actor, action, target, timing, frequency, implementation outcome, and justification of strategies), tracking, and evaluating ISAC strategies over time in response to EBI barriers are needed to move the state of the science forward [9, 91].

As one final consideration, implementation outcomes are typically the organizational-level RE-AIM dimensions: adoption, implementation, and maintenance. Reach has also been operationalized as an implementation outcome, as there are implementation (and dissemination) strategies that researchers or practitioners can use to better reach the priority population [92, 93]. Effectiveness is not typically included as an outcome of implementation strategies (the “stuff” we do to help people and places do the “thing”) [94]. However, we included effectiveness in this study and the subsequent guidance tools because of the challenges associated with assessing effectiveness in real-world settings on an ongoing basis, especially as programs are maintained long term with less academic or researcher control [20, 95]. In this vein, effectiveness may more accurately be considered a proximal implementation outcome [8]. For example, improved program evaluation may lead to program maintenance through the documentation and sharing of results with funders and other invested parties [96, 97].

Limitations

An expert group active in the community implementation science field used a robust approach to identify the implementation strategies that community researchers and practitioners use to overcome EBI barriers. However, this work is not without limitations. First, some interviewees had previous relationships with the research team; because the interview topic was not overly sensitive, this familiarity may have aided the conversations regarding ISAC. In addition, given the rapid analysis method used, interview findings were not returned to participants for feedback, a step that might have strengthened interpretation.

Next, the qualitative approach limited the number of participants who provided input on ISAC (e.g., compared to the 71 expert researchers who participated in the Delphi study to develop the ERIC compilation) [10]. We contend that, while our sample was smaller, our methods were sufficient to develop a comprehensive compilation, and we were able to gather richer, more nuanced information than can be gleaned from surveys. Strengths and measures for achieving trustworthiness and transferability [67] include data collection from both researchers and practitioners, the RE-AIM-based interview guide that prompted for barriers and the strategies used to overcome them, and the assessment of saturation. Nonetheless, compiling the list of ISAC strategies and identifying the associated implementation determinants and outcomes was an initial step toward advancing the application and evaluation of implementation strategies in community settings. The initial ISAC compilation will likely be refined and updated over time, as other compilations have been, as more evidence is generated on strategy use across diverse settings (e.g., geographical regions, urban and rural areas) and EBIs [10, 98].

Lastly, for the ISAC guidance tools, we opted to use broad domains instead of specific constructs (e.g., “inner setting” instead of “communications” or “culture”). As other scholars have found [49], it is difficult to identify links between specific determinants and specific implementation strategies, and single strategies are often linked to multiple barriers (or multiple strategies are needed to address a single barrier) [8]. Moreover, when frameworks or guidance tools become too granular, they no longer apply across diverse settings [99]. Thus, using broad domains instead of specific constructs may be more useful for the barrier-strategy matching process.

Conclusions

In summary, future work is needed to broadly disseminate the ISAC compilation so that community researchers and practitioners have a wider range of relevant strategies available to them. Our hope is that others will use the ISAC compilation and label strategies by name to advance generalizable implementation science knowledge. Additional efforts to aid community researchers and practitioners in documenting the necessary components of ISAC strategy application may also be needed to build the evidence base on which ISAC strategies work, in which contexts, and for which implementation researchers and practitioners.

Data availability

The datasets analyzed during the current study are available from the corresponding author on reasonable request.

Abbreviations

CCIS: Consortium for Cancer Implementation Science

CFIR: Consolidated Framework for Implementation Research

EBI: Evidence-based intervention

ERIC: Expert Recommendations for Implementing Change

FIM: Food is Medicine

ISAC: Implementation Strategies Applied in Communities

RE-AIM: Reach, Effectiveness, Adoption, Implementation, Maintenance

References

  1. Balas E, Boren S. Managing clinical knowledge for health care improvement. In: Bemmel J, McCray A, editors. Yearb Med inform 2000 patient-centered Syst. Stuttgart, Germany: Schattauer Verlagsgesellschaft mbH; 2000. pp. 65–70.

    Google Scholar 

  2. Morris ZS, Wooding S, Grant J. The answer is 17 years, what is the question: understanding time lags in translational research. J R Soc Med. 2011;104:510–20.

    Article  PubMed  PubMed Central  Google Scholar 

  3. Khan S, Chambers D, Neta G. Revisiting time to translation: implementation of evidence-based practices (EBPs) in cancer control. Cancer Causes Control. 2021;32:221–30.

    Article  PubMed  Google Scholar 

  4. Brownson R, Jones E. Bridging the gap: translating research into policy and practice. Prev Med. 2009;49:313–5.

    Article  PubMed  Google Scholar 

  5. Tabak R, Khoong EC, Chambers DA, Brownson R. Bridging Research and Practice: models for dissemination and implementation research. Am J Prev Med. 2012;43:337–50.

    Article  PubMed  PubMed Central  Google Scholar 

  6. Proctor E, Ramsey AT, Saldana L, Maddox TM, Chambers DA, Brownson RC. FAST: a Framework to assess speed of translation of Health innovations to Practice and Policy. Glob Implement Res Appl. 2022;2:107–19.

    Article  PubMed  PubMed Central  Google Scholar 

  7. Curran G, Bauer M, Mittman B, Pyne J, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50:207–26.

    Article  Google Scholar 

  8. Smith JD, Li DH, Rafferty MR. The implementation Research Logic Model: a method for planning, executing, reporting, and synthesizing implementation projects. Implement Sci. 2020;15:84.

    Article  PubMed  PubMed Central  Google Scholar 

  9. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, Measurement challenges, and Research Agenda. Adm Policy Ment Health. 2011;38:65–76.

    Article  PubMed  Google Scholar 

  10. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM et al. A refined compilation of implementation strategies: results from the Expert recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10.

  11. Bauer M, Damschroder L, Hagedorn H, Smith J, Kilbourne A. An introduction to implementation science for the non-specialist. BMC Psychol. 2015;3.

  12. Balis LE, Houghtaling B, Harden SM. Using implementation strategies in community settings: an introduction to the Expert recommendations for Implementing Change (ERIC) compilation and future directions. Transl Behav Med. 2022;12:965–78.

    Article  PubMed  Google Scholar 

  13. Mazzucca S, Arredondo EM, Hoelscher DM, Haire-Joshu D, Tabak RG, Kumanyika SK, et al. Expanding implementation research to Prevent Chronic diseases in Community settings. Annu Rev Public Health. 2021;42:135–58.

    Article  PubMed  PubMed Central  Google Scholar 

  14. Wolfenden L, Reilly K, Kingsland M, Grady A, Williams CM, Nathan N, et al. Identifying opportunities to develop the science of implementation for community-based non-communicable disease prevention: a review of implementation trials. Prev Med. 2019;118:279–85.

    Article  PubMed  Google Scholar 

  15. Brownson RC, Kumanyika SK, Kreuter MW, Haire-Joshu D. Implementation science should give higher priority to health equity. Implement Sci. 2021;16:28.

    Article  PubMed  PubMed Central  Google Scholar 

  16. Shelton RC, Hailemariam M, Iwelunmor J. Making the connection between health equity and sustainability. Front Public Health. 2023;11:1226175.

    Article  PubMed  PubMed Central  Google Scholar 

  17. Woodward EN, Singh RS, Ndebele-Ngwenya P, Melgar Castillo A, Dickson KS, Kirchner JE. A more practical guide to incorporating health equity domains in implementation determinant frameworks. Implement Sci Commun. 2021;2:61.

    Article  PubMed  PubMed Central  Google Scholar 

  18. Woodward EN, Matthieu MM, Uchendu US, Rogal S, Kirchner JE. The health equity implementation framework: proposal and preliminary study of hepatitis C virus treatment. Implement Sci. 2019;14:26.

    Article  PubMed  PubMed Central  Google Scholar 

  19. Shelton RC, Adsul P, Oh A, Moise N, Griffith DM. Application of an antiracism lens in the field of implementation science (IS): recommendations for reframing implementation research with a focus on justice and racial equity. Implement Res Pract. 2021;2:263348952110494.

    Google Scholar 

  20. Shelton RC, Chambers DA, Glasgow RE. An extension of RE-AIM to enhance sustainability: addressing dynamic context and promoting Health Equity over Time. Front Public Health. 2020;8:134.

    Article  PubMed  PubMed Central  Google Scholar 

  21. Glasgow RE. Use of the PRISM- RE-AIM Framework to Address Health Equity [Internet]. 2022 [cited 2023 Sep 11]. https://re-aim.org/resources-and-tools/figures-and-tables/

  22. Lorenc T, Petticrew M, Welch V, Tugwell P. What types of interventions generate inequalities? Evidence from systematic reviews. J Epidemiol Community Health. 2013;67:190–3.

    Article  PubMed  Google Scholar 

  23. Hall M, Graffunder C, Metzler M. Policy approaches to advancing Health Equity. J Public Health Manag Pract. 2016;22:S50–9.

    Article  PubMed  Google Scholar 

  24. Shelton RC, Brownson RC. Enhancing impact: a call to action for Equitable implementation science. Prev Sci. 2023. https://doiorg.publicaciones.saludcastillayleon.es/10.1007/s11121-023-01589-z.

    Article  PubMed  PubMed Central  Google Scholar 

  25. Rabin B, Brownson R, Haire-Joshu D, Kreuter M, Weaver N. A glossary for dissemination and Implementation Research in Health. J Public Health Manag Pract. 2008;14:117–23.

    Article  PubMed  Google Scholar 

  26. Effective Practice And Organisation Of Care (EPOC). EPOC Taxonomy [Internet]. Zenodo. 2021 [cited 2022 Dec 27]. https://zenodo.org/record/5105850

  27. Leeman J, Birken SA, Powell BJ, Rohweder C, Shea CM. Beyond implementation strategies: classifying the full range of strategies used in implementation science and practice. Implement Sci. 2017;12:125.

    Article  PubMed  PubMed Central  Google Scholar 

  28. Cook CR, Lyon AR, Locke J, Waltz T, Powell BJ. Adapting a compilation of implementation strategies to Advance School-based implementation research and practice. Prev Sci. 2019;20:914–35.

    Article  PubMed  PubMed Central  Google Scholar 

  29. Pronk N, Kleinman DV, Goekler SF, Ochiai E, Blakey C, Brewer KH. Promoting Health and Well-being in healthy people 2030. J Public Health Manag Pract. 2021;27:S242.

    Article  PubMed  Google Scholar 

  30. McLeroy K, Bibeau D, Steckler A, Glanz K. An ecological perspective on health promotion programs. Health Educ Q. 1988;15:361–77.

    Article  Google Scholar 

  31. Rabin BA, Glasgow RE, Kerner JF, Klump MP, Brownson RC. Dissemination and implementation research on Community-Based Cancer Prevention. Am J Prev Med. 2010;38:443–56.

    Article  PubMed  Google Scholar 

  32. Frieden T. A Framework for Public Health: the Health Impact pyramid. Am J Public Health. 2010;100:590–5.

    Article  PubMed  PubMed Central  Google Scholar 

  33. Golden SD, Earp JAL. Social Ecological approaches to individuals and their contexts: Twenty Years of Health Education & Behavior Health Promotion Interventions. Health Educ Behav. 2012;39:364–72.

    Article  PubMed  Google Scholar 

  34. Houghtaling B, Balis L, Pradhananga N, Cater M, Holston D. Healthy eating and active living policy, systems, and environmental changes in rural Louisiana: a contextual inquiry to inform implementation strategies. Int J Behav Nutr Phys Act. 2023;20:132.

    Article  PubMed  PubMed Central  Google Scholar 

  35. Damschroder LJ, Reardon CM, Widerquist MAO, Lowery J. The updated Consolidated Framework for Implementation Research based on user feedback. Implement Sci. 2022;17:75.

    Article  PubMed  PubMed Central  Google Scholar 

  36. Balis L, Harden S. Scaling out a 4-H healthy Meeting Initiative: challenges in implementation and comprehensive evaluation. J Nutr Educ Behav. 2019;51:1020–4.

    Article  PubMed  Google Scholar 

  37. Swindle T, Johnson SL, Davenport K, Whiteside-Mansell L, Thirunavukarasu T, Sadasavin G, et al. Closing the gap: a mixed-methods exploration of barriers and facilitators to Evidence-Based Practices for Obesity Prevention in Head Start. J Nutr Educ Behav. 2019;51:1067–e10791.

    Article  PubMed  PubMed Central  Google Scholar 

  38. Houghtaling B, Misyak S, Serrano E, Dombrowski RD, Holston D, Singleton CR, et al. Using the Exploration, Preparation, implementation, and Sustainment (EPIS) Framework to Advance the Science and Practice of Healthy Food Retail. J Nutr Educ Behav. 2023;55:245–51.

    Article  PubMed  Google Scholar 

  39. Balis LE, Harden SM. Replanning a Statewide walking Program through the iterative use of the Reach, effectiveness, adoption, implementation, and maintenance Framework. J Phys Act Health. 2021;18:1310–7.

    Article  PubMed  Google Scholar 

  40. Balis L, Strayer IIIT. Evaluating take the stairs, Wyoming! Through the RE-AIM Framework: challenges and opportunities. Front Public Health. 2019;7:368.

    Article  PubMed  PubMed Central  Google Scholar 

  41. Balis LE, Strayer TE III, Ramalingam N, Harden SM. Beginning with the end in mind: contextual considerations for scaling-out a community-based intervention. Front Public Health. 2018;6:1–14.

    Article  Google Scholar 

  42. Harden SM, Balis LE, Armbruster S, Estabrooks PA. A natural experiment to determine if FitEx works: impact of a statewide walking program. Transl Behav Med. 2024;14:98–105.

    Article  PubMed  Google Scholar 

  43. Fixsen D, Naoom S, Blase K, Friedman R, Wallace F, Implementation Research. A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network; Report No.: FMHI Publication #231.

  44. Balis LE, Houghtaling B. Matching barriers and facilitators to implementation strategies: recommendations for community settings. Implement Sci Commun. 2023;4:144.

    Article  PubMed  PubMed Central  Google Scholar 

  45. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res. 2017;44:177–94.

    Article  PubMed  PubMed Central  Google Scholar 

  46. Fernandez ME, ten Hoor GA, van Lieshout S, Rodriguez SA, Beidas RS, Parcel G, et al. Implementation mapping: using intervention mapping to develop implementation strategies. Front Public Health. 2019;7:158.

    Article  PubMed  PubMed Central  Google Scholar 

  47. Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, et al. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front Public Health. 2018;6:136.

    Article  PubMed  PubMed Central  Google Scholar 

  48. Beidas RS, Dorsey S, Lewis CC, Lyon AR, Powell BJ, Purtle J, et al. Promises and pitfalls in implementation science from the perspective of US-based researchers: learning from a pre-mortem. Implement Sci. 2022;17:55.

    Article  PubMed  PubMed Central  Google Scholar 

  49. Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14:42.

    Article  PubMed  PubMed Central  Google Scholar 

  50. Brownson R, Eyler A, Harris J, Moore J, Tabak R. Getting the Word Out: New approaches for disseminating Public Health Science. J Public Health Manag Pract. 2018;24:102.

    Article  PubMed  PubMed Central  Google Scholar 

  51. Keddem S, Barg FK, Frasso R. Practical Guidance for studies using freelisting interviews. Prev Chronic Dis. 2021;18:200355.

    Article  Google Scholar 

  52. Scarduzio JA. Emic Approach to Qualitative Research. The International Encyclopedia of Communication Research Methods. John Wiley & Sons, Ltd; 2017. pp. 1–2.

  53. Polit DF, Beck CT. Generalization in quantitative and qualitative research: myths and strategies. Int J Nurs Stud. 2010;47:1451–8.

    Article  PubMed  Google Scholar 

  54. Parker C, Scott S, Geddes A. Snowball Sampling. SAGE Res Methods Found. 2019. http://dx.doi.org/10.4135/.

  55. National Cancer Institute, Division of Cancer Control and Population Sciences (DCCPS). Consortium for Cancer Implementation Science (CCIS) [Internet]. 2023 [cited 2024 Jan 3]. https://cancercontrol.cancer.gov/is/initiatives/ccis

  56. Oh AY, Emmons KM, Brownson RC, Glasgow RE, Foley KL, Lewis CC, et al. Speeding implementation in cancer: the National Cancer Institute’s implementation Science centers in Cancer Control. J Natl Cancer Inst. 2023;115:131–8.

    Article  PubMed  Google Scholar 

  57. Islami F, Goding Sauer A, Miller KD, Siegel RL, Fedewa SA, Jacobs EJ, et al. Proportion and number of cancer cases and deaths attributable to potentially modifiable risk factors in the United States. CA Cancer J Clin. 2018;68:31–54.

    Article  PubMed  Google Scholar 

  58. Dietz WH, Douglas CE, Brownson RC. Chronic Disease Prevention: Tobacco Avoidance, Physical Activity, and Nutrition for a healthy start. JAMA. 2016;316:1645–6.

    Article  PubMed  Google Scholar 

  59. Arem H, Loftfield E. Cancer Epidemiology: a survey of modifiable risk factors for Prevention and Survivorship. Am J Lifestyle Med. 2017;12:200–10.

    Article  PubMed  PubMed Central  Google Scholar 

  60. Glanz K, Rimer B, Lewis FM. Health Behavior and Health Education. 3rd Edition. San Francisco: Jossey-Bass; 2002.

  61. Padek M, Colditz G, Dobbins M, Koscielniak N, Proctor EK, Sales AE, et al. Developing educational competencies for dissemination and implementation research training programs: an exploratory analysis using card sorts. Implement Sci. 2015;10:114.

    Article  PubMed  PubMed Central  Google Scholar 

  62. Glasgow R, Harden S, Gaglio B, Rabin B, Smith M, Porter G et al. RE-AIM planning and evaluation Framework: adapting to New Science and Practice with a 20-Year review. Front Public Health. 2019;7.

  63. Rogers E. Diffusion of Innovations, 4th Edition. New York, NY: Free Press; 2010.

  64. Gale R, Wu J, Erhardt T, Bounthavong M, Reardon C, Damschroder L et al. Comparison of rapid vs in-depth qualitative analytic methods from a process evaluation of academic detailing in the Veterans Health Administration. Implement Sci. 2019;14.

  65. Finley E, Hamilton A, Kowalski C, Midboe A, Nevedal A, Young J. Rapid Qualitative Methods in Implementation Science: Techniques and Considerations. 15th Annual Conference on the Science of Dissemination and Implementation in Health, Washington, DC; 2022.

  66. Hamilton A. Qualitative Methods in Rapid Turn-around Health Services Research. VA HSR&D Cyberseminar Spotlight on Women’s Health; 2013.

  67. Graneheim U, Lundman B. Qualitative content analysis in nursing research: concepts, procedures and measures to achieve trustworthiness. Nurse Educ Today. 2004;24:105–12.

    Article  CAS  PubMed  Google Scholar 

  68. Cutcliffe J, McKenna H. Expert qualitative researchers and the use of audit trails. Methodol Issues Nurs Res. 2004;45:126–35.

    Google Scholar 

  69. Elo S, Kyngäs H. The qualitative content analysis process. J Adv Nurs. 2008;62:107–15.

    Article  PubMed  Google Scholar 

  70. Guest G, Namey E, Chen M. A simple method to assess and report thematic saturation in qualitative research. Soundy A, editor. PLOS ONE. 2020;15:e0232076.

  71. Feldstein AC, Glasgow RE, Practical A. Robust implementation and sustainability model (PRISM) for integrating Research findings into Practice. Jt Comm J Qual Patient Saf. 2008;34:228–43.

    PubMed  Google Scholar 

  72. Harvey G, Kitson A. Implementing evidence-based practice in Healthcare: a Facilitation Guide. United Kingdom: Taylor & Francis; 2015.

    Book  Google Scholar 

  73. Ritchie MJ, Drummond KL, Smith BN, Sullivan JL, Landes SJ. Development of a qualitative data analysis codebook informed by the i-PARIHS framework. Implement Sci Commun. 2022;3:98.

    Article  PubMed  PubMed Central  Google Scholar 

  74. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in Public Service Sectors. Adm Policy Ment Health Ment Health Serv Res. 2011;38:4–23.

    Article  Google Scholar 

  75. Houghtaling B, Short E, Shanks CB, Stotz SA, Yaroch A, Seligman H, et al. Implementation of Food Is Medicine Programs in Healthcare settings: a narrative review. J Gen Intern Med. 2024. https://doiorg.publicaciones.saludcastillayleon.es/10.1007/s11606-024-08768-w.

    Article  PubMed  PubMed Central  Google Scholar 

  76. Leeman J, Rohweder C, Lafata JE, Wangen M, Ferrari R, Shea CM, et al. A streamlined approach to classifying and tailoring implementation strategies: recommendations to speed the translation of research to practice. Implement Sci Commun. 2024;5:65.

    Article  PubMed  PubMed Central  Google Scholar 

  77. Davis M, Beidas RS. Refining contextual inquiry to maximize generalizability and accelerate the implementation process. Implement Res Pract. 2021;2:263348952199494.

    Google Scholar 

  78. Davis M, Siegel J, Becker-Haimes EM, Jager-Hyman S, Beidas RS, Young JF et al. Identifying common and Unique Barriers and facilitators to implementing evidence-based practices for suicide Prevention across Primary Care and Specialty Mental Health settings. Arch Suicide Res. 2021;1–23.

  79. Gilmartin H, Lawrence E, Leonard C, McCreight M, Kelley L, Lippmann B, et al. Brainwriting Premortem: a Novel Focus Group Method To Engage Stakeholders and identify preimplementation barriers. J Nurs Care Qual. 2019;34:94–100.

    Article  PubMed  Google Scholar 

  80. Brow K, Pesce M, Ganjineh B, Armbruster S, Harden S. Informing a weight management intervention for endometrial cancer survivors: A RE-AIM-based brainwriting premortem. 14th Annual Conference on Science of Dissemination and Implementation, Washington, DC; 2021.

  81. Houghtaling B, Pradhananga N, Holston D, Cater M, Balis L. A mixed method evaluation of practitioners’ perspectives on implementation strategies for healthy eating and active living policy, Systems, and environmental changes. J Public Health Manag Pract: under review.

  82. Balis LE, Grocke-Dewey M. Built environment approaches: extension personnel’s preferences, barriers, and facilitators. Front Public Health. 2022;10:960949.

    Article  PubMed  PubMed Central  Google Scholar 

  83. Balis LE, Vincent J. Implementation strategies to support built Environment approaches in Community settings. Health Promot Pract. 2023;24:502–13.

    Article  PubMed  Google Scholar 

  84. Wiltsey Stirman S, Baumann A, Miller C. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. 2019;14.

  85. Miller CJ, Barnett ML, Baumann AA, Gutner CA, Wiltsey-Stirman S. The FRAME-IS: a framework for documenting modifications to implementation strategies in healthcare. Implement Sci. 2021;16:36.

    Article  PubMed  PubMed Central  Google Scholar 

  86. Strayer TE III, Kennedy L, Balis L, Ramalingam N, Wilson M, Harden S. Cooperative Extension gets moving, but how? Exploration of Extension Health Educators’ sources and channels for information-seeking practices. Am J Health Promot. 2020;34:198–205.

    Article  PubMed  Google Scholar 

  87. Strayer IIITE, Balis LE, Ramalingam NS, Harden SM. Dissemination in extension: health specialists’ Information Sources and Channels for Health Promotion Programming. Int J Environ Res Public Health. 2022;19:16673.

    Article  PubMed  PubMed Central  Google Scholar 

  88. Brownson RC, Ballew P, Brown KL, Elliott MB, Haire-Joshu D, Heath GW, et al. The Effect of disseminating evidence-based interventions that promote physical activity to Health departments. Am J Public Health. 2007;97:1900–7.

    Article  PubMed  PubMed Central  Google Scholar 

  89. Brownson RC, Jacobs JA, Tabak RG, Hoehner CM, Stamatakis KA. Designing for Dissemination among Public Health Researchers: findings from a National Survey in the United States. Am J Public Health. 2012;103:1693–9.

    Article  Google Scholar 

  90. Shato T, Kepper MM, McLoughlin GM, Tabak RG, Glasgow RE, Brownson RC. Designing for dissemination among public health and clinical practitioners in the USA. J Clin Transl Sci. 2024;8:e8.

    Article  PubMed  Google Scholar 

  91. Smith JD, Norton WE, Mitchell SA, Cronin C, Hassett MJ, Ridgeway JL, et al. The longitudinal implementation strategy Tracking System (LISTS): feasibility, usability, and pilot testing of a novel method. Implement Sci Commun. 2023;4:153.

    Article  PubMed  PubMed Central  Google Scholar 

  92. Estabrooks PA. An overview of dissemination and implementation science in physical activity and Health Promotion. Kinesiol Rev. 2023;12:4–18.

    Article  Google Scholar 

  93. Reilly KL, Kennedy S, Porter G, Estabrooks P. Comparing, contrasting, and integrating dissemination and implementation outcomes included in the RE-AIM and implementation outcomes frameworks. Front Public Health. 2020;8:430.

    Article  PubMed  PubMed Central  Google Scholar 

  94. Curran GM. Implementation science made too simple: a teaching tool. Implement Sci Commun. 2020;1:27.

    Article  PubMed  PubMed Central  Google Scholar 

  95. Harden S, Balis L, Wilson M, Strayer T. Assess, plan, do, evaluate, and report: iterative cycle to remove academic control of a community-based physical activity program. Prev Chronic Dis. 2021;18.

  96. Luke D, Calhoun A, Robichaux C, Elliot M, Moreland-Russell S. The Program Sustainability Assessment Tool: a New Instrument for Public Health Programs. Prev Chronic Dis. 2014;11.

  97. Balis L, Palmer S, Isack M, Yaroch A. Building evaluation and dissemination capacity: An assessment of contextual factors to design technical assistance. 16th Annual Conference on the Science of Dissemination and Implementation. Washington, DC, 2023.

  98. Nathan N, Powell BJ, Shelton RC, Laur CV, Wolfenden L, Hailemariam M, et al. Do the Expert Recommendations for Implementing Change (ERIC) strategies adequately address sustainment? Front Health Serv. 2022;2:905909.


  99. Ramanadhan S. Pragmatic qualitative analysis for implementation science [Internet]. Brown University; [cited 2024 Jan 12]. https://events.brown.edu/advance-ctr/event/267625-advance-ri-ctr-implementation-science-seminar


Acknowledgements

The authors would like to thank Nila Pradhahanga for her assistance with analysis.

Funding

This work was supported by the National Institutes of Health, National Cancer Institute, Consortium for Cancer Implementation Science, project 172721.0.009.01.001 (LB, BH, WC); National Heart, Lung, and Blood Institute at the National Institutes of Health projects K01 HL166957-01 (GMM) and K12HL138030 (HL). Funders did not play a role in the research.

Author information


Contributions

LB, BH, and SH conceptualized the study. LB acquired funding. LB developed the methodology with assistance from BH, WC, and HL. LB, WC, HL, MW, and EP collected data. LB, BH, WC, HL, and GMM analyzed data. LB and BH developed the original draft, and all authors contributed to reviewing and editing.

Corresponding author

Correspondence to Laura E. Balis.

Ethics declarations

Ethics approval and consent to participate

This research was deemed exempt by the University of Nebraska Medical Center IRB, #0257-23-EX.

Consent for publication

Not applicable.

Competing interests

Authors LB and BH are members of the Editorial Board of International Journal of Behavioral Nutrition and Physical Activity. LB and BH were not involved in the journal’s peer review process of, or decisions related to, this manuscript.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Balis, L.E., Houghtaling, B., Clausen, W. et al. Advancing implementation science in community settings: the implementation strategies applied in communities (ISAC) compilation. Int J Behav Nutr Phys Act 21, 132 (2024). https://doi.org/10.1186/s12966-024-01685-5


  • DOI: https://doi.org/10.1186/s12966-024-01685-5
