Medical Education  |   November 2020
Status of Entrustable Professional Activities (EPA) Implementation at Colleges of Osteopathic Medicine in the United States and Future Considerations
Author Notes
  • From the West Virginia School of Osteopathic Medicine in Lewisburg (Dr Linsenmeyer), the Des Moines University - College of Osteopathic Medicine in Iowa (Dr Wimsatt), American Association of Colleges of Osteopathic Medicine in Bethesda, Maryland (Dr Speicher), Rowan University School of Osteopathic Medicine in Stratford, New Jersey (Dr Basehore), and the Kirksville College of Osteopathic Medicine in Missouri (Dr Sexton). 
  • Financial Disclosures: None reported. 
  • Support: None reported. 
  • Address correspondence to Machelle Linsenmeyer, EdD, West Virginia School of Osteopathic Medicine, 400 Lee St N, Lewisburg, WV 24901-1274. Email: alinsenmeyer@osteo.wvsom.edu.
Article Information
The Journal of the American Osteopathic Association, November 2020, Vol. 120, 749-760. doi: https://doi.org/10.7556/jaoa.2020.129
Abstract

Context: Competency-based medical education, developmental milestones for residency training, and the single graduate medical education (GME) accreditation system have emerged over the last decade, necessitating new ways to adequately prepare graduates to meet new standards in proficiency, including the 13 Core Entrustable Professional Activities (EPA) for Entering Residency. The American Association of Colleges of Osteopathic Medicine (AACOM) Entrustable Professional Activities (EPA) steering committee has implemented an information-gathering process to provide suggestions for supporting a variety of EPA-related implementation efforts at colleges of osteopathic medicine (COMs) across the country.

Objective: To review the status of EPA implementation at COMs nationally.

Methods: An explanatory mixed-methods design was used to guide information gathering and synthesis of a 41-question survey and interview feedback; the overarching premise of this design was to use qualitative data to build upon initial quantitative findings. This survey was delivered electronically through a link emailed to liaisons at each main, branch, and satellite campus of the 47 schools within the AACOM institutional database. After survey administration, follow-up structured interviews were conducted according to an 18-question script with a purposive sample of 16 institutions with EPA implementation levels ranging from “moderate implementation with reporting” to “full implementation with reporting.” Post-interview, the interview notes were analyzed and results were aggregated for comparison with the original survey findings.

Results: Of the 47 schools surveyed, 42 responded (89.4%). To maintain uniformity in data coding and analysis, responses from 36 of the 47 COMs (76.6%) that submitted surveys independently were retained in the review. The majority of those respondents (23 of 36; 64%) rated their institution's knowledge of EPAs between “somewhat knowledgeable” and “expert,” but 23 (64%) also indicated “no confidence” or “somewhat confident” regarding EPA implementation. Postinterview results showed that schools were roughly evenly distributed across the “foundational implementation” (10; 28%), “slight implementation” (11; 31%), and “moderate implementation” (11; 31%) categories, with a few schools indicating “no implementation” (2; 5%) or “progressive implementation” (2; 5%).

Conclusion: The results of this review indicate that most osteopathic medical schools are at the early stages of EPA implementation, with emphasis varying by program year in terms of the specific EPAs addressed. Many schools appear engaged in curricular change efforts that will support the advancement of EPA use within their institutions. Faculty development was identified as a continued critical need for a majority of institutions.

Competency-based medical education, developmental milestones for residency training, and the single graduate medical education (GME) accreditation system have emerged during the last decade and bolstered the need for both osteopathic and allopathic medical schools to consider ways to adequately prepare graduates to meet new standards in competency, proficiency, and progression toward independence in clinical practice. The concept of entrustable professional activities (EPAs) emerged several years ago as a mechanism to facilitate the translation of competency-based frameworks into work-based tasks and support faculty decisions in granting autonomy to trainees at designated levels of supervision. As described by Olle ten Cate, who first conceptualized EPAs, they are 

units of professional practice, defined as tasks or responsibilities that trainees are entrusted to perform unsupervised once he or she has attained sufficient specific competency. EPAs are independently executable, observable, and measurable in their process and outcome, and therefore, suitable for entrustment decisions.1

 
EPAs are useful for training and assessment in daily practice.1 In response to these new and refined ideas, the Association of American Medical Colleges (AAMC) developed a set of 13 Core Entrustable Professional Activities for Entering Residency in 2014.2 That same year, the American Association of Colleges of Osteopathic Medicine (AACOM) Board of Deans approved an initiative charging its Society of Osteopathic Medical Educators (SOME) to further examine the 13 EPAs developed by the AAMC and to provide osteopathic medical consideration and perspective. Under this charge, SOME assembled a steering committee consisting of 13 members from colleges of osteopathic medicine (COMs), many of whom held leadership positions in a broad range of areas including curriculum, assessment, clinical education, and faculty development, along with a student representative of the Council of Osteopathic Student Government Presidents. This EPA steering committee worked closely with representative COM liaisons, appointed by their respective Deans, who provided important insight and feedback from all COMs to help guide the work of the committee and identify relevant items within the AAMC-identified activities requiring osteopathic consideration. As with the steering committee members, the liaisons held leadership positions (Senior Associate Dean, Associate Dean, Assistant Dean) in a broad range of areas, including curriculum, assessment, and clinical education. The initial work of the EPA steering committee with liaisons resulted in a guiding document for osteopathic medical schools titled “Osteopathic Considerations for Core Entrustable Professional Activities (EPAs) for Entering Residency.”3
In addition to assessing what is currently available on EPAs and identifying relevant osteopathic knowledge and skills to add to the existing AAMC EPAs, the EPA steering committee was also charged with developing documents and resources to help guide and support the work of introducing EPAs within each osteopathic school. As a continuation of AACOM's support of EPA development at osteopathic medical schools, the AACOM EPA steering committee established 4 subcommittees in August 2016 to oversee the development of shared resources in several key areas: (1) faculty development; (2) curriculum; (3) instructional resources; and (4) assessment planning. In November 2017, the AACOM EPA steering committee published “Entrustable Professional Activities for Entering Residency: Establishing Common Osteopathic Performance Standards in the Transition From Medical School to Residency.”4 This document grew out of work initiated by the 4 subcommittees. It also put forth several challenges that institutions could potentially encounter when implementing EPAs in undergraduate medical education (UME), such as accessing valid and reliable assessment tools, locating resources to conduct assessment over time and across a variety of contexts and settings, managing an increasing volume of clinical assessments, and developing tools to assess behaviors beyond traditional knowledge, skill, and attitudinal competencies (eg, conscientiousness, honesty, recognition of personal limitations). In April 2018, the EPA Assessment Planning Subcommittee published “Assessment Considerations for Core Entrustable Professional Activities for Entering Residency,”5 which included a review of existing assessment tools that could potentially support the development of EPA-specific assessment plans to match the unique context and needs of each institution.
The purpose of this article is to highlight research conducted by the AACOM EPA steering committee to assess the status of EPA implementation at COMs in the United States. An additional goal was to provide evidence-based suggestions for supporting EPA-related implementation efforts. 
Methods
To explore our hypothesis that EPA implementation is at a relatively early stage across the country, an explanatory mixed methods research design was selected to guide the study (Figure 1). The overarching premise of an explanatory design is that qualitative data can be used to explain or build upon initial quantitative findings.6 In this case, a survey instrument was designed and administered, then follow-up interviews were conducted with campus leaders to supplement the initial survey findings and to explore the types of resources regarded as most useful by the osteopathic medical education community. Final analyses involved the synthesis of quantitative and qualitative findings.
Figure 1.
This chart shows a conceptualization of the process taken in the Explanatory Design: Follow-up Explanations Model,6 where the initial design phase starts with the collection and analysis of quantitative data (QUAN), followed by the subsequent collection and analysis of qualitative data (QUAL) based on the results of the quantitative phase.
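To make the sequential flow just described concrete, the following minimal Python sketch illustrates the explanatory (QUAN → QUAL) sequence under stated assumptions: the records, field names, and category labels are hypothetical illustrations, not the actual AACOM survey instrument or dataset.

```python
# Hypothetical sketch of the explanatory (QUAN -> QUAL) sequence described
# above. All records, field names, and category labels here are invented
# for illustration; they are not the actual AACOM survey data.
from statistics import mean

# Phase 1 (QUAN): one record per responding campus.
surveys = [
    {"school": "COM-A", "knowledge": 70, "confidence": 40,
     "implementation": "moderate implementation with reporting"},
    {"school": "COM-B", "knowledge": 45, "confidence": 20,
     "implementation": "slight implementation"},
    {"school": "COM-C", "knowledge": 85, "confidence": 60,
     "implementation": "full implementation with reporting"},
]

# Descriptive statistics on the 0-100 rating scales.
print("mean knowledge:", mean(r["knowledge"] for r in surveys))
print("mean confidence:", mean(r["confidence"] for r in surveys))

# Phase 2 (QUAL): purposive sampling -- campuses self-reporting "moderate
# implementation with reporting" or higher are flagged for follow-up
# structured interviews, whose notes are then analyzed thematically.
INTERVIEW_LEVELS = {
    "moderate implementation with reporting",
    "progressive implementation with reporting",
    "full implementation",
    "full implementation with reporting",
}
to_interview = [r["school"] for r in surveys
                if r["implementation"] in INTERVIEW_LEVELS]
print("follow-up interviews:", to_interview)
```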
Initially, a team of AACOM EPA steering committee/subcommittee members assisted by AACOM staff created a survey (eAppendix) to determine the extent of EPA implementation across COMs. The survey included 41 questions focused on institutional knowledge of EPAs, confidence with EPA implementation, level of EPA implementation, curricular location for EPA-related teaching and assessment, methods of assessment and reporting, implementation challenges and successes, and the identification of needed resources that the AACOM EPA steering committee could potentially work to provide. The survey instrument featured EPA numbering and brief explanations as follows: 

EPA 1: Gather a history and perform a physical examination.
EPA 2: Prioritize a differential diagnosis following a clinical encounter.
EPA 3: Recommend and interpret common diagnostic and screening tests.
EPA 4: Enter and discuss orders and prescriptions.
EPA 5: Document a clinical encounter in the patient record.
EPA 6: Provide an oral presentation of a clinical encounter.
EPA 7: Form clinical questions and retrieve evidence to advance patient care.
EPA 8: Give or receive a patient handover to transition care responsibility.
EPA 9: Collaborate as a member of an interprofessional team.
EPA 10: Recognize a patient requiring urgent or emergent care and initiate evaluation and management.
EPA 11: Obtain informed consent for tests and/or procedures.
EPA 12: Perform general procedures of a physician. (These procedures include basic cardiopulmonary resuscitation [CPR]; bag and mask ventilation; venipuncture; inserting an intravenous line; and osteopathic manipulative medicine [OMM].)
EPA 13: Identify system failures and contribute to a culture of safety and improvement.
The survey was delivered electronically through a link emailed to liaisons at each main, branch, and satellite campus of the 47 schools within the AACOM institutional database. The second step involved descriptive statistics, which were used to analyze the survey data and document key findings for follow-up during structured interviews.
Next, a structured interview script was developed based on the results of the survey analyses, and interviews were conducted with a purposive sample of 16 institutions with EPA implementation levels ranging from “moderate implementation with reporting” to “full implementation with reporting.” The 18 structured interview questions fell into 4 distinct categories: faculty development (4 questions), curriculum (3 questions), assessment (8 questions), and web tools/resources (3 questions). One additional question addressed the institution's willingness to share resource materials. To conduct the interviews, a team of 4 EPA steering committee members (ML, LW, PB, PS) contacted 16 campus administrators using a combination of telephone and email communications. Postinterview, these same steering committee members thematically analyzed the qualitative findings using the interview notes, then aggregated the results for comparison with the original survey findings.
In the final step, each interviewer provided analysis of the initial survey items in light of the interview responses and reflected on the extent to which campuses were reporting on strict interpretation of the 13 core EPAs or an interpretation of the competencies underlying each of the 13 EPAs. Interviewers also noted differences between the survey and interview response patterns. 
Results
Of the 47 osteopathic medical schools surveyed, all but 5 responded (42; 89.4%). A preliminary review of the responses revealed inconsistencies in data reporting among schools, with some independently reporting by campus location and others submitting combined responses by main campus administration to represent both the main and branch/satellite locations. To maintain uniformity in data coding and analysis, 36 medical schools with independently submitted survey responses were retained in the review. In instances where schools submitted a combined response representing both main and branch/satellite locations, the branch/satellite campus responses were excluded and the main campus responses representing all campuses were kept. 
Four survey items provided the most comparable information on EPA-related implementation efforts across campuses: knowledge of EPAs, confidence with EPA implementation, level of EPA implementation, and EPA-related instruction within the curriculum. These items provided the basis for additional analysis and follow-up. 
When asked to rate their institution's overall knowledge of EPAs, the majority of respondents (23 of 36; 64%) placed their institution between “somewhat knowledgeable” and “expert” (from 50 to 100 on a scale where 0=no knowledge, 50=somewhat knowledgeable, and 100=expert; Figure 2). The orange dotted line in Figure 2 marks the division between institutions in the “somewhat knowledgeable” to “expert” range (above) and those in the “no knowledge” to “somewhat knowledgeable” range (below).
Figure 2.
Responses from 36 colleges of osteopathic medicine when asked, “Rate your institution's overall knowledge of EPAs.” Abbreviation: EPA, entrustable professional activity.
When asked to rate their institution's overall confidence with EPA implementation, the majority of institutions (23 of 36; 64%) indicated “no confidence” or “somewhat confident” with EPA implementation (Figure 3), again on a scale where 0=no confidence, 50=somewhat confident, and 100=expert. The orange dotted line in Figure 3 marks the division between institutions ranging from “somewhat confident” to “expert” (above) and those ranging from “somewhat confident” down to “no confidence” (below).
Figure 3.
Responses from 36 colleges of osteopathic medicine when asked, “Rate your institution's confidence level with EPA implementation.” Abbreviation: EPA, entrustable professional activity.
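As a brief illustration of how the counts above can be reproduced, the sketch below splits hypothetical 0-100 slider responses at the 50 midpoint, mirroring the division marked by the orange dotted line in Figures 2 and 3; the response values are invented, and how a rating of exactly 50 is grouped is an assumption rather than something the study states.

```python
# Hypothetical reproduction of the midpoint split shown in Figures 2 and 3.
# Responses are invented 0-100 slider values; the grouping of a rating of
# exactly 50 ("somewhat ...") is an assumption, not stated by the study.
responses = [72, 55, 48, 90, 35, 50, 61]

at_or_above = sum(1 for r in responses if r >= 50)   # "somewhat" to "expert"
below = len(responses) - at_or_above                 # below "somewhat"
print(f"{at_or_above} of {len(responses)} at or above the midpoint")
print(f"{below} of {len(responses)} below the midpoint")
```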
Institutions reported widely varying levels of EPA implementation that ultimately required clarification during subsequent interviews. The largest group (11 of 36; 31%) reported “moderate implementation.” Eight institutions (22%) included EPA reporting (“moderate implementation with reporting,” “progressive implementation with reporting,” or “full implementation with reporting”) as a component of EPA implementation on their campuses. Nine institutions (25%) reported either “foundational” or “slight” implementation.
When asked in what year(s) their institution provides instruction addressing the knowledge, skill, and attitudinal competencies that underlie entrustability for each EPA, institutions reported a relatively large volume of preclinical teaching (Year 1 and 2 averages) directed toward EPA 1 (69 of 72; 96%) and a relatively small volume focused on EPAs 4 (29 of 72; 40%), 8 (24 of 72; 33%), 10 (37 of 72; 51%), 11 (30 of 72; 42%), and 13 (29 of 72; 40%). In the clerkship years (Year 3 and 4 averages), institutions reported the delivery of a larger amount of instruction related to the competencies underlying EPAs 5 (64 of 72; 89%) and 6 (65 of 72; 90%). Self-reported implementation data are shown in Table 1.
Table 1.
Entrustable Professional Activity (EPA) Instruction in the Preclinical and Clinical Years Provided by Colleges of Osteopathic Medicine (n=72)

EPA | Program years* | No. of institutions providing instruction (%) | No. of institutions not providing instruction (%)
EPA 1 | Y1/Y2 average | 69 (95.8) | 3 (4.2)
EPA 1 | Y3/Y4 average | 53 (73.6) | 19 (26.4)
EPA 2 | Y1/Y2 average | 61 (84.7) | 11 (15.3)
EPA 2 | Y3/Y4 average | 58 (80.6) | 14 (19.4)
EPA 3 | Y1/Y2 average | 52 (72.2) | 20 (27.8)
EPA 3 | Y3/Y4 average | 61 (84.7) | 11 (15.3)
EPA 4 | Y1/Y2 average | 29 (40.3) | 43 (59.7)
EPA 4 | Y3/Y4 average | 60 (83.3) | 12 (16.7)
EPA 5 | Y1/Y2 average | 55 (76.4) | 17 (23.6)
EPA 5 | Y3/Y4 average | 64 (88.9) | 8 (11.1)
EPA 6 | Y1/Y2 average | 46 (63.9) | 26 (36.1)
EPA 6 | Y3/Y4 average | 65 (90.3) | 7 (9.7)
EPA 7 | Y1/Y2 average | 51 (70.8) | 21 (29.2)
EPA 7 | Y3/Y4 average | 60 (83.3) | 12 (16.7)
EPA 8 | Y1/Y2 average | 24 (33.3) | 48 (66.7)
EPA 8 | Y3/Y4 average | 54 (75.0) | 18 (25.0)
EPA 9 | Y1/Y2 average | 60 (83.3) | 12 (16.7)
EPA 9 | Y3/Y4 average | 58 (80.6) | 14 (19.4)
EPA 10 | Y1/Y2 average | 37 (51.4) | 35 (48.6)
EPA 10 | Y3/Y4 average | 60 (83.3) | 12 (16.7)
EPA 11 | Y1/Y2 average | 30 (41.7) | 42 (58.3)
EPA 11 | Y3/Y4 average | 57 (79.2) | 15 (20.8)
EPA 12 | Y1/Y2 average | 56 (77.8) | 16 (22.2)
EPA 12 | Y3/Y4 average | 60 (83.3) | 12 (16.7)
EPA 13 | Y1/Y2 average | 29 (40.3) | 43 (59.7)
EPA 13 | Y3/Y4 average | 50 (69.4) | 22 (30.6)

* Preclinical indicates Year 1 (Y1) and Year 2 (Y2) combined; clinical indicates Year 3 (Y3) and Year 4 (Y4) combined.

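Because Table 1 reports denominators of 72 for a 36-school sample, a brief worked example may help. Under one plausible reading of the footnote, each school answered separately for each program year within the preclinical (Y1, Y2) and clinical (Y3, Y4) pairs, so each row pools 36 × 2 = 72 year-level responses; this is an inference from the footnote, not an explicitly stated calculation.

```python
# One plausible reading of Table 1's n=72 denominator (an inference from the
# footnote, not a published calculation): 36 schools x 2 program years per
# row = 72 year-level responses.
SCHOOLS = 36
responses_per_row = SCHOOLS * 2   # Y1+Y2 (or Y3+Y4) pooled -> 72

epa1_y1y2_yes = 69                # EPA 1, Y1/Y2 row from Table 1
pct = 100 * epa1_y1y2_yes / responses_per_row
print(f"EPA 1, Y1/Y2: {epa1_y1y2_yes}/{responses_per_row} = {pct:.1f}%")  # 95.8%
```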
Given the variation noted in the survey responses across campuses, interviews were conducted with the 16 osteopathic medical schools whose reported EPA implementation included reporting elements or that categorized themselves at “moderate implementation with reporting” or higher. These schools were mostly private (13 of 16; 81%) and were fairly evenly distributed among legacy schools founded between 1900 and 1980 (5 of 16; 31%), schools started between 1981 and 2000 (3 of 16; 19%), schools founded between 2001 and 2010 (4 of 16; 25%), and schools begun between 2011 and 2019 (4 of 16; 25%). During the interviews, campus representatives elaborated on their definitions of implementation and described initiatives undertaken in support of EPA-related teaching and assessment. Table 2 compares each school's original survey response with the EPA implementation category determined through interview follow-up, along with the distribution by type and age of COM.
Table 2.
Institutional Classification of EPA Implementation Pre- and Postinterview (n=16)

Original survey response | Postinterview categorization | No. of institutions | Type of institution | Age of institution
Full implementation with reporting | Slight implementation | 1 | Private | Early 2000s
Full implementation with reporting | No implementation | 1 | Private | Late 2000s
Full implementation | Slight implementation | 1 | Private | Late 2000s
Progressive implementation with reporting | Moderate implementation | 1 | Private | Late 1900s
Progressive implementation with reporting | Foundational implementation | 1 | Private | Late 1900s
Progressive implementation with reporting | No implementation | 1 | Public | Legacy
Progressive implementation | Progressive implementation | 2 | 1 Public, 1 Private | Legacy; Late 2000s
Progressive implementation | Foundational implementation | 1 | Public | Legacy
Progressive implementation | Slight implementation | 4 | 4 Private | Legacy (2); Early 2000s (2)
Moderate implementation with reporting | Foundational implementation | 3 | 3 Private | Late 1900s; Early 2000s; Late 2000s

Abbreviation: EPA, entrustable professional activity.

Figure 4 represents the EPA implementation results for all 36 osteopathic medical schools after synthesizing the information received from both the surveys and the follow-up interviews.
Figure 4.
Summary of entrustable professional activities implementation results for 36 colleges of osteopathic medicine.
Postinterview results showed that schools were roughly evenly distributed across the “foundational implementation” (10 of 36; 28%), “slight implementation” (11 of 36; 31%), and “moderate implementation” (11 of 36; 31%) categories, with a few schools indicating “no implementation” (2 of 36; 5%) or “progressive implementation” (2 of 36; 5%). The reasons institutions gave for their initial survey classifications varied: some selected a classification level based on what they planned to do rather than on what they were actually doing, while others listed practices not fully aligned with the core EPA guiding principles. One institution also misinterpreted the word “reporting,” assuming it meant grade reporting rather than the reporting of entrustment levels.
Institutions noted success in EPA implementation related to mapping within the curricula, providing professional development, offering faculty support, and developing innovative solutions to EPA assessment (eg, EPA mobile application; clinical distinction experience). Many of the schools mentioned curricular change initiatives launched in support of planning toward EPA assessment. 
Participants’ comments on successes included: 
  • ■ “Mapping the curriculum to the EPAs”
  • ■ “Mapping project to capture where/when/at what level we are teaching and assessing the EPAs in our curriculum”
  • ■ “Independently mapp[ing] EPAs 3 and 12 for all of our [program] years 1 and 2 courses”
  • ■ “Embedd[ing] them in current teaching and simulation, which has increased collaborative activities”
  • ■ “Develop[ing] an app for use in [program] years 3 and 4”
  • ■ “Faculty development sessions with all faculty, giving faculty the opportunity to contribute to the development and implementation process”
  • ■ “Strong faculty support”
  • ■ “Our [program] year 3, 8-week clinical distinction experience”
The resources most often reported as useful were guiding documents from AACOM and the Association of American Medical Colleges (AAMC),7,8 the AAMC Core EPA Listserv, various journal publications, professional presentations, and workshops. Medical school administrators also mentioned that experienced faculty and staff within their institutions shared EPA knowledge and experience with the campus community in support of EPA development. Respondents identified barriers to EPA implementation in 4 areas: institutional buy-in, faculty development, time, and resources. Specifically, institutions noted the following challenges:
  • ■ “More buy-in with faculty and administration”
  • ■ “Time and faculty knowledge and buy-ins of the EPAs”
  • ■ “Faculty knowledge and buy in”
  • ■ “Asking attendings to assess students over and above the rotation evaluation when they are already very pressed for time”
  • ■ “Preceptor resistance to increased number of evaluation items”
  • ■ “Reluctance of preceptors/assessors to take on what they consider additional/‘one more thing’”
  • ■ “[Adding] another layer of our assessment”
  • ■ “Manpower to do the assessments and observations”
  • ■ “Getting preceptors on board with and trained to assess”
  • ■ “Extensive faculty development required”
  • ■ “Time and faculty development”
  • ■ “We need more faculty development”
  • ■ “Difficult to engage preceptors in faculty development activities around the EPAs”
  • ■ “Lack of information and understanding, time, prioritization”
  • ■ “Time! Resources for data management”
  • ■ “Limited by technology”
When asked to suggest resources that could be developed in support of implementation, respondents mentioned an EPA implementation guide, a showcase of best practices, faculty development resources, and assessment tools. 
When asked to indicate future AACOM annual conference topics of most interest, respondents' average interest rating on a scale of 1 (low) to 5 (high) was highest for “clinical teaching and assessment” (mean, 4.52), followed closely by “faculty development” (mean, 4.38; Figure 5). Faculty development was also noted as a need during the follow-up interviews, with 10 of 16 institutions (63%) indicating either no EPA-related faculty development or limitations in that area.
Figure 5.
Summary of responses from colleges of osteopathic medicine regarding conference topics of most interest related to entrustable professional activities implementation.
Discussion
Most of the COMs included in this survey evaluation were in the early stages of EPA implementation, with emphasis varying by program year in terms of the specific EPAs addressed. Many schools appeared engaged in curricular change efforts that will support the advancement of EPA use within their institutions. Faculty development was identified, through both the survey and postinterviews, as a continued critical need for a majority of institutions. 
Some differences noted in the survey/interview responses appear related to the implementation of initiatives not truly in alignment with EPA guiding principles. This resulted in a few of the institutions initially surveyed overrating their implementation level, with responses clarified during the interview process. Misinterpretations fell into 4 thematic categories. First, several schools implemented competency-based rubrics or case logs that were then aligned with specific EPAs “after the fact,” meaning that they were inferring entrustability rather than measuring the activity as a whole. It will be important for educators to note that while EPAs consist of multiple competencies, these competencies can change depending on the clinical context, complexity of the task, patient case, and more. True EPA assessment is holistic, with all elements assessed in an integrated fashion.9-12 Second, several schools mentioned assessing students at a single point in time or on just 1 clerkship rotation. Best practices in EPA assessment emphasize measurement across “a trajectory to independence,” involving observations conducted across multiple contexts, difficulty levels, and clinical encounter durations.9,13-14 Third, several schools mentioned using assessment instruments built on Likert or performance quality scales rather than entrustment scales. Others focused on the competencies underlying the EPAs and disregarded elements of trustworthiness deemed essential to the EPA assessment process. Institutions should reference the extant literature, which indicates that EPA assessment should focus on achievement of independence as measured by trained observers who rate the level of supervision required at that point in training, rather than assessing decision-making based solely on knowledge, skill, or attitudinal competencies. Such assessment provides an indication of readiness for entrustment by including assessment of additional elements of trustworthiness (reliability, integrity, and humility), risks and benefits, and trust propensity.9-12,14-19 Fourth, EPA assessment requires significant faculty development and resources, but several schools reported a lack of resources and expressed concerns that assessors were completing EPA assessments for students without appropriate training. Faculty development is needed to ensure that assessors understand the difference between competency assessment and EPA assessment; faculty also need shared mental models of what constitutes entrustment at the level of entering residency. Faculty development can also ensure buy-in and reproducibility over time.15,17,20-23
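To illustrate the third theme, the sketch below contrasts a generic performance-quality Likert scale with a supervision-based entrustment scale of the kind the EPA literature describes; the level wording is paraphrased from commonly cited supervision frameworks (eg, ten Cate9) rather than taken from any specific instrument, and the labels are illustrative only.

```python
# Illustrative contrast (not from this study) between a performance-quality
# Likert scale and a supervision-based entrustment scale. Level wording is
# paraphrased from commonly cited supervision frameworks, not authoritative.
LIKERT_QUALITY = {
    1: "poor", 2: "fair", 3: "good", 4: "very good", 5: "excellent",
}  # answers "how well was this task performed?"

SUPERVISION_LEVELS = {
    1: "observe only; not yet allowed to perform the activity",
    2: "perform with direct, proactive supervision in the room",
    3: "perform with indirect supervision readily available",
    4: "perform unsupervised (entrusted)",
    5: "may supervise more junior learners",
}  # answers "how much supervision does this trainee still need?"

# An entrustment rating is a prospective supervision decision, which is why
# a quality rubric aligned to an EPA "after the fact" is not equivalent.
for level, meaning in SUPERVISION_LEVELS.items():
    print(level, "-", meaning)
```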
This pilot study addresses an important gap in the literature by exploring the current status of EPA implementation at osteopathic medical schools in the US. It also represents a first attempt at assembling evidence-based suggestions for support of EPA-related planning across campuses through use of a multi-stage, mixed methods study design. The results reveal interesting variation in the types of implementation efforts undertaken and resources found beneficial by campus leadership. These findings should prove particularly helpful in identifying variables for consideration when designing EPA-related faculty development, resources, and support services. 
Limitations
This study was limited to the responses of 36 osteopathic medical schools (mostly main campuses) and, thus, does not represent the voice of all COMs, branch campuses, or additional locations in the US. It also does not represent the COMs that were not fully accredited at the time of the survey. Targeted research to explore differences in patterns of EPA implementation across newer versus established schools is still needed. Additionally, curricula and assessment at the 36 schools reviewed may overlap in unique ways beyond the scope of this study (eg, independent schools vs branch campuses vs schools with additional locations). Relying on liaisons selected by their Deans to represent an institution may also have limited input from the full range of constituents at each institution. The interviews were conducted by researchers from the EPA steering committee, which may have introduced some bias even within the guided interview questions. Finally, this study focused on implementation only.
The EPA steering committee has yet to fully investigate the reporting of EPAs or the use of EPAs as a “handover” to graduate medical education; the AACOM EPA steering committee will continue this work moving forward. The full results of the survey will take multiple articles to summarize, and the committee will provide additional reports in future articles and in workshops at the AACOM annual conference. In particular, the steering committee is interested in examining the EPAs individually to identify specific needs and issues for each EPA. By doing so, we may be able to identify more targeted suggestions and resources for areas such as faculty development or assessment. As a direct outcome of this study, the AACOM EPA steering committee will continue to provide targeted resources, workshops, poster sessions at the AACOM annual conference, and communications to support implementation efforts and highlight methods that osteopathic medical schools are using to move this initiative forward.
Conclusion
COMs are proceeding down a deliberate path toward understanding and implementing the Core EPAs for Entering Residency. As expected, the level of implementation varies from early adopters developing assessment tools to schools with no implementation at this time. The initial survey produced some misclassifications that required clarification; however, upon follow-up, themes emerged that elucidated both the national status of this educational movement and important needs for continuing progress. The AACOM EPA steering committee will use this information to plan programming and tools for dissemination to all schools, with the goal of ultimately better preparing COM graduates for entering residency programs.
Acknowledgements
The authors acknowledge the American Association of Colleges of Osteopathic Medicine for its vision and support in advancing the EPA initiative nationally. The authors thank the EPA liaisons who took time to respond to the surveys and follow-up questions as well as all faculty who are working to improve the educational process for osteopathic medical graduates.
References
ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5(1):157-158. doi: 10.4300/JGME-D-12-00380.1
Association of American Medical Colleges (AAMC). Core entrustable professional activities for entering residency: curriculum developers' guide; 2017. https://www.aamc.org/system/files/c/2/484778-epa13toolkit.pdf Accessed August 26, 2020.
American Association of Colleges of Osteopathic Medicine (AACOM). Osteopathic considerations for core entrustable professional activities (EPAs) for entering residency; 2016. https://www.aacom.org/docs/default-source/med-ed-presentations/core-epas.pdf?sfvrsn=10 Accessed August 26, 2020.
Basehore PM, Mortensen LH, Katsaros E, et al. Entrustable professional activities for entering residency: establishing common osteopathic performance standards in the transition from medical school to residency. J Am Osteopath Assoc. 2017;117(11):712-718. doi: 10.7556/jaoa.2017.137
Linsenmeyer M, Wimsatt L, Speicher M, Powers J, Miller S, Katsaros E. Assessment considerations for core entrustable professional activities for entering residency. J Am Osteopath Assoc. 2018;118(4):243-251, e16-e21. doi: 10.7556/jaoa.2018.049
Creswell JW, Plano Clark VL. Designing and Conducting Mixed Methods Research. 1st ed. Sage Publications; 2006.
Association of American Medical Colleges. Core Entrustable Professional Activities for Entering Residency: Curriculum Developers' Guide. Association of American Medical Colleges; 2014. https://www.aamc.org/initiatives/coreepas/publicationsandpresentations/. Accessed July 6, 2019.
American Association of Colleges of Osteopathic Medicine. Osteopathic Considerations for Core Entrustable Professional Activities (EPAs) for Entering Residency. American Association of Colleges of Osteopathic Medicine; 2016. http://www.aacom.org/docs/default-source/med-ed-presentations/core-epas.pdf?sfvrsn=20. Accessed July 6, 2019.
ten Cate O, Chen HC, Hoff RG, Peters H, Bok H, van der Schaaf M. Curriculum development for the workplace using entrustable professional activities (EPAs): AMEE Guide No. 99. Med Teach. 2015;37(11):983-1002. doi: 10.3109/0142159x.2015.1060308
ten Cate O. Entrustment as assessment: recognizing the ability, the right, and the duty to act. J Grad Med Educ. 2016;8(2):261-262. doi: 10.4300/JGME-D-16-00097.1
ten Cate O. Entrustment decision-making in competency-based teaching and assessment in health professions education. Med Sci Educ. 2016;26(suppl 1):5-7. doi: 10.1007/s40670-016-0342-8
ten Cate O. Trust, competence, and the supervisor's role in postgraduate training. BMJ. 2006;333(7571):748-751. doi: 10.1136/bmj.38938.407569.94
ten Cate O, Hart D, Ankel F, et al. Entrustment decision making in clinical training. Acad Med. 2016;91(2):191-198. doi: 10.1097/ACM.0000000000001044
ten Cate O. A primer on entrustable professional activities. Korean J Med Educ. 2018;30(1):1-10. doi: 10.3946/kjme.2018.76
Meyer EG, Chen HC, Uijtdehaage S, Durning SJ, Maggio LA. Scoping review of entrustable professional activities in undergraduate medical education. Acad Med. 2019;94(7):1040-1049. doi: 10.1097/ACM.0000000000002735
Peters H, Holzhausen Y, Boscardin C, ten Cate O, Chen HC. Twelve tips for the implementation of EPAs for assessment and entrustment decisions. Med Teach. 2017;39(8):802-807. doi: 10.1080/0142159X.2017.1331031
Lomis K, Amiel JM, Ryan MS, et al. Implementing an entrustable professional activities framework in undergraduate medical education: early lessons from the AAMC core entrustable professional activities for entering residency pilot. Acad Med. 2017;92(6):765-770. doi: 10.1097/ACM.0000000000001543
ten Cate O. Managing risks and benefits: key issues in entrustment decisions. Med Educ. 2017;51(9):879-881. doi: 10.1111/medu.13362
ten Cate O. Trusting graduates to enter residency: what does it take? J Grad Med Educ. 2014;6(1):7-10. doi: 10.4300/JGME-D-13-00436.1
Calaman S, Hepps JH, Bismilla Z, et al. The creation of standard-setting videos to support faculty observations of learner performance and entrustment decisions. Acad Med. 2016;91(2):204-209. doi: 10.1097/ACM.0000000000000853
Lamba S, Wilson B, Natal B, Nagurka R, Anana M, Sule H. A suggested emergency medicine boot camp curriculum for medical students based on the mapping of core entrustable professional activities to emergency medicine level 1 milestones. Adv Med Educ Pract. 2016;7:115-124. doi: 10.2147/AMEP.S97106
Winn AS, Marcus CH, Sectish TC, et al. Association of American Medical Colleges core entrustable professional activities for entering residency. Acad Med. 2016;91(11):S13. doi: 10.1097/ACM.0000000000001369
Favreau MA, Tewksbury L, Lupi C, et al. Constructing a shared mental model for faculty development for the core entrustable professional activities for entering residency. Acad Med. 2017;92(6):759-764. doi: 10.1097/ACM.0000000000001511