JAOA/AACOM Medical Education  |   April 2018
Assessment Considerations for Core Entrustable Professional Activities for Entering Residency
Author Notes
  • From the West Virginia School of Osteopathic Medicine in Lewisburg (Dr Linsenmeyer); Des Moines University College of Osteopathic Medicine in Iowa (Dr Wimsatt); Midwestern University Arizona College of Osteopathic Medicine in Glendale (Dr Speicher); Campbell University School of Osteopathic Medicine in Lillington, North Carolina (Dr Powers); Alabama College of Osteopathic Medicine in Dothan (Dr Miller); Western University of Health Sciences College of Osteopathic Medicine of the Pacific in Pomona, California (Dr Katsaros). 
  • Financial Disclosures: None reported. 
  • Support: None reported. 
  •  *Address correspondence to Machelle Linsenmeyer, EdD, West Virginia School of Osteopathic Medicine, 400 Lee St N, Lewisburg, WV 24901-1274. Email: alinsenmeyer@osteo.wvsom.edu
     
The Journal of the American Osteopathic Association, April 2018, Vol. 118, 243-251. doi:10.7556/jaoa.2018.049
Abstract

Context: In the process of analyzing entrustable professional activities (EPAs) for use in medical education, ten Cate and others identified challenges, including the need for valid and reliable EPA assessment strategies.

Objective: To provide osteopathic medical schools with a database of assessment tools compiled from the literature to assist them with the development and implementation of robust, evidence-based assessment methods.

Methods: MEDLINE, ERIC, PubMed, and other relevant databases were searched using MeSH keywords for articles outlining robust, evidence-based assessment tools that could be used in designing assessments for EPAs 1 through 6.

Results: A total of 55 publications were included in the content analysis and reporting. All but 2 of the assessment studies were conducted in an undergraduate or graduate medical education setting. The majority of the 55 articles related to the assessment of competencies affiliated with EPA 2 (16 articles) and EPA 4 (15 articles); only 4 articles focused on EPA 3.

Conclusion: Osteopathic medical schools can use this database of assessment tools to support the development of EPA-specific assessment plans that match the unique context and needs of their institution.

In 2015, the American Association of Colleges of Osteopathic Medicine (AACOM) convened a panel of 13 osteopathic educators to serve on an Entrustable Professional Activities (EPAs) Steering Committee. The goal of the committee was to advance the implementation of EPAs within osteopathic medical education. The committee worked in collaboration with EPA liaisons from 43 osteopathic medical schools and branch campuses, a student representative from the Council of Osteopathic Student Government Presidents, and representatives from the Educational Council on Osteopathic Principles (ECOP) to advance its work nationally. Its first publication, Osteopathic Considerations for Core Entrustable Professional Activities (EPAs) for Entering Residency,1,2 described the distinct osteopathic clinical skills medical students must competently perform on the first day of residency. This reference2 provided descriptions, guidelines, approaches, and the rationale for implementing the 13 Core EPAs using osteopathic approaches. 
As a continuation of AACOM's support of osteopathic medical schools in the EPA movement, the EPA Steering Committee established 4 subcommittees in August 2016 to oversee the development of shared resources in several key areas: (1) faculty development; (2) curriculum; (3) instructional resources; and (4) assessment planning. This article focuses on the work of the EPA Assessment Planning Subcommittee to date. 
Overview
Entrustable professional activities were introduced as a novel outcomes-based model for assessing essential, observable work tasks that define a specialty or profession.3 They are described as tasks observed in daily practice rather than specific knowledge, skills, and attitudes assessed in isolation. For example, an oral case presentation may require excellent skills in eliciting a history, performing a physical examination, determining a differential diagnosis, developing a plan, and communicating with others. While each of these skills can be assessed individually, how they are integrated within the task can vary with the workplace context (eg, outpatient vs emergency department) and a number of other factors. Overall performance on the task, together with all affiliated competencies measured, requires documentation in a variety of patient contexts and settings to qualify as an entrustable activity. 
The key principles underlying EPA assessment and decision making (ie, workplace learning and trust) are generalizable to the continuum of physician training.4 EPAs offer medical educators a workplace-relevant approach to the assessment of physicians-in-training. At the same time, the structuring of EPAs requires the development of new measures capable of addressing the underlying elements of entrustment decision making. Entrustment decisions must be based on clear definitions of trust, specific types and levels of targeted entrustments, and explicit definitions of what each development stage means in terms of moving students from novice learners to physicians capable of providing safe, unsupervised, and professional care.4,5 
In the process of analyzing EPAs for use in medical education, researchers have identified several challenges, including the need for valid and reliable EPA assessment strategies.6,7 Assessment guidelines outlined by ten Cate et al5 call for the use of targeted assessment activities that are consistently observable, measurable, time-limited, and geared toward practice-relevant tasks suitable for entrustment. If assessments are not designed in this manner, validity and reliability may prove challenging.8 
In this article, we provide an overview of existing assessment strategies that can be used as a basis for conceptualizing potential approaches to the assessment of EPAs in osteopathic medical education. The aim is to assist osteopathic medical schools with the development of robust, evidence-based assessment methods and encourage the design and implementation of innovations in EPA assessment. 
Methods
During an initial meeting of the AACOM EPA Assessment Planning Subcommittee in August 2016, the group identified the lack of tools to support assessment planning as a critical gap in resources. Members felt that several types of information could be assembled to assist osteopathic medical schools with the development of robust, evidence-based assessment tools. The group agreed that some of the existing research on clinical assessment could prove valuable in supporting the design and customization of EPA assessment methods initiated across campuses. A decision was made to research the evidence in support of EPA-related assessment tool development and implementation with a focus on EPA 1 through EPA 6 because a higher level of entrustment should be targeted for these EPAs before entry into clinical training for all medical students. The other EPAs, although important, may require a lower level of entrustment or be assessed on a smaller scale before entry into clinical training. Each subcommittee member investigated assessment approaches affiliated with competencies embedded within 2 different EPAs and worked with his or her campus librarian to conduct the search of major publication databases. 
Three overarching questions were used to guide the literature review: 
  • 1. What research evidence exists to support the assessment of core competencies embedded within each of the first 6 EPAs?
  • 2. What are the identified strengths and weaknesses of assessment instruments currently used to assess the competencies embedded in EPA 1 through EPA 6?
  • 3. What opportunities exist for the future development of assessment tools that specifically target the measurement of EPA-related competency attainment?
With advice from the academic librarians, the research team developed a strategy to search MEDLINE, ERIC, PubMed, and other relevant databases. MeSH keywords included sets of EPA-related terms such as “history,” “medical history taking,” “students, medical,” and “educational measurement,” among others (eAppendix 1). Articles with “Entrustable Professional Activities,” “EPAs,” or EPA-related competency wording in the title were also included. All searches were conducted between August 2016 and March 2017. While the scope of the search was primarily focused on the most recent 10-year timeframe, several relevant articles that fell outside this timeframe provided useful information related to EPA 2 and EPA 4 and were therefore included in the results. Articles were excluded from the review if any of the following applied: the article originated from a nonmedical field (eg, veterinary, pharmacy, dentistry); the full text was not accessible; the article focused on how to design and implement an EPA rather than on how to assess learner entrustability on EPA-related competencies; the article assessed curricula or practitioners rather than learners; an English version of the article could not be obtained; the article described solely a self-assessment (not including self-assessments used as part of multisource feedback or an objective structured clinical examination); or the article described only the process used to develop a tool. Editorials, commentaries, interviews, debates, and book reviews were also excluded. The researchers then revisited the searches in January 2018 to update the findings before final publication. 
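For institutions wishing to rerun or extend these searches programmatically, a minimal sketch follows using the public NCBI E-utilities esearch endpoint. The query shown is an abridged adaptation of the EPA 1 strategy in eAppendix 1, and the page size (retmax) is an arbitrary choice; neither is part of the subcommittee's reported protocol.

```python
# Illustrative sketch: running an eAppendix 1-style search against PubMed
# via the NCBI E-utilities esearch endpoint (returns matching PubMed IDs).
import requests

ESEARCH_URL = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

# Abridged version of the EPA 1 strategy from eAppendix 1 (illustrative).
EPA1_QUERY = (
    '("history and physical"[tw] OR "physical examination"[MeSH Terms] '
    'OR "medical history taking"[MeSH Terms]) '
    'AND ("Students, Medical"[MeSH] OR "Education, Medical, Undergraduate"[MeSH]) '
    'AND ("Educational Measurement"[MeSH] OR instrument[tw] OR checklist[All Fields]) '
    'AND (reliability[tw] OR validity[tw] OR "Reproducibility of Results"[MeSH Terms])'
)

def search_pubmed(query: str, retmax: int = 100) -> list[str]:
    """Return a list of PubMed IDs matching the query (first page only)."""
    params = {"db": "pubmed", "term": query, "retmax": retmax, "retmode": "json"}
    resp = requests.get(ESEARCH_URL, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()["esearchresult"]["idlist"]

if __name__ == "__main__":
    pmids = search_pubmed(EPA1_QUERY)
    print(f"EPA 1 search returned {len(pmids)} records on the first page")
```

Reviewing titles and abstracts against the exclusion criteria above would still be done by hand; the script only automates record retrieval.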
The group systematically searched the key journals and publication databases. To ensure consistency and accuracy during data collection, the subcommittee developed a classification system to extract the most relevant information from each article. The categories and categorical elements were selected based on analysis of pertinent literature, conference presentations, and committee discussion. The classification system ultimately included the following categories: assessment category, EPA relevance, affiliated competency domains, level of learner, targeted specialty area(s), skills assessed, assessment type, feedback mechanisms, content validity, internal structure, and type of internal structure (Figure). Articles were not placed into the classification system/database until they had met the selection criteria. A minimal sketch of one such record as a data structure follows. 
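The field names below mirror the classification categories listed above; the types, default values, and enum labels are assumptions for illustration, not the subcommittee's actual database schema.

```python
# Illustrative sketch of an article-classification record as a data structure.
from dataclasses import dataclass, field
from enum import Enum

class LearnerLevel(Enum):
    UME = "undergraduate medical education"
    GME = "graduate medical education"
    BOTH = "UME and GME"
    UNSPECIFIED = "not specified"

@dataclass
class ArticleRecord:
    citation: str
    assessment_category: str           # eg, knowledge test, simulation, rubric
    epa_relevance: list[int]           # which of EPAs 1-6 the article informs
    competency_domains: list[str]      # eg, patient care, medical knowledge
    learner_level: LearnerLevel
    specialty_areas: list[str] = field(default_factory=list)
    skills_assessed: list[str] = field(default_factory=list)
    assessment_type: str = ""
    feedback_mechanisms: list[str] = field(default_factory=list)
    content_validity: bool = False
    internal_structure: bool = False
    internal_structure_type: str = ""  # eg, Cronbach alpha, interrater reliability
```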
Figure.
Classification system to extract the most relevant information from each article on entrustable professional activity (EPA) assessment. Abbreviations: GME, graduate medical education; OMM, osteopathic manipulative medicine; OSCE, objective structured clinical examination; UME, undergraduate medical education.
Although most journals examined in this review clearly stated each article's primary topic in the title, abstract, or key words, every article was reviewed in detail by subcommittee members with expertise in medical education research. Each reviewer (M.L., L.W., M.S., J.P., S.M., and E.K.) undertook a primary review of competencies embedded in 2 different EPAs and independently coded the data using the classification categories. Each reviewer also served as a secondary reviewer, cross-checking the coding of a primary reviewer. Any disagreements were resolved by discussion, consensus, and consultation with a third member of the review team. Finally, the researchers reviewed the Association of American Medical Colleges’ Toolkits for the 13 Core EPAs9 to ensure that key assessment tools were not missed in this compilation. 
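The subcommittee resolved coding disagreements by discussion and consensus rather than by a formal agreement statistic. As a supplementary illustration only, a conventional interrater agreement measure such as Cohen kappa could be used to flag categories needing reconciliation; the codings below are hypothetical.

```python
# Illustrative sketch: quantifying primary vs secondary reviewer agreement
# on one categorical field (not part of the subcommittee's reported method).
from sklearn.metrics import cohen_kappa_score

# Hypothetical codings of the "assessment type" category for 8 articles.
primary   = ["knowledge test", "simulation", "rubric", "simulation",
             "knowledge test", "product evaluation", "rubric", "simulation"]
secondary = ["knowledge test", "simulation", "rubric", "rubric",
             "knowledge test", "product evaluation", "rubric", "simulation"]

kappa = cohen_kappa_score(primary, secondary)
print(f"Cohen kappa = {kappa:.2f}")  # low values flag categories for discussion
```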
Results
A total of 55 articles were included in the content analysis and reporting. All but 1 of the studies were conducted in an undergraduate or graduate medical education setting. Although our review focused on empirical articles, important conceptual and theoretical findings were included in the reporting to enhance understanding of the frameworks that form the basis for the development of EPA assessment tools. 
The majority of articles were related to the assessment of competencies affiliated with EPA 2 (16 articles [29.1%]) and EPA 4 (15 articles [27.3%]). Four articles (7.3%) focused on EPA 3-related competency assessment, 6 (10.9%) on EPA 1, 8 (14.5%) on EPA 5, and 6 (10.9%) on EPA 6. A majority of the articles targeted the assessment of medical students (34 [61.8%]). In contrast, 12 (21.8%) related to the assessment of residents, 7 (12.7%) of both medical students and residents, and 2 (3.6%) did not specify the learner level. 
All but 3 articles addressed 2 or more competency areas, and all but 3 included learner competency related to patient care as a focus of assessment. At least half of the articles targeted interpersonal and communication skills and/or medical knowledge, and slightly fewer than half targeted practice-based learning and improvement. The 2 competencies addressed least often were systems-based practice and professionalism. 
All but 3 articles identified the type of assessment tool used to measure learner competency. Assessment tools most frequently reported in the literature were knowledge tests (n=21), simulation (n=16), and assessment rubrics (n=11). In contrast, product evaluations (n=7), long practice observations (n=3), short practice observations (n=2), and case-based discussions (n=2) were least often identified. The reviewers noted inconsistencies in the reporting of feedback to learners and inconsistencies in the provision of reliability and validity evidence across articles. In most instances, it was not clear how (or whether) postassessment feedback was provided to learners. When the feedback method was mentioned, written feedback was the most commonly used approach. Twenty-three articles included a description of how instrument reliability was determined, and 15 provided validity evidence. 
Although the type of response scale was not itemized in our final report, we noted that the response scales embedded in the various assessment tools varied considerably, with a wide range of behavioral anchors; dichotomous measures (eg, yes/no, performed/did not perform); and agreement, satisfaction, and confidence scales represented. This finding highlighted for the subcommittee the importance of selecting appropriate response scales when the assessment is intended to inform an entrustment decision. Entrustment scales are needed to frame assessment results in terms of the type and designated level of supervision required. In this sense, entrustment scale construction represents a shift from competency-based “person-descriptors” to EPA-based “work-descriptors,” as described by ten Cate et al,6 who suggested using 5 levels of supervision to reflect increasing trust in trainee autonomy. 
The subcommittee encountered several entrustment and co-activity scales commonly found in the undergraduate and graduate medical education literature (Table 1) but not yet widely reported in research on EPA-related competency assessment.4,5,10 Additionally, the toolkits document9 provides a thorough review of scales and suggests modifications to better fit the undergraduate medical education system. As mentioned previously, the content and format of EPAs represent a distinct shift toward more authentic assessment of work-based skills. During the design and development phase, the existing literature can be used to help map out EPAs, as shown in Table 2, using an EPA description and EPA matrix.6,11 A detailed description of all 55 articles included in this review is shown in eAppendix 2. 
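As one illustration of how the Table 1 scale might be operationalized in an assessment database, a minimal sketch follows. The ordered type mirrors the 5 UME levels proposed by Chen et al4; the target-level helper and the example check are assumptions for illustration, not published decision rules.

```python
# Illustrative sketch: the 5-level UME entrustment scale from Table 1 as an
# ordered type, so observed levels can be compared against a target level.
from enum import IntEnum

class EntrustmentLevel(IntEnum):
    """Five UME supervision levels from Table 1 (Chen et al)."""
    NOT_ALLOWED = 1            # not allowed to practice the EPA
    PROACTIVE_SUPERVISION = 2  # only under proactive, full supervision
    REACTIVE_SUPERVISION = 3   # only under reactive/on-demand supervision
    UNSUPERVISED = 4           # allowed to practice the EPA unsupervised
    SUPERVISES_OTHERS = 5      # allowed to supervise others in the EPA

def meets_target(observed: EntrustmentLevel, target: EntrustmentLevel) -> bool:
    """True if the observed level satisfies the level targeted for a transition."""
    return observed >= target

# Hypothetical check: a learner observed at level 3 against a level-3 target.
print(meets_target(EntrustmentLevel.REACTIVE_SUPERVISION,
                   EntrustmentLevel.REACTIVE_SUPERVISION))  # True
```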
Table 1.
Entrustment Scales for UME, GME, and Beyonda
Level of Supervision | UME: Chen et al4 | GME and Beyond: ten Cate et al5 | GME and Beyond: Ottawa Scale10
1 | Not allowed to practice EPA | Is present and observes | “I had to do”
2 | Allowed to practice EPA only under proactive, full supervision | Acts with direct supervision | “I had to talk them through”
3 | Allowed to practice EPA only under reactive/on-demand supervision | Acts with indirect supervision | “I had to prompt them from time to time”
4 | Allowed to practice EPA unsupervised | Acts without supervision | “I needed to be in the room just in case”
5 | Allowed to supervise others in practice of EPA | Provides supervision | “I did not need to be there”

a The UME levels were proposed by Chen et al,4 whereas the scales for GME and beyond are currently used in evaluating learners.

Abbreviations: EPA, entrustable professional activity; GME, graduate medical education; UME, undergraduate medical education.

Table 2.
Components of a Fully Described EPA6
Component: Description
1. Title of the EPA: Should be concise and informative (ie, readily understood). As it reflects only work, it should not be stated as a learning objective or skill, but merely as an activity. Limit to ≤10 words. Use the neutral infinitive tense to avoid the association with individuals (eg, “discharging patients” instead of “discharges a patient”).
2. Specification and limitations: Should clearly list what is and is not included, given the level of the intended trainees. Include the context and targeted transition (eg, entering residency, fellowship, autonomous practice).
3. Most relevant domains of competence: Should relate the EPA to the competency framework used. The domains of competence or competencies of the framework that are most applicable may be mentioned.
4. Required experience, knowledge, skills, attitude, and behavior: Trainees should be aware of what knowledge, skills, and attitudes are expected before they can be trusted to carry out the EPA, to help them prepare for entrustment. It may also be helpful to understand which workplace experiences are considered necessary before entrustment (type of rotation, type of patients, number of procedures).
5. Assessment information sources to assess progress and ground a summative entrustment decision: Supervisors should be aware of which sources of information should be used to determine progress. Sources can be observed behavior or skill at the bedside or at morning report meetings; a skills test; information from colleagues, nurses, and patients; a double-checked procedure; a case-based discussion; and other sources. For trainees as well as supervisors, it is important to state how many times an EPA or its constituent parts must have been observed to enable a summative entrustment decision, and to state who takes the decision. It is highly recommended that multiple staff members sign off on such decisions. Supervisors should feel personal responsibility for these important decisions.
6. Entrustment for which level of supervision is to be reached at which stage of training: The consequence of an entrustment decision is stated as the permission to act under a designated level of supervision (eg, indirect supervision, distant supervision) not generally permitted before that time. Next, it is necessary to state at which transition of training trainees must ultimately master the EPA at the designated level. Graduation should require that all core EPAs of the program be mastered. When building an individual workplace curriculum, it is useful to estimate when a trainee is expected to receive the entrustment decision, based on prior training and expected rotations and experiences.
7. Expiration date: Optional but recommended. Entrustment should lapse if competence for this EPA is not maintained, for example, over 1 to 5 years, depending on the EPA. Revalidation may require a marginal or a more substantive check.

Source: Reproduced with permission from the Association for Medical Education in Europe. Modified from ten Cate O, Chen HC, Hoff RG, Peters H, Bok H, van der Schaaf M. Curriculum development for the workplace using Entrustable Professional Activities (EPAs): AMEE Guide No. 99. Med Teach. 2015;37(11):983-1002. doi:10.3109/0142159X.2015.1060308

Discussion
This literature review provides insight into key structural elements and processes identified as important in the assessment of EPA-related competencies. Taken together, the results indicate that several challenges should be taken into account when designing and assessing EPAs: 
  • Rater variability and reliability: Many factors can influence faculty ratings of learner performance (eg, comparisons with other learners, different ideas about what a numerical value represents, the faculty member's own skills and experiences, context, time, limitations in cognitive processes).7,12-19 These challenges are not unique to the assessment of EPAs. However, educators must remain mindful that EPA assessment scales need to be designed as descriptors, narratives, or “entrustability scales” that help guide evaluators and improve interrater agreement.20 We acknowledge that rater variability and differing perspectives can sometimes be valuable.
  • Clearly defined faculty roles: Defining “appropriate” levels of learner supervision, ensuring the quality of care delivered to patients, and safeguarding learner development can pose challenges if faculty roles are not clearly defined. Faculty must understand and be able to ensure learner competence while also ensuring patient safety, and they must be able to walk the fine line between maintaining quality of care and challenging learners to improve.5,21,22 Success here depends heavily on faculty skills in assessment and feedback.7
  • Assessment should be holistic: When assessing an EPA, faculty need to have a broad picture of the learner that includes competencies and milestones. This approach requires mapping EPAs to these broader elements to develop a shared mental model for entrustment. This approach takes time and buy-in, and it can vary by institutional mission, resource capacity, curricular structure, and programming support for faculty development, among other factors.7
  • Time for assessment: The time available to faculty for engagement in assessment planning, development, and implementation is limited, and this constraint can affect the quality of assessment, feedback, role modeling, and education provided to students.23-27 To avoid delay in the delivery of feedback to learners, EPA assessment methods must be both feasible and useful to the stakeholders involved.23,28
  • Big data and longitudinal assessment: Effective EPA assessment requires numerous observations conducted by multiple raters over time, with follow-up triangulation of the findings undertaken by an individual or group (eg, an assessment committee).29 To this end, a large volume of data must be collected across the learning continuum to limit the impact of contextual variations on overall trainee assessment20 (a minimal aggregation sketch follows this list).
  • Assessment cannot be standardized: Effective EPA assessment relies on the independent observations of numerous medical professionals under widely varying conditions. The aim is to assess how learners can perform in a variety of educational environments/clinical settings as they work with diverse patient populations and fluctuating supervisory expectations.
  • Assessments should collectively measure all aspects of “entrustability”: Assessment outcomes will be used to confirm not only the ability, but also the right and duty, of a trainee to act unsupervised.5
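To illustrate the big data and longitudinal assessment point above, the following is a minimal sketch of how multiple raters' observations might be pooled per learner and EPA for committee review. The column names, example values, and summary statistics are illustrative assumptions, not a published aggregation method.

```python
# Illustrative sketch: pooling supervisor observations so an assessment
# committee can review a trajectory rather than a single rating.
import pandas as pd

observations = pd.DataFrame({
    "learner": ["A", "A", "A", "B", "B"],
    "epa":     [1, 1, 1, 1, 1],
    "date":    pd.to_datetime(["2018-01-10", "2018-02-15", "2018-03-20",
                               "2018-01-12", "2018-03-01"]),
    "level":   [2, 3, 3, 2, 2],   # entrustment level awarded (Table 1 scale)
    "rater":   ["r1", "r2", "r3", "r1", "r4"],
})

summary = (observations.sort_values("date")
           .groupby(["learner", "epa"])
           .agg(n_observations=("level", "size"),
                n_raters=("rater", "nunique"),
                latest_level=("level", "last"),
                median_level=("level", "median")))
print(summary)  # one row per learner-EPA pair for committee triangulation
```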
Many experts advocate for matrix-type assessment planning that maps EPAs to competency frameworks and milestones (matched to appropriate levels of entrustment).6,30 Narrative statements included as descriptors can also be used to guide decisions about entrustment level. These approaches can serve as a mechanism for feedback and reporting by providing a common language or map for discussing learner performance and progress over time. 
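A hypothetical fragment of such an EPA-to-competency matrix is sketched below. The specific pairings are illustrative assumptions drawn loosely from the competency patterns noted in the Results, not a validated mapping from the article.

```python
# Illustrative sketch: a matrix-type plan mapping EPAs to competency domains.
epa_competency_matrix: dict[str, list[str]] = {
    "EPA 1 (gather a history and perform a physical examination)": [
        "patient care", "interpersonal and communication skills"],
    "EPA 2 (prioritize a differential diagnosis)": [
        "patient care", "medical knowledge"],
    "EPA 6 (provide an oral presentation of a clinical encounter)": [
        "patient care", "interpersonal and communication skills"],
}

for epa, domains in epa_competency_matrix.items():
    print(f"{epa}: {', '.join(domains)}")
```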
We noted several limitations of this review. Literature related to the assessment of EPAs is still in its infancy, making it difficult to corroborate findings against previously published research. The lack of an established set of EPA-related search terms necessitated a relatively complex set of search algorithms, which may require refinement as research in this area advances. Although a wide range of articles was located, the diversity of study populations and methodologies represented in the various articles made meaningful comparison of instrument psychometric properties difficult. The results of this review may quickly become outdated as assessment of EPAs progresses and more articles are published; additional searches will be necessary to locate emergent findings. It is also possible that some exclusion criteria (especially the restriction to the medical field) resulted in the oversight of relevant assessments. 
Future EPA assessment subcommittee research will focus on assessment tools affiliated with EPA 7 through EPA 13 and on refining search terms to better identify articles (eg, filtering out articles about specific tests while finding articles about the ability to recommend or interpret appropriate tests for EPA 3). The committee will develop similar MeSH terms for EPA 7 through EPA 13. The MeSH terms developed and used in this review (eAppendix 1) may be useful to institutions interested in continuing to search for and develop EPA-related assessment tools. A continued goal of the subcommittee research will be to create practical tools for medical schools interested in developing assessments that uniquely align with their organizational contexts and settings. 
This article provides a unique frame of reference by summarizing the limited research literature to date. Given that the core set of medical schools charged with piloting the EPAs has neither completed the initial piloting process nor published extensively on EPA assessment tool development, the subcommittee hopes that this information will provide a useful benchmark against which future EPA assessment research and reporting can be measured. 
Conclusion
A review of the extant literature over the past year allowed the AACOM EPA Assessment Planning Subcommittee to identify key gaps in the existing literature, as well as several considerations worth keeping in mind when designing assessments for use in entrustment decision making. In addition to the current findings, educators should note that institutional context will be an important covariate in determining the effectiveness of EPA assessment in osteopathic medical education because each institution has a different mission, curricular resources, and support structures. Therefore, this article may assist institutions by providing a baseline set of assessment tools that can be used to guide the development of EPA-specific assessment plans. Each medical school is urged to consider how the tools presented in this article can be adapted to meet the needs of its learners. The featured articles can also be reframed to address key entrustment decisions using deliberately assigned levels of supervision. Measures should address not only ability but also integrity, reliability, and humility.31 Medical educators may find it useful to implement several of the entrustment scales (Table 1)4,5 and EPA planning tools (Table 2)6,11 as a way of ensuring that all major decision points and assessment components are taken into consideration. By developing multiple measures and making consistent use of entrustment scales, educators will be better able to document evidence of progress toward summative entrustment decisions. 
Acknowledgment
We acknowledge Stephen C. Shannon, DO, MPH, president of the American Association of Colleges of Osteopathic Medicine, for his vision and support in advancing the EPA initiative, as well as Mary Essig, MLS, library director at the West Virginia School of Osteopathic Medicine in Lewisburg; Rebecca Hines, MLKS, education librarian and assistant professor at Des Moines University in Iowa; and Kelli Hines, MLIS, librarian at Western University of Health Sciences in Pomona, California, for their insights, advice, and efforts in the literature search process. 
References
1. American Association of Colleges of Osteopathic Medicine. Osteopathic Considerations for Core Entrustable Professional Activities (EPAs) for Entering Residency. Chevy Chase, MD: American Association of Colleges of Osteopathic Medicine; 2016. http://www.aacom.org/docs/default-source/med-ed-presentations/core-epas.pdf?sfvrsn=20. Accessed February 27, 2018.
2. Basehore PM, Mortensen LH, Katsaros E, et al. Entrustable professional activities for entering residency: establishing common osteopathic performance standards in the transition from medical school to residency. J Am Osteopath Assoc. 2017;117(11):712-718. doi:10.7556/jaoa.2017.137
3. ten Cate O. Entrustability of professional activities and competency-based training. Med Educ. 2005;39(12):1176-1177. doi:10.1111/j.1365-2929.2005.02341.x
4. Chen HC, van den Broek WE, ten Cate O. The case for use of entrustable professional activities in undergraduate medical education. Acad Med. 2015;90(4):431-436. doi:10.1097/ACM.0000000000000586
5. ten Cate O, Hart D, Ankel F, et al. Entrustment decision making in clinical training. Acad Med. 2016;91(2):191-198. doi:10.1097/ACM.0000000000001044
6. ten Cate O, Chen HC, Hoff RG, Peters H, Bok H, van der Schaaf M. Curriculum development for the workplace using entrustable professional activities (EPAs): AMEE Guide No. 99. Med Teach. 2015;37(11):983-1002. doi:10.3109/0142159X.2015.1060308
7. Carraccio C, Englander R, Holmboe ES, Kogan JR. Driving care quality: aligning trainee assessment and supervision through practical application of entrustable professional activities, competencies, and milestones. Acad Med. 2016;91(2):199-203. doi:10.1097/ACM.0000000000000985
8. ten Cate O. Trusting graduates to enter residency: what does it take? J Grad Med Educ. 2014;6(1):7-10. doi:10.4300/JGME-D-13-00436.1
9. Obeso V, Brown D, Phillipi C, et al, eds. Core Entrustable Professional Activities for Entering Residency: Toolkits for the 13 Core EPAs. Washington, DC: Association of American Medical Colleges; 2017. https://www.aamc.org/download/482214/data/epa13toolkit.pdf. Accessed February 26, 2018.
10. Rekman J, Hamstra SJ, Dudek N, Wood T, Seabrook C, Gofton W. A new instrument for assessing resident competence in surgical clinic: the Ottawa Clinic Assessment Tool. J Surg Educ. 2016;73(4):575-582. doi:10.1016/j.jsurg.2016.02.003
11. Chen HC, McNamara M, Teherani A, ten Cate O, O'Sullivan P. Developing entrustable professional activities for entry into clerkship. Acad Med. 2016;91(2):247-255. doi:10.1097/ACM.0000000000000988
12. Govaerts MJB, van der Vleuten CPM, Schuwirth LWT, Muijtjens AMM. Broadening perspectives on clinical performance assessment: rethinking the nature of in-training assessment. Adv Health Sci Educ Theory Pract. 2007;12(2):239-260. doi:10.1007/s10459-006-9043-1
13. Govaerts MJ, Van de Wiel MW, Schuwirth LW, Van der Vleuten CP, Muijtjens AM. Workplace-based assessment: raters’ performance theories and constructs. Adv Health Sci Educ Theory Pract. 2013;18:375-396. doi:10.1007/s10459-012-9376-x
14. Kogan JR, Conforti L, Bernabeo E, Iobst W, Holmboe E. Opening the black box of clinical skills assessment via observation: a conceptual model. Med Educ. 2011;45(10):1048-1060. doi:10.1111/j.1365-2923.2011.04025.x
15. Gingerich A, Regehr G, Eva KW. Rater-based assessments as social judgments: rethinking the etiology of rater errors. Acad Med. 2011;86(10 suppl):S1-S7. doi:10.1097/ACM.0b013e31822a6cf8
16. Gingerich A, Kogan J, Yeates P, Govaerts M, Holmboe E. Seeing the “black box” differently: assessor cognition from three research perspectives. Med Educ. 2014;48(11):1055-1068. doi:10.1111/medu.12546
17. Yeates P, O'Neill P, Mann K, Eva KW. Effect of exposure to good vs poor medical trainee performance on attending physician ratings of subsequent performances [published correction appears in JAMA. 2013;309(3):237]. JAMA. 2012;308:2226-2232. doi:10.1001/jama.2012.36515
18. Govaerts MJ, Schuwirth LW, Van der Vleuten CP, Muijtjens AM. Workplace-based assessment: effects of rater expertise. Adv Health Sci Educ Theory Pract. 2011;16(2):151-165. doi:10.1007/s10459-010-9250-7
19. van der Vleuten CP. When I say … context specificity. Med Educ. 2014;48(3):234-235. doi:10.1111/medu.12263
20. Rekman J, Gofton W, Dudek N, Gofton T, Hamstra SJ. Entrustability scales: outlining their usefulness for competency-based clinical assessment. Acad Med. 2016;91(2):186-190. doi:10.1097/ACM.0000000000001045
21. Kennedy TJT, Lingard L, Baker GR, Kitchen L, Regehr G. Clinical oversight: conceptualizing the relationship between supervision and safety. J Gen Intern Med. 2007;22(6):1080-1085. doi:10.1007/s11606-007-0179-3
22. Teman NR, Gauger PG, Mullan PB, Tarpley JL, Minter RM. Entrustment of general surgery residents in the operating room: factors contributing to provision of resident autonomy. J Am Coll Surg. 2014;219(4):778-787. doi:10.1016/j.jamcollsurg.2014.04.019
23. Klamen DL, Williams RG, Roberts N, Cianciolo AT. Competencies, milestones, and EPAs—are those who ignore the past condemned to repeat it? Med Teach. 2016;38(9):904-910. doi:10.3109/0142159X.2015.1132831
24. Chisholm CD, Whenmouth LF, Daly EA, Cordell WH, Giles BK, Brizendine EJ. An evaluation of emergency medicine resident interaction time with faculty in different teaching venues. Acad Emerg Med. 2004;11(2):149-155.
25. Han H, Roberts NK, Korte R. Learning in the real place: medical students’ learning and socialization in clerkships at one medical school. Acad Med. 2015;90(2):231-239. doi:10.1097/ACM.0000000000000544
26. Williams RG, Dunnington GL. Assessing the ACGME competencies with methods that improve the quality of evidence and adequacy of sampling. ACGME Bull. 2006;19:38-42.
27. van Loon KA, Teunissen PW, Driessen EW, Scheele F. The role of generic competencies in the entrustment of professional activities: a nationwide competency-based curriculum assessed. J Grad Med Educ. 2016;8(4):546-552. doi:10.4300/JGME-D-15-00321.1
28. Williams RG, Chen X, Sanfey H, Markwell S, Mellinger JD, Dunnington G. The measured effect of delay in completing operative performance ratings on clarity and detail of ratings assigned. J Surg Educ. 2014;71(6):e132-e138.
29. Williams RG, Dunnington GL, Mellinger JD, Klamen DL. Placing constraints on the use of the ACGME milestones: a commentary on the limitations of global performance ratings. Acad Med. 2015;90(4):404-407. doi:10.1097/ACM.0000000000000507
30. Pangaro L, ten Cate O. Frameworks for learner assessment in medicine: AMEE Guide No. 78. Med Teach. 2013;35:e1197-e1210. doi:10.3109/0142159X.2013.788789
31. ten Cate O. Entrustment as assessment: recognizing the ability, the right, and the duty to act. J Grad Med Educ. 2016;8(2):261-262. doi:10.4300/JGME-D-16-00097.1
eAppendix 1.
MeSH (Medical Subject Heading) terms: strategies to guide entrustable professional activity (EPA) assessment tool searches for EPAs 1 through 6. aEPA 3 will take more time because diagnostic tests comprise a broad category. 
EPA 1 
(“history and physical”[tw] OR “physical exam”[tw] OR “physical examination”[tw] OR “physical examination”[MeSH Terms] OR “history taking”[tw] OR “medical history taking”[MeSH Terms] OR “interviews as topic”[MeSH Terms]) AND (“Schools, Medical”[MeSH] OR “Students, Medical”[MeSH] OR “Education, Medical, Undergraduate”[MeSH] OR “Clinical Clerkship”[MeSH]) AND (“Curriculum/standards”[MeSH] OR “Educational Measurement”[MeSH] OR instrument[tw] OR rubric[All Fields] OR tool[tw] OR “checklist”[MeSH Terms] OR “checklist”[All Fields]) AND (“Validation Studies”[Publication Type] OR reliability[tw] OR validate[tw] OR validity[tw] OR validation[tw] OR “Reproducibility of Results”[MeSH Terms]) 
EPA 2 
(“diagnosis”[MeSH] OR “Diagnosis, Differential”[MeSH] OR diagnosis[tw] OR “Clinical Examination”) AND (“Schools, Medical”[MeSH] OR “Students, Medical”[MeSH] OR “Education, Medical, Undergraduate”[MeSH] OR “Clinical Clerkship”[MeSH]) AND (“Curriculum/standards”[MeSH] OR “Educational Measurement”[MeSH] OR instrument[tw] OR rubric OR tool[tw] OR checklist) AND (“Validation Studies” [Publication Type] OR reliability[tw] OR validate[tw] OR validity[tw] OR validation[tw] OR “Reproducibility of Results”[MeSH Terms]) 
(“Diagnosis, Differential”[MeSH] OR diagnosis[tw] OR “Clinical Examination”) AND (“Schools, Medical”[MeSH] OR “Students, Medical”[MeSH] OR “Education, Medical, Undergraduate”[MeSH] OR “Clinical Clerkship”[MeSH]) AND (“Curriculum/standards”[MeSH] OR “Educational Measurement”[MeSH] OR instrument[tw] OR rubric OR tool[tw] OR checklist) AND (“Validation Studies” [Publication Type] OR reliability[tw] OR validate[tw] OR validity[tw] OR validation[tw] OR “Reproducibility of Results”[MeSH Terms]) 
EPA 3a 
“Diagnostic Techniques and Procedures”[MeSH] AND (“Schools, Medical”[MeSH] OR “Students, Medical”[MeSH] OR “Education, Medical, Undergraduate”[MeSH] OR “Clinical Clerkship”[MeSH]) AND (“Curriculum/standards”[MeSH] OR “Educational Measurement”[MeSH] OR instrument[tw] OR rubric OR tool[tw] OR checklist) AND (“Validation Studies” [Publication Type] OR reliability[tw] OR validate[tw] OR validity[tw] OR validation[tw] OR “Reproducibility of Results”[MeSH Terms]) AND “humans”[MeSH Terms] 
EPA 4 
(prescribing[tw] OR “Drug Therapy”[MeSH] OR “Medication Errors”[MeSH] OR pharmacology[tw] OR “Pharmacology, Clinical”[MeSH] OR “Pharmacology/education”[MeSH]) AND (“Schools, Medical”[MeSH] OR “Students, Medical”[MeSH] OR “Education, Medical, Undergraduate”[MeSH] OR “Clinical Clerkship”[MeSH]) AND (“Curriculum/standards”[MeSH] OR “Educational Measurement”[MeSH] OR instrument[tw] OR rubric OR tool[tw] OR checklist) AND (“Validation Studies” [Publication Type] OR validate[tw] OR validity[tw] OR validation[tw] OR reliability[tw] OR “Reproducibility of Results”[MeSH Terms]) 
EPA 5 
(Documentation[tw] OR “Health record”[tw] OR “Medical record”[tw] OR “soap note”[tw] OR “computerized provider order entry”) AND (“Schools, Medical”[MeSH] OR “Students, Medical”[MeSH] OR “Education, Medical, Undergraduate”[MeSH] OR “Clinical Clerkship”[MeSH]) AND (“Curriculum/standards”[MeSH] OR “Educational Measurement”[MeSH] OR instrument[tw] OR rubric OR tool[tw] OR checklist) AND (“Validation Studies” [Publication Type] OR validate[tw] OR validity[tw] OR validation[tw] OR reliability[tw] OR “Reproducibility of Results”[MeSH Terms]) 
EPA 6 
(“case presentation” OR “oral presentation” OR “patient presentation”) AND (“Schools, Medical”[MeSH] OR “Students, Medical”[MeSH] OR “Education, Medical, Undergraduate”[MeSH] OR “Clinical Clerkship”[MeSH]) AND (“Curriculum/standards”[MeSH] OR “Educational Measurement”[MeSH] OR instrument[tw] OR rubric OR tool[tw] OR checklist) AND (“Validation Studies” [Publication Type] OR validate[tw] OR validity[tw] OR validation[tw] OR reliability[tw] OR “Reproducibility of Results”[MeSH Terms]) 
eAppendix 2.
Articles to support entrustable professional activity (EPA) assessment tool development (as of January 2018) by EPA. aBrinkman et al mention these measures but no results were shown to verify reliability. Abbreviations: GME, graduate medical education; NA, not available/applicable; MSF, multisource feedback; NI, not included; OSCE, objective structured clinical examination; SOAP, subjective, objective, assessment, plan; SP, standardized patient; UME, undergraduate medical education. 