Original Contribution  |   February 2019
Assessing Competency in Family Medicine Residents Using the Osteopathic Manipulative Medicine Mini-Clinical Evaluation Exercise
Author Notes
  • From the A.T. Still University School of Osteopathic Medicine in Mesa, Arizona (Drs LeBeau, Morgan, and Heath); the Wright Center for Graduate Medical Education National Family Medicine Residency program in Scranton, Pennsylvania (Dr LeBeau); and Research Support at A.T. Still University in Kirksville, Missouri (Ms Pazdernik).  
  • Financial Disclosures: None reported.  
  • Support: This study was supported by a grant from the American Association of Colleges of Osteopathic Medicine and the Osteopathic Heritage Foundation.  
  •  *Address correspondence to Lawrence LeBeau, DO, A.T. Still University School of Osteopathic Medicine in Arizona, 5850 E Still Circle, Mesa, AZ 85206-3618. Email: llebeau@atsu.edu
     
Article Information
The Journal of the American Osteopathic Association, February 2019, Vol. 119, 81-88. doi:10.7556/jaoa.2019.013
Abstract

Context: The Mini-Clinical Evaluation Exercise (Mini-CEX) is one example of a direct observation tool used for workplace-based skills assessment. The Mini-CEX has been validated as a useful formative evaluation tool in graduate medical education. However, no Mini-CEX has been reported in the literature that specifically assesses the osteopathic manipulative medicine (OMM) skills of family medicine residents. Therefore, the authors created and studied an OMM Mini-CEX to fill this skills assessment gap.

Objective: To determine whether the OMM Mini-CEX is perceived as an effective evaluation tool for assessing the OMM core competencies of family medicine residents.

Methods: Faculty and residents of The Wright Center for Graduate Medical Education National Family Medicine Residency program participated in the study. Each resident was evaluated at least once using the OMM Mini-CEX. Surveys were used to assess faculty and resident perceptions of the usefulness and effectiveness of the OMM Mini-CEX for assessing OMM competencies.

Results: Eighty-one responses were received during 2 survey cycles within a 7-month period. The internal consistency of the survey responses had a high reliability (Cronbach α=0.93). Considering respondents who agreed that they had a clear understanding of the general purpose of a Mini-CEX, the perceived effectiveness score for the OMM Mini-CEX was higher among those who agreed that a Mini-CEX was a useful part of training than among those who disagreed or were unsure of its usefulness (median score, 4.0 vs 3.4, respectively; P=.047).

Conclusions: The results suggest the OMM Mini-CEX can be a useful direct observation evaluation tool to assess OMM core competencies in family medicine residents. Additional research is needed to determine its perceived effectiveness in other clinical specialties and in undergraduate medical education.

Direct observation tools are considered an optimal and essential method for assessing the competency of medical learners.1-4 The American Board of Internal Medicine developed the Mini-Clinical Evaluation Exercise (Mini-CEX) as a direct observation tool whereby the assessor observes a trainee during a single clinical experience and then provides immediate feedback on the trainee's performance.5,6 In a traditional Clinical Evaluation Exercise, residents are evaluated as they perform a complete history and physical examination to reach diagnostic and therapeutic conclusions, in a process that takes about 2 hours. With the Mini-CEX, the resident is observed during a focused patient encounter and is then evaluated and given immediate feedback in a process that takes about 15 to 20 minutes.7 Because the Mini-CEX is shorter, it can be repeated at least twice throughout the year.7 The Mini-CEX has been validated as a useful formative assessment tool to evaluate trainees.8-12 It has also been adapted for clinical competency assessment by other medical specialties, such as anesthesiology and obstetrics, and as a tool to measure resident performance in patient care transfers, or “handoffs.”13-17 To our knowledge, no Mini-CEX has been developed to assess the osteopathic manipulative medicine (OMM) skills of osteopathic family medicine residents.
Historically, OMM assessment tools were developed for use at the undergraduate level and for family medicine specialty certification. The National Board of Osteopathic Medical Examiners administers the Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Evaluation (COMLEX-USA Level 2-PE), a performance-based clinical skills examination that uses standardized patients. The COMLEX-USA Level 2-PE was created to assess the clinical skills proficiency of osteopathic medical students and is a graduation requirement for all osteopathic medical schools.18 Similarly, the American Osteopathic Board of Family Physicians administers the family medicine certification examination, which includes a performance evaluation. For this evaluation, a team of examiners assesses a candidate's ability to make a diagnosis and demonstrate osteopathic manipulative treatment on a partner candidate with 3 randomly assigned case histories.19 Unlike the Mini-CEX, however, the COMLEX-USA Level 2-PE and the family medicine certification examinations are designed as summative assessment tools rather than formative learning experiences. 
Under the single graduate medical education accreditation system, programs that are accredited by the Accreditation Council for Graduate Medical Education (ACGME) and demonstrate a commitment to teaching and assessing osteopathic principles and practice are eligible to apply for the Osteopathic Recognition designation. Programs with Osteopathic Recognition must “provide learning activities to advance the procedural skill acquisition of osteopathic manipulative medicine.”20 Therefore, OMM skills need to be taught and assessed like any other procedural skill in medicine, which creates the need for assessment tools. Moreover, assessment tools will be especially important to evaluate the readiness and competency development of allopathic physician candidates who are applying to programs with Osteopathic Recognition and have not been previously assessed with the COMLEX-USA Level 2-PE as part of their medical degree requirements. 
The purpose of the current study was to investigate the effectiveness, as perceived by faculty and residents, of the OMM Mini-CEX in assessing OMM core competencies of family medicine residents. American Osteopathic Association (AOA) and ACGME-accredited training programs are required to evaluate 6 core competencies in the training and evaluation of residents: medical knowledge, patient care and procedural skills, interpersonal and communication skills, professionalism, practice-based learning and improvement, and systems-based practice.21,22 In programs seeking Osteopathic Recognition, osteopathic principles and practice must be fully integrated into the 6 core competencies and residents must demonstrate OMM skills appropriate to their medical specialty.20 We hypothesized that an OMM Mini-CEX would be perceived as an effective tool for assessing AOA core competencies as well as ACGME competencies for residency and fellowship programs with Osteopathic Recognition. 
Methods
The Wright Center's pioneering AOA-accredited National Family Medicine Residency (NFMR) program was established in 2013 in partnership with A.T. Still University School of Osteopathic Medicine in Arizona. The purpose of this program is to provide training in the delivery of comprehensive primary care safety-net services at partnering community health centers throughout the country. The study was conducted during the 2015-2016 academic year, and study participants were OMM faculty and residents of The Wright Center's NFMR program. The Wright Center's Institutional Review Board approved the study. Participation was voluntary, and submission of the survey provided consent to participate. Survey responses were anonymous and did not affect overall resident evaluation. During the study, participants included 4 OMM faculty members and 58 family medicine osteopathic graduate medical education (OGME) residents: 17 OGME year 1 (OGME 1), 20 OGME year 2 (OGME 2), and 21 OGME year 3 (OGME 3). 
The NFMR program requires completion of 2 Mini-CEX forms per month to assess each resident's progress; 16 unique Mini-CEX forms can be used to meet this requirement, including breast examinations, neurological assessments, pelvic examinations, psychiatric evaluations, and more. In 2014, a resident leader requested development of an OMM Mini-CEX to help assess OMM skills. Using other Wright Center Mini-CEX forms as a guide, the research team drafted an OMM Mini-CEX and distributed it to OMM faculty at NFMR training sites and at A.T. Still University School of Osteopathic Medicine in Arizona for review and feedback. Through this faculty feedback, content validity was established for relevance and integration of osteopathic principles and practice into the 6 ACGME core competencies.
One key to successful implementation of a Mini-CEX evaluation tool is training faculty in its proper use.12 Similarly, residents may perceive more value in a Mini-CEX if they receive instructions in its proper use.1-3,23 Therefore, all OMM faculty and residents of the current study were trained on the general purpose of Mini-CEX evaluation tools and specifically on the use of the OMM Mini-CEX. The Wright Center NFMR program director traveled to each training site to instruct study participants on the use of a Mini-CEX to evaluate core competencies and provide residents with immediate performance feedback. Faculty and residents were also introduced to the new OMM Mini-CEX and were instructed on its use. Training sessions took place between September and December 2015 during the annual site visits for each location. 
Survey Instrument
To assess the perceived effectiveness of a Mini-CEX in general and of the OMM Mini-CEX specifically, 2 surveys were developed for the current study: 1 for faculty preceptors and 1 for residents. Study information and a link to the online survey (created with SurveyMonkey software) were sent to study participants. Consent to participate was provided by submission of the survey. The surveys were distributed after 2 consecutive semiannual evaluation cycles that corresponded to the first 6 months (cycle 1) and last 6 months (cycle 2) of the academic year. Cycle 1 survey data were gathered from December 15, 2015, to January 15, 2016, and cycle 2 data from June 15 to July 15, 2016.
Both the faculty preceptor and resident surveys contained 8 similarly worded questions. Six of the 8 questions used the Likert scale for responses (with 1 indicating strongly disagree; 2, disagree; 3, unsure; 4, agree; and 5, strongly agree). Survey question 1 asked about training level, and questions 2 and 3 addressed the level of agreement with statements on the general use of a Mini-CEX in residency training. Questions 4 through 8 focused on the specific use of the OMM Mini-CEX. Question 4 asked about the frequency of use of the OMM Mini-CEX, and question 5 about the respondent's comfort level in using it. Question 6 included 7 subitems to assess agreement that the OMM Mini-CEX provided an accurate assessment of OMM competency for each of the 6 ACGME core competencies and for overall clinical competency. Question 7 asked whether using the OMM Mini-CEX provided useful feedback, and question 8 asked whether it was an effective tool for assessing OMM skills. 
Statistical Analysis
Participants were classified into 3 groups according to their responses to survey questions 2 and 3. Group 1 included respondents who strongly disagreed, disagreed, or were unsure both that they had a clear understanding of the purpose of a Mini-CEX and that they found participating in a Mini-CEX to be a useful part of training. Group 2 included those who agreed or strongly agreed that they had a clear understanding of the purpose but strongly disagreed, disagreed, or were unsure that it was useful. Group 3 included those who agreed or strongly agreed both that they had a clear understanding of the purpose and that participating was useful. 
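This grouping rule is deterministic and easy to express in code. As an illustration only (the classification itself was carried out as part of the study's SAS analysis), the following minimal Python sketch shows one faithful implementation; the argument names q2 and q3 are assumptions standing in for the Likert responses to survey questions 2 and 3.

```python
def assign_group(q2, q3):
    """Classify a respondent into group 1, 2, or 3 (as defined in the Methods)
    from Likert responses to survey questions 2 and 3
    (1=strongly disagree, 2=disagree, 3=unsure, 4=agree, 5=strongly agree)."""
    understands = q2 >= 4    # agreed or strongly agreed: clear understanding of the Mini-CEX purpose
    finds_useful = q3 >= 4   # agreed or strongly agreed: the Mini-CEX is a useful part of training
    if understands and finds_useful:
        return 3
    if understands and not finds_useful:
        return 2
    if not understands and not finds_useful:
        return 1
    return None  # "useful but purpose unclear" is not covered by the published group definitions

# Example: agrees the purpose is clear (4) but is unsure of its usefulness (3) -> group 2
assert assign_group(4, 3) == 2
```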
We assigned point values to Likert responses to calculate medians and perform subsequent analyses. Cronbach α values were calculated to assess the internal consistency of the 8 responses for survey questions 6 and 8. We defined the mean score of these 8 items for each participant as the perceived effectiveness score. Within each combination of participant type (faculty or resident) and cycle, we used Kruskal-Wallis tests to assess whether perceived effectiveness scores differed between the 3 defined groups. Post hoc Kruskal-Wallis tests were used with Bonferroni adjustment for multiple comparisons. We further assessed whether perceived effectiveness scores differed significantly from 3.0 (equivalent to “unsure”), using Wilcoxon signed rank tests within the categories of faculty preceptors and residents from cycles 1 and 2, and whether they differed significantly between preceptors and residents from cycles 1 and 2, using Kruskal-Wallis tests. Although the same pool of residents was invited to participate in cycles 1 and 2, we analyzed the data assuming that responses from the 2 cycles were independent, because responses from the same residents were not paired. An α value of .05 (2 tailed when applicable) was used as a cutoff for statistical significance. Frequencies and percentages and medians and interquartile ranges (IQRs) were recorded after pooling of responses from faculty preceptors and from residents for each cycle. We used SAS software (version 9.4; SAS Institute Inc) to conduct the analyses. 
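The analyses themselves were run in SAS; the short Python sketch below is offered only as an illustrative equivalent for readers who want to reproduce the same style of calculation on their own data. The DataFrame column names (q6_1 through q6_7, q8, and group) are assumptions, not the study's variable names.

```python
import pandas as pd
from scipy import stats

ITEM_COLS = [f"q6_{i}" for i in range(1, 8)] + ["q8"]  # the 8 perceived-effectiveness items

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach alpha: k/(k-1) * (1 - sum of item variances / variance of the summed score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

def analyze(df: pd.DataFrame) -> None:
    df = df.copy()
    # Perceived effectiveness score = mean of the 8 Likert items for each participant.
    df["effectiveness"] = df[ITEM_COLS].mean(axis=1)

    print("Cronbach alpha:", round(cronbach_alpha(df[ITEM_COLS]), 2))

    # Kruskal-Wallis test of the effectiveness score across the 3 groups defined from
    # questions 2 and 3 (post hoc pairwise tests would repeat this for each pair of
    # groups at a Bonferroni-adjusted alpha).
    by_group = [g["effectiveness"].to_numpy() for _, g in df.groupby("group")]
    h, p = stats.kruskal(*by_group)
    print(f"Kruskal-Wallis across groups: H={h:.2f}, P={p:.3f}")

    # Wilcoxon signed rank test of whether scores differ from 3.0 ("unsure").
    _, p = stats.wilcoxon(df["effectiveness"] - 3.0)
    print(f"Wilcoxon signed rank vs 3.0: P={p:.3f}")
```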
Results
Four OMM faculty preceptors (67%) and 50 residents (86%; 17 OGME 1, 14 OGME 2, and 19 OGME 3) participated in cycle 1, and 27 residents (47%; 6 OGME 1, 10 OGME 2, and 11 OGME 3) participated in cycle 2. All faculty preceptors were board certified in family medicine and OMM and had more than 4 experiences using the OMM Mini-CEX (Table 1). In the first cycle, 19 residents (38%) had 1 or 2 experiences using the OMM Mini-CEX, 22 (44%) had 3 or more experiences, and 9 (18%) had none. In the second cycle, 6 residents (22%) had 1 or 2 experiences, 19 (71%) had 3 or more, 1 (4%) had none, and 1 did not respond to the question.
Table 1.
Frequency and Percentage of Participants With Previous OMM Mini-CEX Experience
Residents are shown by OGME year; all values are No. (%).
Experience | Faculty Preceptors | Cycle 1: OGME 1 | OGME 2 | OGME 3 | All | Cycle 2: OGME 1 | OGME 2 | OGME 3 | All
No response | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 (9) | 1 (4)
No, and do not plan to use | 0 | 0 | 0 | 1 (5) | 1 (2) | 0 | 0 | 1 (9) | 1 (4)
No, but plan to use in 6 mo | 0 | 1 (6) | 2 (14) | 5 (26) | 8 (16) | 0 | 0 | 0 | 0
Yes, 1-2 experiences | 0 | 12 (71) | 2 (14) | 5 (26) | 19 (38) | 2 (33) | 3 (30) | 1 (9) | 6 (22)
Yes, 3-4 experiences | 0 | 2 (12) | 4 (29) | 4 (21) | 10 (20) | 3 (50) | 3 (30) | 2 (18) | 8 (30)
Yes, >4 experiences | 4 (100) | 2 (12) | 6 (43) | 4 (21) | 12 (24) | 1 (17) | 4 (40) | 6 (55) | 11 (41)
Total | 4 | 17 | 14 | 19 | 50 | 6 | 10 | 11 | 27

Abbreviations: OGME, osteopathic graduate medical education; OMM Mini-CEX, osteopathic manipulative medicine Mini-Clinical Evaluation Exercise.

Group distributions based on responses to survey questions 2 and 3 are presented in Table 2. The internal consistency of the 8 responses for survey questions 6 and 8 had high reliability overall (Cronbach α=0.93) and within the categories of faculty, residents in cycle 1, and residents in cycle 2 (all Cronbach α>0.89). No differences in perceived effectiveness scores were found between groups for faculty preceptors (P=.26) or residents in cycle 1 (P=.054). For residents in cycle 2, there was evidence of at least one between-group difference (P=.049); the median (IQR) perceived effectiveness score was higher in group 3 (4.0 [4.0-4.3]) than in group 2 (3.4 [3.1-3.9]) (P=.047).
Table 2.
Participants and Median Perceived Effectiveness Scores by Group for Responses to Survey Questions 2 and 3
Participants by Group^a | Participants, No. (%) | Perceived Effectiveness Score, Median (IQR)
Faculty Preceptors | 4 (100) | 3.7 (3.4-4.4)
 Group 1 | 1 (25) | 3.3 (3.3-3.3)
 Group 2 | 1 (25) | 3.6 (3.6-3.6)
 Group 3 | 2 (50) | 4.4 (3.8-5.0)
Residents (Cycle 1) | 41 (100) | 3.6 (3.3-4.0)
 Group 1 | 7 (17) | 3.6 (2.4-3.9)
 Group 2 | 16 (39) | 3.4 (3.1-3.9)
 Group 3 | 18 (44) | 4.0 (3.4-4.0)
Residents (Cycle 2) | 25 (100) | 3.9 (3.4-4.0)
 Group 1 | 5 (20) | 3.9 (3.6-4.0)
 Group 2 | 13 (52) | 3.4 (3.1-3.9)
 Group 3 | 7 (28) | 4.0 (4.0-4.3)

a The groups were defined as follows: group 1 included participants who strongly disagreed, disagreed, or were unsure both that they understood the general purpose of a Mini-Clinical Evaluation Exercise (Mini-CEX) and that they found it useful in training; group 2, those who agreed or strongly agreed that they understood its purpose but strongly disagreed, disagreed, or were unsure that it was useful; and group 3, those who agreed or strongly agreed both that they understood its purpose and that it was useful.

Abbreviation: IQR, interquartile range.

The median perceived effectiveness score for faculty preceptors was 3.7, which did not differ significantly from 3.0 (unsure) (P=.13). The median perceived effectiveness scores for residents were 3.6 in cycle 1 and 3.9 in cycle 2, both differing significantly from 3.0 (unsure) (P≤.001). No significant differences were found between faculty preceptors, residents in cycle 1, and residents in cycle 2 (P=.56). 
The distribution of the levels of agreement for all Likert-scale survey questions is presented in the Figure. The median level of agreement was 4 (agree) for all except 2 questions. Those questions, which concerned the usefulness of the Mini-CEX in residency training and the accuracy of the OMM Mini-CEX in assessing OMM competency in systems-based practice, each had a median agreement of 3 (unsure) (IQR, 2-4 and 3-4; 37% and 47% agreement, respectively). For the 6 core competencies, the highest levels of agreement (median [IQR]) were for assessing professionalism (4 [4-4]; 76% agreement), patient care and procedural skills (4 [3.25-4]; 74% agreement), and interpersonal and communication skills (4 [3-4]; 70% agreement), followed by practice-based learning and improvement (4 [3-4]; 60% agreement) and medical knowledge (4 [3-4]; 54% agreement). Both faculty preceptors and residents agreed that they had a high level of comfort using the OMM Mini-CEX (median [IQR], 4 [4-4]; 86% agreement; 0% disagreement) and thought it provided useful feedback (4 [3-4]; 70% agreement and 10% disagreement); there was less agreement about its overall effectiveness (4 [3-4]; 51% agreement and 19% disagreement).
Figure.
Distribution of levels of agreement for all Likert-scale survey questions. The perceived effectiveness score was calculated as the mean of the responses for the core competency categories and for overall effectiveness. ^a Items that make up the effectiveness scale. Abbreviations: Mini-CEX, Mini-Clinical Evaluation Exercise; OMM, osteopathic manipulative medicine.
Discussion
The Mini-CEX was originally developed by the American Board of Internal Medicine in the 1990s and has since been adapted by other medical specialties to assess clinical competency.13-17 More recently, the ACGME focus on direct observation of clinical skills to reach milestones has motivated residency programs to develop valid tools for measuring competency.17 Seeking ACGME accreditation with Osteopathic Recognition for our AOA-accredited program, we found a lack of direct observation tools for assessing OMM competency. To help fill this gap, we developed the OMM Mini-CEX. 
During the current study, the OMM Mini-CEX was used more than 100 times to assess OMM competency among family medicine residents. Among those who agreed that they understood the purpose of a Mini-CEX, those who also agreed that it was a useful part of their training perceived the OMM Mini-CEX as more effective than those who did not agree on the usefulness of the Mini-CEX. This finding is consistent with others, showing that training faculty and residents in its proper use is key to the successful implementation of a Mini-CEX evaluation tool.1-3,12,23 
In the current study, the resident response rate was lower for cycle 2 (June 2016) than for cycle 1 (January 2016). The lower response rate probably reflected less participation by OGME 3 residents in cycle 2, when they were busy preparing for graduation and employment after residency. Survey fatigue may have been another factor for all residents, because other required program surveys were also due in June. For the current study, we conducted 2 rounds of surveys to capture responses from all the residents, and results from cycle 2 indicated that residents had been evaluated more often using the OMM Mini-CEX than they had during cycle 1. In retrospect, a single survey round in April or May would probably have captured enough data and eliminated the need for a second cycle. 
Among those experienced with the OMM Mini-CEX, residents in both cycles had perceived effectiveness scores toward the agreement end of the scale. Furthermore, all study participants reported a high level of comfort with using the OMM Mini-CEX and stated that it provided useful feedback. There was a good level of agreement for 5 of the 6 core competencies, the exception being systems-based practice. Although systems-based practice is perceived as the most abstract of the competencies, it is foundational to the practice of medicine. At its core, systems-based practice involves understanding complex systems and navigating and improving them to benefit patients.24 The low agreement for systems-based practice in our study made us look more closely at this domain and consider how to measure it through the lens of OMM competency. We also compared the descriptors in the OMM Mini-CEX evaluation tool with the newly published Osteopathic Recognition milestones.20 We particularly focused on the core competency domains for which the OMM Mini-CEX was perceived as less effective by our survey respondents. We found that the descriptors in the OMM Mini-CEX were congruent with those in the published milestones for all domains except systems-based practice, so we revised the wording in the OMM Mini-CEX to clarify this domain.
In late 2016, the research team created version 2 of the OMM Mini-CEX, based on analysis of our survey results and review of the Osteopathic Recognition milestones. The revised version of the OMM Mini-CEX has been in use since January 1, 2017. Faculty and residents consistently rate it as one of the best Mini-CEX tools available for resident evaluation in the NFMR program, with 132 OMM Mini-CEX evaluations completed as of the end of December 2017. 
The current study had several limitations. It was conducted in a single AOA-accredited family medicine residency program, which limits generalizability to other programs and specialties, including programs with ACGME accreditation. Another limitation is the lower response rate for cycle 2, which resulted in a smaller sample size for data analysis. A single survey cycle would probably have provided sufficient data and lessened survey fatigue. 
We assumed that study participants had a clear understanding of how the ACGME core competencies related to osteopathic principles and practice and how the OMM Mini-CEX could be used for assessment. The Osteopathic Recognition milestones were published after we developed the OMM Mini-CEX. Training on the Osteopathic Recognition milestones could provide a framework to help meet resident performance targets for each of the ACGME competencies. 
Conclusion
Direct observation tools are essential for assessing the competency of medical learners. Formative assessment tools that can be used to provide immediate feedback on the trainees’ performance are especially important, and training on their purpose and use is also necessary. The OMM Mini-CEX was designed as a formative assessment tool to evaluate the OMM competency of osteopathic family medicine residents during a single clinical encounter. The results of the current study suggested that it is a useful and effective tool for its designated purpose. Our goal was to satisfy the need for direct observation tools to assess AOA core competencies and the ACGME competencies for residency and fellowship programs with Osteopathic Recognition. Additional research is needed to determine whether the OMM Mini-CEX is applicable to other training specialties, ACGME-accredited residency programs with Osteopathic Recognition, or undergraduate medical training. We invite others to adapt the OMM Mini-CEX to meet their needs or use it as a building block in developing additional tools to assess competency among medical learners. 
Acknowledgments
We thank the American Association of Colleges of Osteopathic Medicine and the Osteopathic Heritage Foundation for their grant support of the study. We also thank Deborah Goggin, MA, ELS, of A.T. Still University for editorial support in reviewing the manuscript.
References
1. Rassbach CE, Blankenburg R. A novel pediatric residency coaching program: outcomes after one year. Acad Med. 2018;93(3):430-434. doi:10.1097/ACM.0000000000001825
2. Holmboe ES. Realizing the promise of competency-based medical education. Acad Med. 2015;90(4):411-413. doi:10.1097/ACM.0000000000000515
3. Iobst WF, Sherbino J, Cate OT, et al. Competency-based medical education in postgraduate medical education. Med Teach. 2010;32(8):651-656. doi:10.3109/0142159X.2010.500709
4. Williams RG, Dunnington GL, Mellinger JD, Klamen DL. Placing constraints on the use of the ACGME milestones: a commentary on the limitations of global performance ratings. Acad Med. 2015;90(4):404-407. doi:10.1097/ACM.0000000000000507
5. Norcini JJ, Blank LL, Arnold GK, Kimball HR. The Mini-CEX (Clinical Evaluation Exercise): a preliminary investigation. Ann Intern Med. 1995;123(10):795-799.
6. Malhotra S, Hatala R, Courneya CA. Internal medicine residents’ perceptions of the Mini-Clinical Evaluation Exercise. Med Teach. 2008;30(4):414-419. doi:10.1080/01421590801946962
7. Norcini JJ, Blank LL, Duffy FD, Fortna GS. The Mini-CEX: a method for assessing clinical skills. Ann Intern Med. 2003;138(6):476-481.
8. Al Ansari A, Ali SK, Donnon T. The construct and criterion validity of the Mini-CEX: a meta-analysis of the published research. Acad Med. 2013;88(3):413-420. doi:10.1097/ACM.0b013e318280a953
9. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review. JAMA. 2009;302(12):1316-1326. doi:10.1001/jama.2009.1365
10. Pelgrim EA, Kramer AW, Mokkink HG, van den Elsen L, Grol RP, van der Vleuten CP. In-training assessment using direct observation of single-patient encounters: a literature review. Adv Health Sci Educ Theory Pract. 2011;16(1):131-142. doi:10.1007/s10459-010-9235-6
11. Durning SJ, Cation LJ, Markert RJ, Pangaro LN. Assessing the reliability and validity of the Mini-Clinical Evaluation Exercise for internal medicine residency training. Acad Med. 2002;77(9):900-904.
12. Liao KC, Pu SJ, Liu MS, Yang CW, Kuo HP. Development and implementation of a Mini-Clinical Evaluation Exercise (Mini-CEX) program to assess the clinical competencies of internal medicine residents: from faculty development to curriculum evaluation. BMC Med Educ. 2013;13:31. doi:10.1186/1472-6920-13-31
13. Weller JM, Jolly B, Misur MP, et al. Mini-Clinical Evaluation Exercise in anaesthesia training. Br J Anaesth. 2009;102(5):633-641. doi:10.1093/bja/aep055
14. Setna Z, Jha V, Boursicot KA, Roberts TE. Evaluating the utility of workplace-based assessment tools for speciality training. Best Pract Res Clin Obstet Gynaecol. 2010;24(6):767-782. doi:10.1016/j.bpobgyn.2010.04.003
15. Brazil V, Ratcliffe L, Zhang J, Davin L. Mini-CEX as a workplace-based assessment tool for interns in an emergency department: does cost outweigh value? Med Teach. 2012;34(12):1017-1023. doi:10.3109/0142159X.2012.719653
16. Cohen SN, Farrant PB, Taibjee SM. Assessing the assessments: U.K. dermatology trainees’ views of the workplace assessment tools. Br J Dermatol. 2009;161(1):34-39. doi:10.1111/j.1365-2133.2009.09097.x
17. Arora VM, Berhie S, Horwitz LI, Saathoff M, Staisiunas P, Farnan JM. Using standardized videos to validate a measure of handoff quality: the Handoff Mini-Clinical Examination Exercise. J Hosp Med. 2014;9(7):441-446. doi:10.1002/jhm.2185
18. COMLEX-USA Level 2-PE. National Board of Osteopathic Medical Examiners website. https://www.nbome.org/exams-assessments/comlex-usa/comlex-usa-level-2-pe/. Accessed December 20, 2018.
19. OMT practical exam. American Osteopathic Board of Family Physicians website. https://certification.osteopathic.org/family-physicians/certification-process/family-medicine/practical-exam/. Accessed December 20, 2018.
20. Osteopathic Recognition Requirements. Chicago, IL: Accreditation Council for Graduate Medical Education; 2018. https://www.acgme.org/Portals/0/PFAssets/ProgramRequirements/801OsteopathicRecognition2018.pdf?ver=2018-02-20-154513-650. Accessed December 12, 2018.
21. The Basic Documents for Postdoctoral Training. Chicago, IL: American Osteopathic Association; 2018.
22. ACGME Common Program Requirements. Chicago, IL: Accreditation Council for Graduate Medical Education; 2017.
23. Weston PS, Smith CA. The use of Mini-CEX in UK foundation training six years following its introduction: lessons still to be learned and the benefit of formal teaching regarding its utility. Med Teach. 2014;36(2):155-163. doi:10.3109/0142159X.2013.836267
24. Guralnick S, Ludwig S, Englander R. Domain of competence: systems-based practice. Acad Pediatr. 2014;14(2 suppl):S70-S79. doi:10.1016/j.acap.2013.11.015