JAOA/AACOM Medical Education  |   December 2018
Moving Toward Milestone-Based Assessment in Osteopathic Manipulative Medicine
Author Notes
  • Financial Disclosures: None reported. 
  • Support: None reported. 
  •  *Address correspondence to Ryan Alan Seals, DO, Department of Family Medicine and Osteopathic Manipulative Medicine, University of North Texas Health Science Center Texas College of Osteopathic Medicine, 3500 Camp Bowie Blvd, Fort Worth, TX 76107-2644. Email: ryan.seals@unthsc.edu
     
Article Information
Medical Education / Graduate Medical Education
The Journal of the American Osteopathic Association, December 2018, Vol. 118, 806-811. doi:https://doi.org/10.7556/jaoa.2018.173
Abstract

Osteopathic medicine is continuing to move toward competency-based education at undergraduate and graduate levels. Competencies and Entrustable Professional Activities (EPAs) have been implemented to guide educators on the skills and abilities that osteopathic medical students and residents should be able to perform as physicians. Unfortunately, many of these skills have not been well described, and the threshold of “competence” or “entrustability” for each of these tasks remains elusive. The author presents an approach to measuring competence in the domain of osteopathic manipulative medicine using a milestone rubric to assess skills related to osteopathic screening, diagnosis, technique, and explanation. This rubric can be applied to all levels of osteopathic training and across many diagnostic and treatment modalities. Clearly defining and assessing the individual skills composing competence in osteopathic manipulative medicine will be increasingly important as medical education continues to evolve and modernize.

For more than 10 years, the osteopathic medical profession has been working toward the adoption of competency-based education. The American Association of Colleges of Osteopathic Medicine (AACOM) and its workgroups have endorsed the use of the American Osteopathic Association's 7 core competencies to guide this approach to osteopathic medical education.1 Competencies are observable characteristics or qualities of graduating students that integrate knowledge, skills, attitudes, and behaviors.2 In 2016, AACOM released the osteopathic core Entrustable Professional Activities (EPAs) to enumerate osteopathic-specific skills and practices deemed essential to entering residency.2 The EPAs are specific observable and measurable tasks that trainees can be entrusted to perform—without supervision—once they have attained a sufficient level of performance, or competence. The purpose of EPAs is to support the transformation of desired competencies into observable behaviors that can be more easily taught and objectively assessed. These EPAs can further be broken down into stepping stones, or milestones, which learners must demonstrate, one step at a time, in their efforts to achieve each targeted competence.3 These recent initiatives (ie, EPAs and milestones) are playing an increasingly important role in molding both undergraduate and graduate medical education, and they continue to be refined.3,4 
How do educators know when a student is competent in a particular task? At what point is the learner entrusted to competently perform the task at hand? These questions have not been fully answered, but they must be addressed to allow consistency in the training and assessment of trainees. This article offers a method for defining specific milestones within the domain of osteopathic manipulative medicine (OMM) that can be used for training to competence in osteopathic diagnosis and treatment skills. 
Osteopathic EPAs
According to EPA 1, osteopathic graduates should be able to “perform a complete and accurate physical exam, including an osteopathic structural exam” and “identify, describe, and document abnormal physical exam findings, including osteopathic structural findings (e.g. somatic dysfunction, TART, etc.).”2 EPA 12 guides expectations for performing procedures (including OMM) by requiring “confidence” and “technical (motor) skills.”2 While these EPAs provide guidelines for tasks that should be performed by osteopathic medical students, they do not give specific criteria that can be used to assess whether a student is able to perform them. Therefore, it is necessary for the developers of osteopathic medical curricula to set clear criteria for assessment to determine a student's level of mastery of a particular EPA. 
The University of North Texas Health Science Center Texas College of Osteopathic Medicine (UNTHSC/TCOM) has structured its curriculum with the understanding that competence is domain-, problem-, and task-specific and that procedural knowledge plays a greater role in the development of competence than declarative knowledge.4 This concept of procedural knowledge is especially important because the EPAs require performance of skills and not just conceptual understanding. To assess the performance of skills such as those specified in the EPAs, the student must “show how” and not merely “know how.”5 To properly apply these principles, we must individually assess students on the specific tasks that make up a particular EPA. For example, within the domain of osteopathic diagnosis and treatment, students are often asked to perform multiple tasks during an OMM competency examination. A student may perform a standing postural examination, diagnose landmark asymmetry in the pelvis, and treat a patient's somatic dysfunction with muscle energy all during a single assessment. From a learning sciences perspective, performing a screening examination, diagnosing somatic dysfunction, and performing muscle energy are independent tasks. One cannot assume that a student who can diagnose and apply treatment to the pelvis will perform equally well on the cervical spine. Thus, screening, diagnosis, and treatment should be independently evaluated for each body region to properly assess a student's ability to perform an osteopathic structural examination and osteopathic manipulative treatment as outlined in the competencies1 and EPAs.2 
Variability exists among osteopathic medical schools in how osteopathic diagnosis and treatment are assessed. However, principles from the learning sciences can be applied to optimize assessment methods. Skill performance can be measured by efficiency and accuracy because experts complete tasks more quickly, with more purposeful movements, and with fewer errors. Additionally, skill quality can be assessed based on the end result.6 Checklist and global assessment approaches have been described and validated for evaluating psychomotor skills in surgeons. Global assessments use more generic components and a scale (such as 0-5) to reflect the level of performance on various tasks, whereas checklists award points for each individual step of a procedure performed.7 In surgical intraoperative assessment, global scales were found to have better interexaminer reliability and were superior to checklists in differentiating experts from novices. Global scales also allowed for better assessment and feedback on various aspects of the procedure by giving learners a rating of how well they performed in each area.7 
At UNTHSC/TCOM, faculty observed problems with a checklist approach to grading in which an overall passing score was based on the summation of points across various skills, including screening, diagnosis, and treatment. Students could have significant deficiencies in one area (eg, diagnosis) yet earn enough points in the other elements to receive a passing grade. An overall passing score based on accumulated points did not accurately reflect the student's ability to perform each skill competently, and it did not allow faculty to declare a student competent in the EPAs because he or she could have areas of deficiency while still receiving a passing grade. 
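As a minimal illustration of this problem, the following sketch contrasts a summed checklist score with a per-skill minimum requirement. The scores, point scales, and thresholds are hypothetical, not UNTHSC/TCOM's actual grading parameters; the point is only that a passing total can mask a deficient skill.

```python
# Hypothetical checklist points earned by a student on one OMM examination.
scores = {"screening": 9, "diagnosis": 3, "treatment": 9}  # points out of 10 each

PASS_TOTAL = 18       # assumed overall passing threshold for the summed score
PASS_PER_SKILL = 7    # assumed minimum acceptable score for each skill

total = sum(scores.values())                      # 21 points
passes_by_total = total >= PASS_TOTAL             # True: the summed score passes
deficient = [skill for skill, pts in scores.items() if pts < PASS_PER_SKILL]

print(f"Summed checklist: {'pass' if passes_by_total else 'fail'} ({total}/30)")
print("Per-skill check:", "pass" if not deficient else f"fail ({', '.join(deficient)})")
# The summed score hides the diagnosis deficiency; assessing each skill
# independently surfaces it, which is the problem the rubric below addresses.
```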
Assessment Rubric
In response to these issues, UNTHSC/TCOM developed milestone-based rubrics using the principles of the global assessment method. This assessment approach uses 4 separate rubrics relating to the 4 aspects of assessing osteopathic skills: screening examination, diagnosis, technique, and explanation. A single combined rubric is shown in the Figure. This rubric can apply to a variety of osteopathic diagnosis and treatment modalities because it focuses on the quality, accuracy, and fluidity of a skill. It also standardizes the assessment process and eliminates the need to create a different assessment tool for each examination. Six levels (level 0 through level 5) were chosen based on psychomotor learning principles,6 on observation of how osteopathic medical students progress from novice to expert, and on the format of other milestone metrics.10 Through a standard-setting process, faculty consensus dictated that levels 1 and 2 best describe the progression of skill development seen during the first 2 years of osteopathic medical school, level 3 is consistent with the skills and abilities that define a student or physician who effectively performs osteopathic diagnosis and treatment on a patient, level 4 is advanced (the level expected of a neuromusculoskeletal medicine/OMM resident), and level 5 is expert. 
Figure.
Proposed assessment rubric for osteopathic manipulative medicine by learner level and skill.13
To be properly graded at a given level, the learner must fulfill all criteria listed in the rubric. For example, during diagnosis of the sacrum, a student may know the steps of the diagnosis and use an appropriate amount of force but would still be graded at level 1 if tissue contact were grossly incorrect (eg, palpating lateral to the posterior superior iliac spine instead of on the sacral sulci). Even if the student obtained the “correct” diagnosis by chance, the rubric identifies that the proper structures were not contacted and, therefore, the skill has not been acquired. This situation can be compared to a student who auscultates for a heart murmur but does not place the stethoscope at the appropriate chest location, indicating that the skill clearly has not been mastered and must be remediated and retested. When assessing treatment technique, it is important to differentiate knowing the steps at a basic level (level 2) from applying the steps efficiently and effectively to achieve a therapeutic response (level 3 or higher). Achieving level 3 is analogous to the student who can listen to the correct areas of the heart and describe the sounds correctly. These distinctions are consistent with our understanding of skill development5,6 and allow learners to see how they can progress from beginner to expert in a step-by-step manner. In this rubric, performing the required steps is necessary but not sufficient to achieve the higher levels. The specific steps should be available to students for learning the skill, but, ultimately, learners either complete the steps in a manner that achieves the appropriate outcome or they do not. 
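The following is a minimal sketch of the “all criteria at a given level” rule described above. The criteria strings are illustrative placeholders rather than the rubric's published wording (see the Figure), and the sketch assumes, for simplicity, that criteria accumulate from lower levels.

```python
# Illustrative criteria only; each level requires ALL listed criteria.
RUBRIC = {
    1: {"knows the steps", "uses appropriate force"},
    2: {"contacts the correct structures", "completes all steps"},
    3: {"performs steps efficiently", "achieves a therapeutic response"},
}

def assign_level(observed: set) -> int:
    """Return the highest level at which every criterion, including those of
    all lower levels, was demonstrated; return 0 if level 1 is not met."""
    achieved = 0
    required = set()
    for level in sorted(RUBRIC):
        required |= RUBRIC[level]
        if required <= observed:
            achieved = level
        else:
            break
    return achieved

# A student who knows the steps and uses appropriate force but palpates the
# wrong structures remains at level 1, even with a "correct" diagnosis.
print(assign_level({"knows the steps", "uses appropriate force"}))  # -> 1
```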
At UNTHSC/TCOM, the minimum performance for any skill is level 2, and the expectation rises to level 3 as students move through their second year. Students who do not reach level 3 are required to attend remediation sessions and are subsequently retested. This method allows learners to meet the expectations for their level while guiding them to improve as they progress in their training. It also facilitates incremental progression in a skill and is consistent with the zone of proximal development and mastery learning theories.8,9 Furthermore, this method ensures that students meet the criteria for competence in this portion of the EPAs once they achieve level 3 in diagnosis and treatment of all body regions. 
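As a brief sketch of that competence criterion, every assessed skill in every body region must reach level 3 before this portion of the EPAs is considered met. The region names and scores below are illustrative only, not an official list.

```python
MINIMUM_LEVEL = 3  # threshold for entrustment in this portion of the EPAs

milestone_scores = {
    ("cervical", "diagnosis"): 3,
    ("cervical", "treatment"): 3,
    ("pelvis", "diagnosis"): 2,    # below threshold -> remediate and retest
    ("pelvis", "treatment"): 3,
}

deficient = [item for item, level in milestone_scores.items() if level < MINIMUM_LEVEL]

if deficient:
    print("Remediation and retesting required for:", deficient)
else:
    print("Level-3 criterion met for diagnosis and treatment in all body regions.")
```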
Standardization and specificity serve as important instructional methods for teaching psychomotor skills; therefore, semiannual faculty meetings are structured to review student examination videos and discuss the elements of the grading rubric. We also review average scores for each grader and have the course director work alongside each grader at least once during an examination session to help improve consistency. Although there are still challenges with maintaining consistency among graders, consensus training has been shown to improve interobserver reliability.11 Furthermore, maintaining the same rubric for all assessments should increase reliability as a result of consistency and repetition. 
Students are introduced to the milestones at the beginning of their training, given examples of each performance level, and given feedback during each laboratory on how they can improve. The best improvement in performance occurs when learners have well-defined goals, specific feedback, and practice opportunities to refine their performance.12 A milestone approach gives learners clear goals for progression and forces repetition and reassessment in areas that do not reach the required level. Specific feedback from an expert is essential to the success of this approach. 
Implications of the Rubric
Valid assessment is imperative for ensuring that learners are competent in the knowledge and skills necessary to practice osteopathic medicine. This rubric takes an important step by applying criteria from the learning sciences, assessing individual tasks, and using a global scale/milestone approach. The next step is to perform studies to validate this instrument's ability to distinguish novices from experts; in the interim, the rubric offers an assessment tool derived from best evidence. The EPAs and competencies provide a good framework, but they fall short in defining the specific elements necessary to become competent in each domain. This rubric bridges that gap and can be modified for use in a variety of skills to better assess and track competency. 
Defining competence in OMM is important for informing physicians who will supervise these students and residents. Specific information on skill competence is valuable for osteopathic or allopathic physicians who do not routinely perform OMM but are supervising osteopathic medical students and residents. Given the move into the single accreditation system for graduate medical education and the opportunity for osteopathic recognition of programs, it is important to provide specific tools and methods for assessing osteopathic skills. This rubric could be used as part of a larger strategy to define and assess osteopathically distinct skills to designate a program as osteopathically recognized. The osteopathic medical profession must be able to clearly define the particular skills and abilities that make osteopathic medicine distinct as we move forward into the single accreditation system. We, as a profession, have an opportunity to expand the influence of osteopathic medicine in graduate medical education, but to do so we must clearly define—and assess—the knowledge and skills that make osteopathic physicians unique in their approach to patient care. 
Conclusion
This milestone-based rubric allows for further delineation of the skills required to competently perform specific tasks in osteopathic diagnosis and treatment as outlined in the osteopathic EPAs. Furthermore, it suggests specific criteria (ie, level 3) that would allow a learner to be entrusted to perform a particular skill unsupervised or to globally perform an EPA if all elements were scored at a level 3. This instrument can also be modified to more specifically assess a variety of other psychomotor skills that physicians routinely perform. Medical educators should continue to strive to create an environment that produces competent physicians by developing clear objectives, instructional strategies, and assessment methods that reliably reflect the knowledge, skills, and attitudes required of medical students and graduates. I hope that this assessment rubric serves as a helpful step in the path toward competency-based medical education. 
References
1. Osteopathic Core Competencies for Medical Students. Chevy Chase, MD: American Association of Colleges of Osteopathic Medicine; 2012. https://www.aacom.org/docs/default-source/core-competencies/corecompetencyreport2012.pdf?sfvrsn=4. Accessed October 22, 2018.
2. Osteopathic Considerations for Core Entrustable Professional Activities (EPAs) for Entering Residency. Chevy Chase, MD: American Association of Colleges of Osteopathic Medicine; 2016. https://www.aacom.org/docs/default-source/med-ed-presentations/core-epas.pdf?sfvrsn=10. Accessed October 22, 2018.
3. Carraccio C, Burke AE. Beyond competencies and milestones: adding meaning through context. J Grad Med Educ. 2010;2(3):419-422. doi:10.4300/JGME-D-10-00127.1
4. Papa FJ, D'Agostino D. Faculty development directed at curricular reforms designed to improve patient outcomes. J Am Osteopath Assoc. 2016;116(11):736-741. doi:10.7556/jaoa.2016.144
5. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(suppl 9):S63-S67.
6. Fried GM, Feldman LS. Objective assessment of technical performance. World J Surg. 2008;32(2):156-160.
7. Vassiliou MC, Feldman LS, Andrew CG, et al. A global assessment tool for evaluation of intraoperative laparoscopic skills. Am J Surg. 2005;190(1):107-113.
8. Sanders D, Welk DS. Strategies to scaffold student learning: applying Vygotsky's zone of proximal development. Nurse Educ. 2005;30(5):203-207.
9. Yudkowsky R, Park YS, Lineberry M, Knox A, Ritter EM. Setting mastery learning standards. Acad Med. 2015;90(11):1495-1500. doi:10.1097/ACM.0000000000000887
10. The Family Medicine Milestone Project. Chicago, IL: Accreditation Council for Graduate Medical Education; 2015.
11. Degenhardt BF, Johnson JC, Snider KT, Snider EJ. Maintenance and improvement of interobserver reliability of osteopathic palpatory tests over a 4-month period. J Am Osteopath Assoc. 2010;110(10):579-586.
12. Ericsson KA. Deliberate practice and acquisition of expert performance: a general overview. Acad Emerg Med. 2008;15(11):988-994. doi:10.1111/j.1553-2712.2008.00227.x
13. Gustowski S, Budner-Gentry M, Seals R. Osteopathic Techniques: The Learner's Guide. New York, NY: Thieme Publishers; 2017.