Medical Education  |   June 2011
Competency-Based Classification of COMLEX-USA Cognitive Examination Test Items
Author Notes
  • From the National Board of Osteopathic Medical Examiners (NBOME) National Center for Clinical Skills Testing in Conshohocken, Pennsylvania. 
  • Address correspondence to Erik Langenau, DO, Vice President for Clinical Skills Testing, the National Center for Clinical Skills Testing, NBOME, 101 W Elm St, Suite 150, Conshohocken, PA 19428-2004. E-mail: elangenau@nbome.org 
Article Information
The Journal of the American Osteopathic Association, June 2011, Vol. 111, 396-402. doi:10.7556/jaoa.2011.111.6.396
Abstract

Context: The Comprehensive Osteopathic Medical Licensing Examination-USA (COMLEX-USA) currently assesses osteopathic medical knowledge via a series of 3 progressive cognitive examinations and 1 clinical skills assessment. In 2009, the National Board of Osteopathic Medical Examiners created the Fundamental Osteopathic Medical Competencies (FOMC) document to outline the essential competencies required for the practice of osteopathic medicine.

Objectives: To measure the distribution and extent to which cognitive examination items of the current series of COMLEX-USA assess knowledge of each of the medical competencies included in the FOMC document.

Methods: Eight graduate medical education panelists with expertise in competency-based assessment reviewed 1046 multiple-choice examination items extracted from the 3 COMLEX-USA cognitive examinations (Level 1, Level 2-Cognitive Evaluation, and Level 3) used during the 2008-2009 testing cycle. The 8 panelists individually judged each item to classify it as 1 of the 6 fundamental osteopathic medical competencies described in the FOMC document.

Results: Panelists made 8368 judgments. The majority of the sample examination items were classified as either patient care (3343 [40%]) or medical knowledge (4236 [51%]). Panelists also reported these 2 competencies as being the easiest to define, teach, and assess. The frequency of medical knowledge examination items decreased throughout the COMLEX-USA series (69%, 43%, 40%); conversely, items classified as interpersonal and communication skills, systems-based practice, practice-based learning and improvement, and professionalism increased throughout the 3-examination series.

Conclusion: Results indicate that knowledge of each of the 6 competencies is being assessed to some extent with the current COMLEX-USA format. These findings provide direction for the enhancement of existing examinations and development of new assessment tools.

Since 1999, when they were first introduced, the medical competencies of the Accreditation Council for Graduate Medical Education (ACGME) and the American Osteopathic Association (AOA) have been integrated into undergraduate and graduate medical education.1,2 The 6 ACGME competencies are patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice. The AOA general competencies mirror those of the ACGME but include a seventh dimension: osteopathic principles and practice (OPP) and osteopathic manipulative treatment (OMT). In 2009, the National Board of Osteopathic Medical Examiners (NBOME) endorsed the Fundamental Osteopathic Medical Competencies (FOMC) document, which integrated OPP and OMT concepts into the original 6 ACGME categories.3 The NBOME is currently reviewing and revising the FOMC document to include each of the 7 competencies as distinctly defined by the AOA. 
Created by a group of experts in osteopathic medical education and assessment, the FOMC document serves as a tool for examination development and future examination assessment. The document defines each of the fundamental osteopathic medical competencies, outlines how physicians may demonstrate competency in each of the 6 categories, and identifies specific observable outcomes and assessment tools that can be used to assess each of these required elements. 
The COMLEX-USA cognitive examination series is a progressive 3-level examination, which is designed to assess the medical knowledge of osteopathic candidates as they work toward fulfilling osteopathic medical licensing requirements.4 The COMLEX-USA cognitive examinations were constructed to measure osteopathic knowledge and skill in 2 areas: clinical presentation (“Dimension I”) and physician task (“Dimension II”). These areas are described in the COMLEX-USA cognitive examination blueprint, which serves as a guide during examination development.4 Major categories of the 2 dimensions can be found in Table 1. 
Table 1.
Proportion of Questions by Topic and Level According to the COMLEX-USA Cognitive Examination Blueprint

Dimension 1: Patient Presentation* (Levels 1, 2-CE, and 3) — Proportion, %
□ Population health concepts and patients with presentations related to health promotion, chronic disease management, and human development: 8-16
□ Patients with presentations related to digestion and metabolism: 4-10
□ Patients with presentations related to cognition, behavior, sensory and central nervous systems, substance abuse, and visceral and sensory pain: 28-38
□ Patients with presentations related to the musculoskeletal system, including somatic pain: 6-12
□ Patients with presentations related to the genitourinary system and human sexuality: 3-8
□ Patients with presentations related to circulation and the respiratory system: 8-16
□ Patients with presentations related to thermoregulation: 2-6
□ Patients with presentations related to trauma, masses, edema, discharge, and the skin, hair, and nails: 8-16
□ Patients with presentations related to pregnancy, the peripartum, and the neonatal period: 3-8
Dimension 2: Physician Tasks* — Proportion, % (Level 1 / Level 2-CE / Level 3)
□ Health promotion and disease prevention: 1-5 / 15-20 / 15-20
□ History and physical examination: 5-15 / 30-40 / 10-20
□ Diagnostic technologies: 1-5 / 10-20 / 15-25
□ Management: 2-7 / 10-20 / 25-40
□ Scientific understanding of health and disease mechanisms: 70-85 / 5-15 / 5-10
□ Healthcare delivery issues: 1-3 / 5-10 / 5-10
 Abbreviation: CE, cognitive evaluation. Source: Copyright 2011 by the National Board of Osteopathic Medical Examiners. Reprinted with permission. All rights reserved.
 *Comprehensive Osteopathic Medical Licensing Examination-USA (COMLEX-USA) computer-based testing.
Because the COMLEX-USA computer-based cognitive examinations were originally created to assess clinical presentation and physician task, it is not known which of the fundamental osteopathic medical competencies included in the FOMC document are being assessed, and to what extent. Knowledge of multiple competencies may be assessed in a single examination item, and classifying existing examination items into distinct competencies may not be straightforward. Several sample questions are provided in the Figure to illustrate this challenge. For example, in sample question 2, is the competency classification patient care or medical knowledge? For sample question 4, is the competency classification practice-based learning and improvement, systems-based practice, patient care, medical knowledge, or some combination of these? 
Multiple-choice examination items, like those used in the cognitive portions of COMLEX-USA, have been used for a long time to assess medical knowledge,5 but can these examination items accurately evaluate knowledge of the other medical competencies? The purpose of the present study was to help elucidate the extent to which the current COMLEX-USA cognitive examinations assess knowledge of each of the fundamental osteopathic medical competencies and identify some of the challenges with categorizing examination items according to competency. Our research questions were as follows: 
  • To what extent do current cognitive examination questions of the COMLEX-USA series measure the competencies described in the FOMC document?
  • What is the distribution of these competencies throughout the 3-part cognitive examination series?
  • How challenging is it for experts to retrofit the COMLEX-USA items into the competency categories described in the FOMC document?
Methods
A sample of 1046 examination items was selected from 3 COMLEX-USA cognitive examination forms used during the 2008-2009 testing cycle. One examination form was taken from each level: Level 1 (n=348), Level 2-Cognitive Evaluation (CE) (n=349), and Level 3 (n=349). Graduate medical education experts independently reviewed and classified each item into 1 of the 6 competencies described in the FOMC document. 
Panelists and Training
Panelists included 1 osteopathic director of medical education, 3 residency program directors, 2 former residency program directors, 1 administrative designated institutional official specializing in resident assessment, and 1 clinical psychologist specializing in patient-physician communication skills among residents. All were selected by NBOME leadership and physician staff based on their extensive experience with resident education and knowledge of competency-based assessments, and all had ongoing experience with assessing or teaching osteopathic medical students. Current COMLEX-USA item writers were excluded from the panel to control for bias when making competency-based classification judgments. Panel members (6 men and 2 women) convened over 2 days to complete the assigned task of classifying examination items by competency. 
Figure.
Sample questions for the Comprehensive Osteopathic Medical Licensing Examination-USA. These sample questions are not current questions from the examination; they were created for the present study to illustrate the challenge of classifying items into particular competencies.
Prior to the classification exercise, panelists received an orientation to COMLEX-USA and reviewed NBOME's FOMC document. Panelists also participated in a variety of exercises illustrating how each of the competencies could be incorporated into formative and summative assessments. These exercises included videotaped demonstrations with large-group discussion of competency-based assessment, as well as role-playing exercises with facilitated small-group discussion on providing competency-specific feedback to residents. Each panelist signed a confidentiality statement to maintain the integrity and security of the items they were to review. 
Measures
Pre-exercise survey—Panelists completed a survey prior to the competency classification exercises and responded to the following questions: 
  • For how many years have you worked with the ACGME and AOA competencies?
  • Do you find the FOMC document informative (yes, no, not sure)?
  • How confident are you in your knowledge of each of the following competency-based activities: defining, teaching, providing feedback, and assessing (not confident, somewhat confident, confident, completely confident)?
  • How confident are you with regard to each of the following: completing, designing, and interpreting COMLEX-USA questions and multiple-choice questions in general (not confident, somewhat confident, confident, completely confident)?
  • Each of the following question types can be used to assess fundamental understanding of competencies: multiple-choice, short-answer, and essay-style examination questions (yes, no, not sure).
  • Existing COMLEX-USA examination items can be classified into a particular competency (yes, no, not sure).
  • Please rank the following competencies according to level of difficulty with regard to defining, teaching, and assessing each of the competencies: patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice (1 being most difficult, 6 being least difficult).
Competency-based classification of items—During the competency classification exercises, panelists were presented with multiple-choice examination items from COMLEX-USA Levels 1, 2-CE, and 3 and were asked to classify each as 1 of the 6 competency categories previously described. 
Focus group discussion—Eleven examination items that received disparate competency classifications from panelists were selected for facilitated small-group discussion and detailed review because panelists found them particularly difficult to classify. Before discussion, panelists completed a worksheet on which they identified what made each examination item difficult to classify, the category or categories to which the item could potentially belong, and the rationale for their classification. After a focus group discussion guided by their worksheet responses, participants were asked to independently reclassify each examination item discussed. 
Postexercise survey—Similar to the pre-exercise survey, panelists completed a postexercise survey and responded to the following questions: 
  • How confident are you in your knowledge of each of the following competency-based activities: defining, teaching, providing feedback, and assessing (not confident, somewhat confident, confident, completely confident)?
  • How confident are you with regard to each of the following: completing, designing, and interpreting COMLEX-USA questions and multiple-choice questions in general (not confident, somewhat confident, confident, completely confident)?
  • Multiple choice examination questions can be used to assess fundamental understanding of each of the core competencies (yes, no, not sure).
  • After today's exercise, do you think existing COMLEX-USA examination questions can be classified into independent competencies (yes, no, not sure)?
  • Please rank the following competencies according to level of difficulty with regard to defining, teaching, and assessing each of the competencies: patient care, medical knowledge, practice-based learning and improvement, interpersonal and communication skills, professionalism, and systems-based practice (1 being most difficult, 6 being least difficult).
Analysis
Descriptive frequency analysis was completed using pre- and postexercise survey data and examination item classification. Univariate statistics, including frequencies and mean response ratings, were computed. 
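The article does not describe how the classification tallies were computed; the sketch below is a minimal, hypothetical illustration of the descriptive frequency analysis, in which each panelist decision is one (level, competency) record and per-level percentages are tallied. The data layout and counts shown are reconstructed from Table 3, not taken from the study's actual dataset.

```python
from collections import Counter

def classification_distribution(judgments):
    """Tally competency classifications per examination level and report
    each as (count, percentage of all judgments made for that level).
    `judgments` is a list of (level, competency) tuples, one per
    panelist decision (8 panelists x items per level)."""
    by_level = {}
    for level, competency in judgments:
        by_level.setdefault(level, Counter())[competency] += 1
    return {
        level: {comp: (count, round(100 * count / sum(counts.values())))
                for comp, count in counts.items()}
        for level, counts in by_level.items()
    }

# Hypothetical reconstruction: 8 panelists x 348 Level 1 items = 2784
# judgments; counts echo the Level 1 row of Table 3, with the four
# smaller competencies pooled as "other".
judgments = ([("Level 1", "medical knowledge")] * 1912 +
             [("Level 1", "patient care")] * 785 +
             [("Level 1", "other")] * 87)
dist = classification_distribution(judgments)
print(dist["Level 1"]["medical knowledge"])  # (1912, 69)
```

This reproduces the 69% medical knowledge figure reported for Level 1.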
Results
Survey Results
Panelists reported that they had worked with the ACGME and AOA competencies for 3 to 10 years (mean, 7.4 years), and all reported finding the FOMC document informative. All panelists reported being “confident” or “somewhat confident” on the pre-exercise survey questions regarding their knowledge of competency-based activities (defining, teaching, providing feedback, and assessing) and their ability to complete, design, and interpret multiple-choice questions. Postexercise responses did not change; all panelists again reported being “confident” or “somewhat confident” on the same questions. 
Results of the pre-exercise survey indicated that the majority of panelists believed fundamental understanding of each of the competencies could be assessed through the use of multiple-choice (75% agreed, 25% disagreed), short-answer (50% agreed, 12.5% disagreed, 37.5% “not sure”), and essay-style examination questions (62.5% agreed, 37.5% disagreed). 
Results of the postexercise survey also indicated that the majority of panelists believed multiple-choice questions could be used to assess fundamental understanding of each of the core competencies (63% agreed, 13% disagreed, 25% “not sure”). Approximately one-third of the panelists reported that existing COMLEX-USA items could be classified into independent competencies (38% agreed, 63% disagreed). 
Before the start of the exercise, panelists felt that the medical knowledge competency was the easiest to define, teach, and assess; patient care followed. Panelists gave similar ratings for the remaining competencies (interpersonal and communication skills, systems-based practice, practice-based learning and improvement, and professionalism), assigning them an average response rating of 3, on a 6-point Likert scale, from 1 (most difficult) to 6 (least difficult) (Table 2). 
Table 2.
Panelists' Mean Response Ratings for Defining, Teaching, and Assessing Competencies *

Competency — Pre-exercise / Postexercise
Patient Care
□ Defining: 4 / 4.5
□ Teaching: 4.375 / 5
□ Assessing: 4.375 / 5
Medical Knowledge
□ Defining: 5 / 6
□ Teaching: 4.875 / 5.875
□ Assessing: 5.125 / 5.875
Practice-Based Learning and Improvement
□ Defining: 2.75 / 3.375
□ Teaching: 2.875 / 3
□ Assessing: 2.75 / 3.25
Interpersonal and Communication Skills
□ Defining: 3.75 / 3.375
□ Teaching: 3.625 / 3.25
□ Assessing: 3.625 / 3.125
Professionalism
□ Defining: 3.25 / 1.875
□ Teaching: 2.875 / 1.875
□ Assessing: 2.625 / 1.25
Systems-Based Practice
□ Defining: 2.75 / 2.5
□ Teaching: 3.125 / 2.75
□ Assessing: 2.75 / 2.625
 *The scale ranged from 1 (most difficult) to 6 (least difficult).
Results from the postexercise survey (Table 2) indicate that the majority of panelists viewed professionalism as the most difficult to define, teach, and assess. The percentage of items classified as professionalism was also the smallest of all competency categories. In comparison, medical knowledge was rated the easiest to define, teach, and assess. Classification of examination items into the medical knowledge category was highest in Level 1 (69%), followed by Level 2-CE (43%) and Level 3 (40%). 
Competency Classification
The majority of the sample examination items from Level 1, Level 2-CE, and Level 3 were classified as either patient care or medical knowledge. Together, panelists assigned the patient care or medical knowledge competencies to approximately 85% to 90% of the examination items reviewed. Items classified as medical knowledge appeared less often in Level 2-CE and Level 3 than in Level 1. Competencies identified as more difficult to define, teach, and assess (interpersonal and communication skills, systems-based practice, practice-based learning and improvement, and professionalism) increased in frequency from Level 1 to Level 3. See Table 3 for a complete listing of the classifications, percentages, and distributions. The interrater reliabilities for competency classifications were acceptable for classifying items from Levels 1, 2-CE, and 3 (intraclass correlation coefficients = 0.74, 0.86, and 0.83, respectively). 
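The article reports intraclass correlation coefficients without specifying the formulation used; the sketch below implements one common variant, ICC(2,1) (two-way random-effects, single rater, absolute agreement), under the assumption that the categorical classifications were coded numerically. This is an illustrative reconstruction, not the study's actual computation; for purely nominal ratings, an agreement index such as Fleiss' kappa is often preferred.

```python
def icc_2_1(ratings):
    """ICC(2,1): two-way random-effects, single-rater, absolute agreement.
    `ratings` is a list of rows (one per item), each a list of numeric
    category codes, one per rater."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(map(sum, ratings)) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(row[j] for row in ratings) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in ratings for x in row)
    ms_rows = ss_rows / (n - 1)                       # between-item variance
    ms_cols = ss_cols / (k - 1)                       # between-rater variance
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return ((ms_rows - ms_err) /
            (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n))

# Perfect agreement among 3 hypothetical raters yields an ICC of 1.0;
# any disagreement pushes the coefficient below 1.
print(icc_2_1([[1, 1, 1], [2, 2, 2], [5, 5, 5], [3, 3, 3]]))  # 1.0
```

An ICC near the study's reported 0.74 to 0.86 would indicate that panelists largely agreed on item classifications despite the acknowledged category overlap.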
Table 3.
Number of Competency Classifications Per COMLEX-USA Level


Competency — Distribution, No. (%): Level 1* / Level 2-CE / Level 3
Patient care: 785 (28) / 1264 (45) / 1294 (46)
Medical knowledge: 1912 (69) / 1206 (43) / 1118 (40)
Practice-based learning and improvement: 38 (1) / 123 (5) / 163 (6)
Interpersonal and communication skills: 13 (1) / 24 (1) / 51 (2)
Professionalism: 10 (0) / 27 (1) / 42 (2)
Systems-based practice: 26 (1) / 148 (5) / 124 (4)
Total classifications: 2784 (100) / 2792 (100) / 2792 (100)
 *For Level 1 of the Comprehensive Osteopathic Medical Licensing Examination-USA (COMLEX-USA), 348 examination items were reviewed.
 For COMLEX-USA Level 2-Cognitive Evaluation (CE) and Level 3, 349 examination items were reviewed.
Difficult-to-Classify Examination Items
Eleven examination items were considered “difficult to classify.” The interrater reliability (intraclass correlation coefficient) for these items was 0.79. When asked to independently reassign competency categories for each of these items after the focus group discussion and worksheet exercises, panelists did change their initial classifications. The proportion of panelists who changed classification of these items ranged from 12.5% to 87.5%, with a mean change of 53.4%. 
Approximately 25% of the “difficult to classify” examination items were initially classified as patient care, with a similar percentage for systems-based practice (26%). Fewer panelists believed “difficult to classify” items measured medical knowledge (16%), practice-based learning and improvement (15%), interpersonal and communication skills (9%), and professionalism (9%). After the focus group discussion, patient care classifications dropped slightly (from 25% to 22%), as did classifications of interpersonal and communication skills (from 9% to 3%) and systems-based practice (from 26% to 21%). Item classifications increased after discussion in the competencies of practice-based learning and improvement (from 15% to 24%), medical knowledge (from 16% to 19%), and professionalism (from 9% to 11%). 
During the focus group discussion, panelists were asked to identify the reasons each examination item was difficult to classify. Their responses could be condensed into 4 main categories (based on the frequency and type of response): question wording (25 [37%]), ambiguity allowing the item to fall into more than 1 category (21 [31%]), circumstances of the case presentation in the item stem (13 [20%]), and interpretation of the competency definition as it pertains to the particular item (8 [12%]). 
Comment
Upon initiation of the examination review exercise, panelists described themselves as somewhat confident or confident in their abilities to define, teach, and assess the competencies of residents and osteopathic medical students. On average, panelists agreed that the medical knowledge competency was easiest to define, teach, and assess. In contrast, systems-based practice, practice-based learning and improvement, and professionalism were the most difficult competencies to define, teach, and assess. This finding is consistent with previous research indicating that, except for general medical knowledge, assessment of the remaining competencies has been challenging to quantify in a valid and reliable fashion.6 
Results of the present study indicate that the percentage of examination items classified as medical knowledge decreased throughout the COMLEX-USA series. Consequently, the percentage of examination items related to the other competencies increased. This result is consistent with traditional undergraduate medical education models in which basic sciences and medical knowledge are emphasized in the preclinical years followed by an emphasis on integration of medical knowledge into clinical practice during the last 2 years of training (the clinical clerkship years).7 Therefore, the decrease in the number of examination items classified as general medical knowledge may reflect what is expected of students as they progress in their undergraduate training. 
This decrease in the number of examination items classified as medical knowledge, and the corresponding increase in the other competencies, is also consistent with the current COMLEX-USA blueprint, which calls for a decreasing number of items classified as scientific understanding of mechanisms throughout the series. These items likely correspond to the medical knowledge competency described in the FOMC document. Similarly, the percentage of management examination items in the current blueprint increases throughout the testing sequence; this is likely to correlate with the increasing number of examination items classified as patient care. Future correlation studies could further explore the relationships between examination items classified using competencies and the original blueprint specifications. 
A number of examination items that panelists had difficulty classifying were reviewed in focus group discussion. Panelists changed their competency designations on average 53% of the time after discussion. This rate of change reflects the challenge of classifying existing examination items into 1 discrete competency. In addition, a majority of panelists (63%) in the postexercise survey reported that existing COMLEX-USA examination items could not be classified into independent competencies. Consider, for example, question 3 in the Figure. In this example, panelists may have classified the examination item as systems-based practice because it addresses involvement by different agencies, professionalism because it addresses legislation, or even patient care because it focuses on how legislation is applied to the care of the patient. For each of these “difficult to classify” questions, panelists reported that classification was difficult due to the questions' stem construction, wording, or content overlap within each of the competencies. This finding is consistent with previous studies demonstrating that current measurement tools cannot assess the competencies independently from one another.6 In addition, other studies using factor analysis demonstrate that competencies overlap substantially and perhaps fewer than 6 measurable constructs exist.8,9 Because certain medical topics cross various competency categories, and higher-order questions may necessitate integrating multiple competencies, classifying a complex examination item into 1 discrete competency may be impractical. 
Acknowledging that physician skill and knowledge cannot be assessed by multiple-choice examination alone, several competency-based assessment tools have been used in medical education environments: faculty rating forms, behavior checklists, clinical skills examinations, standardized patient simulations, 360° evaluations, portfolios, record review, chart-stimulated recall, and more.6,10-12 Results of the present study indicate that, to some extent, knowledge of each competency can also be measured using multiple-choice examination items. However, examination items of this nature should not replace performance evaluations of clinical competence. Ideally, multiple assessment modalities, administered at different stages of training and involving a range of tasks, should be used to evaluate physician performance.10 
Limitations
The present study has some limitations. First, although all panelists had experience with training and assessing medical students and residents, they predominantly demonstrated expertise in applying the clinical competencies in a graduate medical education setting. This selection bias may have confounded study results, and future study could include representatives from osteopathic undergraduate, graduate, and continuing medical education. Second, for the 8368 judgments made by the 8 panelists, interrater reliability was quite high, but reliability could be strengthened further by repeating the exercise with a second group of medical educators. Third, by integrating OPP and OMT into the original competencies, only 6 competencies were used for classification instead of the 7 competencies defined by the AOA. Fourth, previously written examination items, designed specifically to meet current COMLEX-USA blueprint specifications, were reviewed for the purposes of this study. Imposing competency-based classifications onto items that were created using an entirely different blueprint may not only be impractical, but may also explain the difficulty in identifying one discrete competency for each item. Future study could involve classification of items that are constructed specifically for a particular competency. 
Although the present study supports the claim that knowledge of a variety of competencies can be assessed using multiple-choice questions, it is important to remember that COMLEX-USA Level 2-Performance Evaluation (PE) augments COMLEX-USA Level 2-CE by providing an assessment of the fundamental clinical skills of osteopathic medical students preparing to enter graduate medical education.13,14 The clinical skills examination measures patient-physician communication, interpersonal skills, professionalism, medical history-taking and physical examination skills, integration of osteopathic principles into practice and use of OMT, written communication skills and ability to synthesize information, and the ability to develop a differential diagnosis and formulate a diagnostic and treatment plan. Results of the present study suggest that the COMLEX-USA cognitive examinations can be used to some extent to assess knowledge of each of the competencies; COMLEX-USA Level 2-PE can be used to assess performance of each of the competencies. 
Conclusion
The present study confirms that, to some extent, knowledge of each of the competencies is assessed via the current COMLEX-USA format. Although medical knowledge is assessed throughout the entire examination series, the number of examination items classified as such decreased across the series; consequently, the number of examination items addressing the remaining fundamental osteopathic medical competencies increased. Future study could use factor or item analysis to investigate relationships between items or between classifications. Additional investigation is also required to explore further integration of the knowledge and application of medical competencies into current COMLEX-USA examinations, as well as future assessment tools. 
 Financial Disclosures: None reported.
 
We wish to acknowledge the contributions of our expert panelists for dedicating their time to the project. For their assistance in preparing and collecting data, we also wish to thank the following NBOME staff: Linjun Shen, PhD, MPH; Cassie Dyer, MA; and Mia Solomon, PhD, as well as Kristie Lang for her comprehensive review and editing of the manuscript. 
1. American Osteopathic Association. Core Competency Compliance Program (CCCP) Part III. Chicago, IL: American Osteopathic Association; March 2004. http://www.osteopathic.org/inside-aoa/accreditation/postdoctoral-training-approval/Documents/core-competency-compliance-program-part-3.pdf. Accessed May 19, 2011.
2. Common program requirements: general competencies. Accreditation Council for Graduate Medical Education Web site. http://www.acgme.org/Outcome/comp/GeneralCompetenciesStandards21307.pdf. Effective February 13, 2007. Accessed May 19, 2011.
3. National Board of Osteopathic Medical Examiners. Fundamental Osteopathic Medical Competencies: Guidelines for Osteopathic Medical Licensure and the Practice of Osteopathic Medicine. Conshohocken, PA: National Board of Osteopathic Medical Examiners; March 2009. http://www.nbome.org/docs/NBOME%20Fundamental%20Osteopathic%20Medical%20Competencies.pdf. Accessed May 19, 2011.
4. COMLEX computer based testing. National Board of Osteopathic Medical Examiners Web site. http://www.nbome.org/comlex-cbt.asp?m=can. Accessed May 19, 2011.
5. DeChamplain AF. A primer on classical test theory and item response theory for assessments in medical education. Med Educ. 2010;44(1):109-117.
6. Lurie SJ, Mooney CJ, Lyness JM. Measurement of the general competencies of the Accreditation Council for Graduate Medical Education: a systematic review. Acad Med. 2009;84(3):301-309.
7. Chumley H, Olney C, Usatine R, Dobbie A. A short transitional course can help medical students prepare for clinical learning. Fam Med. 2005;37(7):496-501.
8. Silber CG, Nasca TJ, Paskin DL, Eiger G, Robeson M, Veloski JJ. Do global rating forms enable program directors to assess the ACGME competencies? Acad Med. 2004;79(6):549-556.
9. Brasel KJ, Bragg D, Simpson DE, Weigelt JA. Meeting the Accreditation Council for Graduate Medical Education competencies using established residency training program assessment tools. Am J Surg. 2004;188(1):9-12.
10. Swing SR. Assessing the ACGME general competencies: general considerations and assessment methods. Acad Emerg Med. 2002;9(11):1278-1288.
11. ACGME competencies: suggested best methods for evaluation. Accreditation Council for Graduate Medical Education Web site. http://www.acgme.org/Outcome/assess/ToolTable.pdf. Effective September 2000. Accessed May 19, 2011.
12. Swing SR, Clyman SC, Holmboe ES, Williams RG. Advancing resident assessment in graduate medical education. J Grad Med Educ. 2009;1(2):278-286.
13. National Board of Osteopathic Medical Examiners. 2010-2011 orientation guide COMLEX-USA Level 2-PE. http://www.nbome.org/docs/PEOrientationGuide.pdf. Accessed August 12, 2010.
14. Langenau E, Dyer C, Roberts WL, Wilson CD, Gimpel JR. Five-year summary of COMLEX-USA Level 2-PE examinee performance and survey data. J Am Osteopath Assoc. 2010;110(3):114-125.
Figure.
Sample questions for the Comprehensive Osteopathic Medical Licensing Examination-USA. These sample questions are not current questions from the examination; they were created for the present study to illustrate the challenge of classifying items into particular competencies.
Table 1.
Proportion of Questions by Topic and Level According to the COMLEX-USA Cognitive Examination Blueprint

Dimension 1: Patient Presentation* (Levels 1, 2-CE, and 3), Proportion, %
□ Population health concepts and patients with presentations related to health promotion, chronic disease management, and human development: 8-16
□ Patients with presentations related to digestion and metabolism: 4-10
□ Patients with presentations related to cognition, behavior, sensory and central nervous systems, substance abuse, and visceral and sensory pain: 28-38
□ Patients with presentations related to the musculoskeletal system, including somatic pain: 6-12
□ Patients with presentations related to the genitourinary system and human sexuality: 3-8
□ Patients with presentations related to circulation and the respiratory system: 8-16
□ Patients with presentations related to thermoregulation: 2-6
□ Patients with presentations related to trauma, masses, edema, discharge, and the skin, hair, and nails: 8-16
□ Patients with presentations related to pregnancy, the peripartum, and the neonatal period: 3-8

Dimension 2: Physician Tasks*, Proportion, % (Level 1 / Level 2-CE / Level 3)
□ Health promotion and disease prevention: 1-5 / 15-20 / 15-20
□ History and physical examination: 5-15 / 30-40 / 10-20
□ Diagnostic technologies: 1-5 / 10-20 / 15-25
□ Management: 2-7 / 10-20 / 25-40
□ Scientific understanding of health and disease mechanisms: 70-85 / 5-15 / 5-10
□ Healthcare delivery issues: 1-3 / 5-10 / 5-10

Abbreviation: CE, cognitive evaluation.
Source: Copyright 2011 by the National Board of Osteopathic Medical Examiners. Reprinted with permission. All rights reserved.
*Comprehensive Osteopathic Medical Licensing Examination-USA (COMLEX-USA) computer-based testing.
Table 2.
Panelists' Mean Response Ratings for Defining, Teaching, and Assessing Competencies*

Competency (Pre-exercise / Postexercise)

Patient Care
□ Defining: 4 / 4.5
□ Teaching: 4.375 / 5
□ Assessing: 4.375 / 5

Medical Knowledge
□ Defining: 5 / 6
□ Teaching: 4.875 / 5.875
□ Assessing: 5.125 / 5.875

Practice-Based Learning and Improvement
□ Defining: 2.75 / 3.375
□ Teaching: 2.875 / 3
□ Assessing: 2.75 / 3.25

Interpersonal and Communication Skills
□ Defining: 3.75 / 3.375
□ Teaching: 3.625 / 3.25
□ Assessing: 3.625 / 3.125

Professionalism
□ Defining: 3.25 / 1.875
□ Teaching: 2.875 / 1.875
□ Assessing: 2.625 / 1.25

Systems-Based Practice
□ Defining: 2.75 / 2.5
□ Teaching: 3.125 / 2.75
□ Assessing: 2.75 / 2.625

*The scale ranged from 1 (most difficult) to 6 (least difficult).
Table 3.
Number of Competency Classifications Per COMLEX-USA Level

Competency, Distribution, No. (%) (Level 1* / Level 2-CE† / Level 3†)
Patient care: 785 (28) / 1264 (45) / 1294 (46)
Medical knowledge: 1912 (69) / 1206 (43) / 1118 (40)
Practice-based learning and improvement: 38 (1) / 123 (5) / 163 (6)
Interpersonal and communication skills: 13 (1) / 24 (1) / 51 (2)
Professionalism: 10 (0) / 27 (1) / 42 (2)
Systems-based practice: 26 (1) / 148 (5) / 124 (4)
Total classifications: 2784 (100) / 2792 (100) / 2792 (100)

*For Level 1 of the Comprehensive Osteopathic Medical Licensing Examination-USA (COMLEX-USA), 348 examination items were reviewed.
†For COMLEX-USA Level 2-Cognitive Evaluation (CE) and Level 3, 349 examination items were reviewed.
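As an arithmetic cross-check on the Level 1 column of Table 3: 348 items judged by 8 panelists yields 2784 total classifications, and each competency's percentage is its count over that total. A minimal sketch (the dictionary simply restates the published Level 1 counts; rounding at the 0-1% margins may differ slightly from the printed percentages):

```python
# Level 1 classification counts from Table 3 (each of 348 items judged by 8 panelists)
level1 = {
    "Patient care": 785,
    "Medical knowledge": 1912,
    "Practice-based learning and improvement": 38,
    "Interpersonal and communication skills": 13,
    "Professionalism": 10,
    "Systems-based practice": 26,
}

total = sum(level1.values())  # 2784 = 348 items x 8 panelists
shares = {k: round(100 * v / total) for k, v in level1.items()}
```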