Original Contribution  |   November 2012
Bibliometric Measures and National Institutes of Health Funding at Colleges of Osteopathic Medicine, 2006-2010
Author Notes
  • From the departments of physiology (Dr Suminski), anatomy (Dr May), and bioethics (Dr Wasserman) at Kansas City University of Medicine and Biosciences in Missouri; from the State University of New York at Buffalo (Mr Hendrix); and from the A.T. Still University-Kirksville College of Osteopathic Medicine in Missouri (Dr Guillory) 
  • Address correspondence to Richard R. Suminski, PhD, MPH, Kansas City University of Medicine and Biosciences, Department of Physiology, 454 Strickland Education Pavilion, Kansas City, MO 64106-1453. E-mail: rsuminski@kcumb.edu  
The Journal of the American Osteopathic Association, November 2012, Vol. 112, 716-724. doi:10.7556/jaoa.2012.112.11.716
Abstract

Context: During the past 20 years, colleges of osteopathic medicine (COMs) have made several advances in research that have substantially improved the osteopathic medical profession and the health of the US population. Furthering the understanding of research at COMs, particularly the factors influencing the attainment of extramural funds, is highly warranted and coincides with the missions of most COMs and national osteopathic organizations.

Objectives: To describe bibliometric measures (numbers of peer-reviewed publications [ie, published articles] and citations of these publications, impact indices) at COMs from 2006 through 2010 and to examine statistical associations between these measures and the amount of National Institutes of Health (NIH) research funds awarded to COMs in 2006 and 2010.

Methods: A customized, systematic search of the Web of Science database was used to obtain bibliometric measures for 28 COMs. For the analyses, the bibliometric measures were summed or averaged over a 5-year period (2006 through 2010). The NIH database was used to obtain the amount of NIH funds for research grants and contracts received by the 28 COMs. Bivariate and multivariate statistical procedures were used to explore relationships between bibliometric measures and NIH funding amounts.

Results: The COMs with 2010 NIH funding, compared with COMs without NIH funding, had greater numbers of publications and citations and higher yearly average impact indices. Funding from the NIH in 2006 and 2010 was positively and significantly correlated with the number of publications, the number of citations, citations per publication, and the impact index. The regression analysis indicated that 63.2% and 38.5% of the total variance in 2010 NIH funding explained by the model (adjusted R2=0.74) were accounted for by 2006 NIH funding and the combined bibliometric (ie, publications plus citations), respectively.

Conclusion: Greater scholarly output leads to the procurement of more NIH funds for research at COMs.

Colleges of osteopathic medicine (COMs) have made great strides in research during the past 20 years.1,2 The total amount of research funding secured by COMs in 2004 was approximately $101.7 million, up from $16.6 million in 1989.1,2 (When adjusted for inflation in 2010 dollars, these totals are $117.4 million and $29.2 million, respectively.) The American Osteopathic Association (AOA), recognizing that support and encouragement of quality scientific research is critical for the osteopathic medical profession, plays a vital role at the start of the funding process. To stimulate scholarly activity, the AOA Council on Research (formerly the Bureau of Research) provides modest support—often termed “seed grants”—to investigators and trainees at COMs to help them gain visibility in the broader medical community and within federal funding agencies such as the National Institutes of Health (NIH).3 Rose and Prozialeck4 reported that awards made by the AOA Bureau of Research between 1995 and 2001 helped grantees secure an additional $5.7 million in extramural funds. 
The NIH was the primary funder of research at COMs in 2004—as it is for most allopathic medical schools—accounting for $60.4 million (59%) of the $101.7 million total. This amount is more than 7 times that of the next identifiable contributor (“Other Federal”), which provided $8.5 million.1,5 On a yearly basis, the NIH allocates billions of dollars for health research and typically directs nearly half of its funds to research at US medical schools.4 In 2009, however, of the approximately $11 billion given to medical schools, only approximately $135 million (1.2%) was granted to COMs4; COMs accounted for approximately 20% of all medical schools at that time. Given the importance of NIH funding to research at COMs and the disproportionately small amount awarded to COMs, an investigation of the factors related to the securing of NIH funding by COMs is highly warranted. 
Previous assessments of research productivity at COMs1,2 have been limited in number and scope; they have focused primarily on describing the characteristics of funded research projects, such as the amount and number of grants. To our knowledge, no systematically conducted scientific reports are available regarding correlates or predictors of research funding at COMs, although Prozialeck7 offered anecdotal and speculative suggestions about factors that influence research productivity by COM faculty. Furthermore, some researchers8,9 have underscored the weaknesses (eg, inadequate use of standard statistical methods) of U.S. News & World Report, the most commonly cited and referenced ranking of the nation's colleges, which may lead to biased outcomes. For example, every COM in the research medical school rankings10 is listed as “Rank Not Published” or “Unranked,” either because it falls in the bottom quarter of the research medical schools or because it did not supply the Report with enough key statistical data to be numerically ranked.11 
Bibliometrics is an array of methods that examines the influence of research areas, researchers, or research products (eg, journal articles) in a given field of study.12-14 These methods may, for example, comprise counting total publications (ie, published articles) or counting the average number of citations per publication. Bibliometric measures are used to calculate journal impact factors, develop benchmarks, and coordinate research activities.12,15,16 Although the object of some criticism (eg, self-citation bias), these measures nevertheless accurately depict scholarly communication patterns, correlate with peer-review ratings, predict emerging fields of research, show disciplinary influences, and map various types of collaboration.17-20 Germane to a primary motivation for the current study, bibliometric measures appear to be important for informing funding decisions. A majority (64%)21 of NIH grants result in an article in a mainstream scientific journal, and the number of articles produced is directly related to the amount of NIH awards made to medical schools. Other studies5,22,23 also have shown strong linear relationships between bibliometric measures and funding for research. An evaluation of bibliometric measures may provide insights into NIH funding at COMs, especially if several measures derived from sound methodologic approaches are integrated.5 
The objectives of the present study were to determine bibliometric measures at COMs and to examine associations between measures generated between 2006 and 2010 and the amount of research funding COMs received from the NIH in the fiscal years 2006 and 2010. The current study is important because it focuses on bibliometric measures from COMs, which, to our knowledge, have not been previously published. We hypothesized that these bibliometric data could establish benchmarks for more productive research at COMs and thereby help COMs improve their ability to procure NIH funds. 
Methods
Colleges of Osteopathic Medicine
All COMs that had inaugural classes in 2006 or before were included in this study (N=28). Institution establishment dates ranged from 1891 to 2005, and 22 of 28 (79%) were privately financed institutions. In 2010, the average enrollment was 450 students and the total number of full-time faculty was 1932.24 
Bibliometric Data Extraction Procedures
On June 3, 2011, we used the Web of Science database to obtain information about bibliometric measures at COMs. The database indexes more than 12,000 of the world's top-cited journals in all disciplines and provides details regarding their citation data.25 We tested several search strategies for each institution, querying the address field with combinations of the schools' proper names, common name abbreviations, parent institution names, departmental affiliations, zip codes, and city names to retrieve all relevant data. After searching for research at an institution, we refined the results by 2 criteria available within the “Refine Results” feature. First, we limited the results to 2 document types—articles and reviews—thus excluding such items as letters, proceedings papers, meeting abstracts, editorial materials, notes, news items, reprints, and corrections. After the document type refinement, we limited the results to 2006 through 2010, the publication years within the scope of this study. The use of this 5-year period diminished the possibility of a single year skewing the results, and it allowed up to a 5-year lag between publications and funding.22 
We used the Web of Science database to generate a citation report of 3 bibliometric measures for each year from 2006 through 2010 at each COM: total number of peer-reviewed, published articles (ie, publications); total number of citations to publications (ie, citations); and number of publications with no citations. From these measures we calculated institutional percentages of articles with no citations, the number of citations per publication, and the impact index. The impact index has been described in detail elsewhere.5,26 It characterizes, by institution, the number of publications with a high number of citations relative to all publications at the same institution. The index was computed as follows:  
impact index = h ÷ p^m

In this equation, h is the h-index, ie, the number of publications (h) from an institution that have each been cited at least h times; p is the total number of publications from the institution; and m is the power law (master curve) exponent. Molinari and Molinari26 set m at 0.4, deriving that value from the universal growth rate of citations over time for large numbers of publications. 
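As an illustration of this calculation, the following minimal sketch (not the authors' code; the citation counts shown are hypothetical) computes the h-index and the impact index for a single institution:

```python
def h_index(citation_counts):
    """Return h, the largest number such that h publications each have >= h citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def impact_index(citation_counts, m=0.4):
    """impact index = h / p**m, with p = total publications and m = 0.4
    (the power law exponent used by Molinari and Molinari)."""
    p = len(citation_counts)
    if p == 0:
        return 0.0
    return h_index(citation_counts) / (p ** m)

# Hypothetical institution with 6 publications (one citation count per publication)
citations = [12, 9, 5, 3, 1, 0]
print(h_index(citations))                  # 3
print(round(impact_index(citations), 2))   # 3 / 6**0.4, approximately 1.47
```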
Numbers of publications, citations, and publications with 0 citations were summated to yield 5-year totals (2006 through 2010) for these variables. The average value over the same 5-year period was used for citations per publication, impact index, and institutional percentages of publications with 0 citations. We obtained the information about NIH funding of COMs from a database maintained by the US Department of Health and Human Services (http://report.nih.gov/). 
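For illustration only, the sketch below shows this aggregation step under the assumption that the yearly measures were assembled into a pandas DataFrame with one row per COM per year; the column names and values are hypothetical, not study data:

```python
import pandas as pd

# Hypothetical yearly bibliometric measures for two COMs, 2006-2010
yearly = pd.DataFrame({
    "com":           ["COM A"] * 5 + ["COM B"] * 5,
    "year":          list(range(2006, 2011)) * 2,
    "publications":  [10, 12, 15, 14, 16, 3, 4, 5, 4, 6],
    "citations":     [40, 55, 70, 66, 80, 9, 12, 15, 11, 20],
    "zero_cited":    [2, 3, 4, 3, 4, 1, 1, 2, 1, 2],
    "cites_per_pub": [4.0, 4.6, 4.7, 4.7, 5.0, 3.0, 3.0, 3.0, 2.8, 3.3],
    "impact_index":  [1.6, 1.7, 1.8, 1.8, 1.9, 1.2, 1.3, 1.3, 1.2, 1.4],
})

# Counts are summed over the 5 years; rate-type measures are averaged
five_year = yearly.groupby("com").agg(
    publications=("publications", "sum"),
    citations=("citations", "sum"),
    zero_cited=("zero_cited", "sum"),
    cites_per_pub=("cites_per_pub", "mean"),
    impact_index=("impact_index", "mean"),
)
five_year["pct_zero_cited"] = 100 * five_year["zero_cited"] / five_year["publications"]
print(five_year)
```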
Statistical Analysis
Descriptive statistics are presented as mean (standard deviation) and, where applicable, sums. All 2006 dollar amounts were adjusted for inflation (plus 8.16%) to reflect 2010 dollar values. Prior to data analyses, variable distributions were examined for normality. Variables with deviations from normality (eg, skewness statistics ≥2.0) were log or square root transformed for use in analyses requiring normality of data. In all cases, the transformation procedure reduced skewness statistics to values of less than 2.0. Multivariate analysis of variance was conducted to compare bibliometric measures between COMs with and COMs without NIH funding in 2010. Bivariate relationships among variables were examined by means of the Pearson product moment correlation. A multiple linear regression model was constructed to determine whether bibliometric measures resulting from 5 years of scholarly activity (2006 through 2010) at COMs predict 2010 NIH funding. The independent variables were school type (private=0, public=1), 2006 NIH funding, and a combined bibliometric. The combined bibliometric was calculated as the number of publications from 2006 through 2010 plus the citations to these publications (ie, publications plus citations). This combined variable was created because publications and citations could not be entered as separate variables in the same regression model, given their high correlation (r=0.99) and the resulting multicollinearity.27 All statistical analyses were conducted using SPSS statistical software (version 17.1; SPSS Inc, Chicago, Illinois) with α set a priori at .05 and with COM (n=28) as the unit of analysis. 
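Although the analyses were conducted in SPSS, the regression step can be outlined as follows; this is a hedged sketch in Python with statsmodels (not the authors' code), using a handful of invented values in place of the full data set:

```python
import pandas as pd
import statsmodels.api as sm

# Illustrative values only (loosely patterned on Table 5); the 2006 funding
# figures and school types here are invented for the example.
coms = pd.DataFrame({
    "nih_2010":     [5.0, 1.9, 1.4, 0.5, 0.4, 0.2, 0.0, 0.0, 0.0, 0.0],
    "nih_2006":     [4.2, 1.6, 1.1, 0.4, 0.3, 0.1, 0.1, 0.0, 0.0, 0.0],
    "publications": [147, 136, 219, 57, 94, 68, 108, 45, 22, 4],
    "citations":    [1007, 729, 1213, 241, 583, 470, 412, 231, 105, 23],
    "public":       [1, 0, 1, 0, 0, 0, 0, 0, 1, 0],   # 0 = private, 1 = public
})
coms["combined"] = coms["publications"] + coms["citations"]

# Ordinary least squares: 2010 funding regressed on 2006 funding, the combined
# bibliometric, and school type. Standardizing the columns (z-scores) before
# fitting would yield standardized beta coefficients comparable to Table 4.
X = sm.add_constant(coms[["nih_2006", "combined", "public"]])
fit = sm.OLS(coms["nih_2010"], X).fit()
print(fit.params)          # intercept and slope estimates
print(fit.rsquared_adj)    # adjusted R^2
```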
Results
Table 1 lists the full and the abbreviated names for all COMs included in this study. Table 2 contains descriptive statistics for the bibliometric measures summed or averaged for the 5-year period studied and the amount of NIH funding in 2006 and 2010 for the 28 COMs. Over the course of 5 years, COMs with NIH funding in 2010 averaged 60 more publications (P<.005) and 395 more citations (P<.001) than COMs without NIH funding; the number of citations per publication, however, did not differ significantly. Funded COMs, compared with nonfunded COMs, produced more publications that were not cited (P<.01); however, the funded COMs and the nonfunded COMs did so at similar rates, with approximately one-fourth of the publications not being cited regardless of whether they came from funded or nonfunded COMs. The yearly average impact index was 43% higher at COMs with 2010 NIH funding compared with COMs without 2010 NIH funding (P<.01). 
Table 1.
Colleges of Osteopathic Medicine in 2006
Abbreviation Name
ATSU-KCOM A.T. Still University-Kirksville College of Osteopathic Medicine
ATSU-SOMA A.T. Still University-School of Osteopathic Medicine in Arizona
DMU-COM Des Moines University College of Osteopathic Medicine
KCUMB-COM Kansas City University of Medicine and Biosciences' College of Osteopathic Medicine
LECOM Lake Erie College of Osteopathic Medicine
LECOM-Bradenton Lake Erie College of Osteopathic Medicine-Bradenton
LMU-DCOM Lincoln Memorial University-DeBusk College of Osteopathic Medicine
MSUCOM Michigan State University College of Osteopathic Medicine
MWU/AZCOM Midwestern University/Arizona College of Osteopathic Medicine
MWU/CCOM Midwestern University/Chicago College of Osteopathic Medicine
NSU-COM Nova Southeastern University College of Osteopathic Medicine
NYCOM New York College of Osteopathic Medicine of New York Institute of Technology
OSU-COM Oklahoma State University Center for Health Sciences College of Osteopathic Medicine
OU-HCOM Ohio University Heritage College of Osteopathic Medicine
PCOM Philadelphia College of Osteopathic Medicine
GA-PCOM Georgia Campus-Philadelphia College of Osteopathic Medicine
PNWU-COM Pacific Northwest University of Health Sciences, College of Osteopathic Medicine
RVUCOM Rocky Vista University College of Osteopathic Medicine
TouroCOM Touro College of Osteopathic Medicine in New York City
TUCOM Touro University California, College of Osteopathic Medicine
TUNCOM Touro University Nevada College of Osteopathic Medicine
UMDNJ-SOM University of Medicine and Dentistry of New Jersey-School of Osteopathic Medicine
UNECOM University of New England College of Osteopathic Medicine
UNTHSC/TCOM University of North Texas Health Science Center Texas College of Osteopathic Medicine
UP-KYCOM University of Pikeville-Kentucky College of Osteopathic Medicine
VCOM-Virginia Edward Via College of Osteopathic Medicine–Virginia Campus
WesternU/COMP Western University of Health Sciences College of Osteopathic Medicine of the Pacific
WVSOM West Virginia School of Osteopathic Medicine
Table 2.
Bibliometric Measures and NIH Funding at COMs From 2006 Through 2010
| Measure | Overall (N=28), Mean (SD) | Overall, Sum | With 2010 NIH Funding (n=12), Mean (SD) | With 2010 NIH Funding, Sum | Without 2010 NIH Funding (n=16), Mean (SD) | Without 2010 NIH Funding, Sum |
|---|---|---|---|---|---|---|
| Publications | 65.8 (58.4) | 1843 | 100.5 (52.6)a | 1206 | 39.8 (49.3) | 637 |
| Citations | 358.6 (327.9) | 10,041 | 584.2 (300.4)a | 7010 | 189.4 (237.0) | 3031 |
| Citations per publication | 5.1 (2.1) | NA | 6.0 (1.7) | NA | 4.5 (2.1) | NA |
| Impact index | 1.7 (0.5) | NA | 2.0 (0.3)a | NA | 1.4 (0.5) | NA |
| Publications with 0 citations | 16.3 (14.0) | 455 | 23.7 (11.1)b | 284 | 10.7 (13.6) | 171 |
| Publications with 0 citations, % | 27.4 (17.3) | NA | 24.2 (6.1) | NA | 29.8 (22.3) | NA |
| 2010 NIH fundingc | 0.469 (1.023) | 13.139 | 1.095 (1.349)d | 13.139 | 0 | 0 |
| 2006 NIH fundingc,e | 0.435 (0.970) | 11.318 | 0.885 (1.300) | 10.618 | 0.050 (0.187) | 0.700 |
  a P<.005.
  b P<.05.
  c In millions of US dollars.
  d P<.001.
  e Adjusted for inflation (+8.16%).
  Abbreviations: COM, college of osteopathic medicine; NA, not applicable; NIH, National Institutes of Health; SD, standard deviation.
Bivariate correlations among study variables are described by the Pearson product moment correlation coefficients given in Table 3. As expected, several statistically significant correlations were observed among the bibliometric measures. For example, the numbers of publications and citations were almost perfectly correlated (r=0.99; P<.001), and both were directly and statistically significantly related to the impact index and articles with 0 citations. When citations and publications with 0 citations were made relative to the number of publications, their correlations with publications became statistically insignificant. The impact index was the only indicator significantly correlated with all other bibliometric measures. Impact index values were directly related to the number of publications, citations, citations per publication, and publications with 0 citations but inversely associated with the percentage of publications with 0 citations. All bibliometric measures except the percentage of publications with 0 citations were positively and statistically significantly related to 2006 and 2010 NIH funding. Higher levels of NIH funding also were related to status as a public COM. 
Table 3.
Bivariate Relationships Between COM Type, Bibliometric Measures, and NIH Funding From 2006 Through 2010 (N=28)
| Variable | Typea | Publications | Citations | Citations per Publication | Impact Index | Publications With 0 Citations | Publications With 0 Citations, % | NIH Funding, 2006 |
|---|---|---|---|---|---|---|---|---|
| Publications | 0.28 | | | | | | | |
| Citations | 0.34 | 0.99b | | | | | | |
| Citations per publicationc | 0.10 | 0.02 | 0.2 | | | | | |
| Impact indexc | 0.17 | 0.50d | 0.58e | 0.50e | | | | |
| Publications with 0 citations | 0.19 | 0.97b | 0.91b | 0.01 | 0.43d | | | |
| Publications with 0 citations, %c | −0.17 | −0.11 | −0.12 | −0.49e | −0.53e | −0.30 | | |
| NIH funding, 2006 | 0.68b | 0.45d | 0.59e | 0.37d | 0.46d | 0.39d | −0.17 | |
| NIH funding, 2010 | 0.49e | 0.58e | 0.72b | 0.38d | 0.51e | 0.52e | −0.14 | 0.88b |
  a 0=private, 1=public.
  b P<.001.
  c Yearly average from 2006 through 2010.
  d P<.05.
  e P<.01.
  Abbreviations: COM, college of osteopathic medicine; NIH, National Institutes of Health.
Results of the stepwise multiple regression analysis are presented in Table 4. Higher amounts of NIH funding in 2006 and a higher combined bibliometric (ie, publications plus citations) were statistically significantly associated with higher levels of 2010 NIH funding (P<.005). Together, these 2 independent variables explained 73.6% of the total variance in 2010 NIH funding (adjusted R2=0.736). Funding from the NIH in 2006 and the combined bibliometric accounted for 63.2% and 38.5% of the total variance explained by the model, respectively. School type had a statistically insignificant association with 2010 NIH funding (P=.65). 
Table 4.
Stepwise Multiple Regression Analysis Showing Predictors of NIH Funding, 2010a
Variable Standardized β Coefficients t Values
Intercept −0.02
NIH funding, 2006 ($, million) .632 4.33b
Combined bibliometric (publications plus citations, 2006-2010) .385 3.20c
School type (0=private, 1=public) −.031 −0.23
  a Parameters for full model: F3,25=22.9, P<.001, adjusted R2=0.724. Parameters for reduced model with only 2006 NIH funding and combined bibliometric: F2,25=35.8, P<.001, adjusted R2=0.736.
  b P<.001.
  c P<.005.
  Abbreviation: NIH, National Institutes of Health.
The bibliometric data for the 5 years examined in the present study are provided in Table 5. The 28 COMs are sorted in descending order by 2010 NIH funding amounts. Approximately $10.5 million of $13.1 million (79%) of the total 2010 NIH funds were accounted for by the first 5 COMs. Likewise, these 5 COMs—despite representing only 18% of COMs—were responsible for disproportionately larger percentages of all publications (699 of 1843 [37.9%]) and citations (4260 of 10,041 [42.4%]). Of the COMs with 2010 NIH funding, 11 of 12 (91.7%) had more than 50 publications between 2006 and 2010. In contrast, only 4 of the 16 COMs (25%) without 2010 NIH funding had more than 50 publications during this period. 
Table 5.
Bibliometric Measures and National Institutes of Health Funding for COMs, 2006-2010 (N=28)
| COM | NIH Funding, 2010a | Publications | Citations | Citations per Publication | Impact Index | Publications With 0 Citations, No. | Publications With 0 Citations, % |
|---|---|---|---|---|---|---|---|
| MSUCOM | 4.995 | 147 | 1007 | 6.9 | 2.3 | 34 | 23.1 |
| UMDNJ-SOM | 1.940 | 136 | 729 | 5.4 | 1.8 | 32 | 23.5 |
| OU-HCOM | 1.448 | 219 | 1213 | 5.5 | 1.9 | 46 | 21.0 |
| NSU-COM | 1.175 | 63 | 664 | 10.5 | 2.5 | 16 | 25.4 |
| VCOM-Virginia | 0.935 | 134 | 647 | 4.8 | 1.7 | 28 | 20.9 |
| UNECOM | 0.780 | 52 | 281 | 5.4 | 2.1 | 11 | 21.2 |
| ATSU-KCOM | 0.528 | 57 | 241 | 4.2 | 1.8 | 22 | 38.6 |
| UNTHSC/TCOM | 0.473 | 42 | 287 | 6.8 | 2.2 | 7 | 16.7 |
| PCOM | 0.396 | 94 | 583 | 6.2 | 2.1 | 30 | 31.9 |
| MWU/CCOM | 0.219 | 68 | 470 | 6.9 | 2.2 | 14 | 20.6 |
| WesternU/COMP | 0.182 | 68 | 308 | 4.5 | 1.7 | 19 | 27.9 |
| NYCOM | 0.067 | 126 | 580 | 4.6 | 1.9 | 25 | 19.8 |
| MWU/AZCOM | 0 | 177 | 921 | 5.2 | 1.9 | 51 | 28.8 |
| TUCOM | 0 | 31 | 294 | 9.5 | 2.3 | 9 | 29.0 |
| KCUMB-COM | 0 | 108 | 412 | 3.8 | 1.4 | 22 | 20.4 |
| LECOM | 0 | 79 | 353 | 4.5 | 1.7 | 20 | 25.3 |
| ATSU-SOMA | 0 | 86 | 264 | 3.1 | 1.4 | 28 | 32.6 |
| DMU-COM | 0 | 45 | 231 | 5.1 | 1.8 | 11 | 24.4 |
| TUNCOM | 0 | 33 | 199 | 6.0 | 2.0 | 7 | 21.2 |
| WVSOM | 0 | 22 | 105 | 4.8 | 1.5 | 6 | 27.3 |
| GA-PCOM | 0 | 12 | 85 | 7.1 | 1.1 | 4 | 33.3 |
| OSU-COM | 0 | 15 | 57 | 3.8 | 1.4 | 3 | 20.0 |
| TouroCOM | 0 | 9 | 42 | 5.7 | 1.3 | 4 | 44.4 |
| LECOM-Bradenton | 0 | 9 | 29 | 3.2 | 0.8 | 4 | 44.4 |
| LMU-DCOM | 0 | 4 | 23 | 5.8 | 1.7 | 0 | 0.0 |
| RVUCOM | 0 | 4 | 10 | 2.5 | 1.2 | 1 | 25.0 |
| PNWU-COM | 0 | 2 | 6 | 3.0 | 1.5 | 0 | 0.0 |
| UP-KYCOM | 0 | 1 | 0 | 0.0 | 0.0 | 1 | 100.0 |
  a In millions of US dollars.
  Abbreviations: COM, college of osteopathic medicine; NIH, National Institutes of Health. The full names of the COMs appear in Table 1.
Comment
In this study, we sought to describe bibliometric data for COMs and to explore their relationships with NIH funding. The current study is unique in its focus on COMs and its use of a retrospective, longitudinal study design to determine whether bibliometric measures predict NIH funding. Our primary finding was that previous NIH funding and the publication of peer-reviewed journal articles that are cited enhance a COM's success in obtaining NIH support for research. Other noteworthy results include the statistically significant and positive correlations between 2006 NIH funding and 2006 to 2010 scholarly productivity (eg, number of publications); the relationships among bibliometric measures; and the clear heterogeneity (ie, variance) of NIH funding and bibliometric data across the 28 COMs examined. 
The relationships among bibliometric measures found in this study coincide with those reported by others.5,28,29 For example, the correlation coefficient between publication and citation counts was r=0.99 in the present study, r=0.98 in Hendrix,5 and r=0.89 in van Raan.28 Likewise, uniform findings exist regarding the statistically insignificant association between publication counts and the percentage of publications not cited (correlation coefficient range r=−0.26 to r=0.35).5,28 Druss and Marcus30 indicated that each NIH R01 grant produced on average 7.6 MEDLINE publications, and Rose and Prozialeck4 showed that approximately 65% of projects at COMs funded by the AOA result in publications. The results of the present study confirm these findings and those of other investigators5,22,23,30,31 who have consistently demonstrated that funding is correlated with future scholarship. The current study, however, provides additional insight. 
According to the multiple regression analysis, the additive effects of publications and citations also predict NIH funding amounts. Because citations are a good indicator of publication quality,23 it appears that the best strategy for COMs to garner more NIH funds would be to optimize the balance between the number of publications and the quality of those publications. Results of regression analysis also reveal that a COM might potentially accrue NIH funding based on scholarly output alone. In other words, a track record of obtaining NIH funding (in this study, having obtained NIH funding 5 years previously in 2006 vs 2010) may not be a necessary part of the formula for securing future NIH funding. The finding of a causal relationship between publications plus citations and NIH funding supports the notion that grant application reviews at the NIH are influenced by bibliometric measures. Indeed, reviewers for federal agencies such as the NIH have been found to provide more favorable reviews to senior principal investigators with stronger scholarly records resulting from publishing results of important (ie, high-quality) findings.32 Furthermore, publications and citations provide greater visibility for scholars and attest to their ability to carry a project through to the dissemination phase.23,33 According to Ramsden,34 it is absolutely necessary for academics to have prior publications to be successful in obtaining research grants. 
To our knowledge, the present study is the first to provide a detailed description of bibliometric measures and NIH funding for COMs. The 28 COMs averaged $435,000 and $469,000 in 2006 and 2010 NIH funding, respectively. Twelve of the 28 COMs (42.9%) received NIH funding in 2010. On average, each COM produced a total of 13.2 publications and 71.7 citations per year from 2006 through 2010. In comparison, allopathic medical schools (n=123) were awarded, on average, $78 million each year in NIH funding between 1997 and 2005, with 952 publications and 16,288 citations per year between 1997 and 2007; US dental schools were awarded an average of $4.9 million per year in NIH funds between 2005 and 2009.5,35 These considerable differences are probably the result of several simultaneously active factors. One obvious institutional factor is faculty size. In 2010, COMs averaged 69 full-time faculty compared with an average of 1027 at allopathic medical schools, and this disparity appears to be widening.1,24,36,37 
A comparison of allopathic medical schools with COMs also reveals a much higher average citation rate per article (14 vs 6) and higher impact indices (3.2 vs 2.0) at the allopathic schools.5 These differences are likely less related to the number of faculty and more related to quality issues at the institutional and faculty levels. According to Clearfield et al,1 an important barrier to scholarly activity at COMs is inadequate or suboptimal organizational infrastructure or the lack of human resources to cover academic responsibilities not related to research (eg, administration, classroom instruction, service on university committees). The existence of these barriers would reduce the time available for research.38-40 Ultimately, this could dampen motivation or momentum and foster a negative research culture, resulting in comparatively low levels of scholarship.7,41 Other barriers to scholarly output include the private status of most COMs—which often limits resources and emphasizes training osteopathic physicians—and a long-standing funding bias, whereby COMs do not receive an adequate proportion of the federal monies for infrastructure and research programs.6,42 
A number of actions could be taken to overcome some of the barriers to scholarly productivity at COMs. Investment in research is a proven method to stimulate extramural grant procurement. An investment of $69 million at 1 medical school for basic science research garnered $99.7 million in extramural funding.43 The importance of this investment is accentuated by the considerable proportion of faculty (up to 55%) who conduct basic science research, because basic science faculty tend to be more focused on research compared with other faculty types, especially clinical faculty.44 Not surprisingly, when compared with clinical faculty, basic science faculty produce more publications per year (3.7 vs 2.1), receive higher impact scores for their publications (6.7 vs 4.4), and obtain more funding for research (5-fold higher funding amounts).44 Another approach would be to devote more funds to the development of research initiatives focused on the study of osteopathic manipulative medicine. A prime example of success in this area is The Osteopathic Research Center, which was established with an initial investment of $1.1 million. During the first 4 years of operation, The Osteopathic Research Center was awarded $3.2 million in NIH grants, which it then used to conduct comprehensive studies of osteopathic manipulative medicine, resulting in its faculty and staff generating 35 full-length publications and 30 conference posters or presentations.45 
Scholarly output also can be enhanced through writing interventions. In a review, McGrail et al46 observed that writing interventions—such as finding support groups, attending courses, and working with coaches—stimulate the publication of research findings. Several studies in this review46 reported that in addition to improving the number of publications, writing interventions also positively affected the quality of the publications. Other benefits of writing interventions include the formation of multidisciplinary collaborations, which leads to more authors contributing to a manuscript (ie, more publications overall) and the creation of a more attractive work environment, which is important for recruiting research faculty.46 The former benefit, more authors, could have a substantial impact on manuscript quality and thus funding procurement.21 
Several limitations should be considered when interpreting the results of the current study. First, because of the nature of Web of Science's data set and capabilities, research was excluded if it was linked to institution names that were misspelled or that used unfamiliar variants in the address field. Moreover, citation errors occur in all bibliographic fields in Web of Science, and such errors could not be accounted for. In addition, no fractional or proportional attribution techniques were applied when authors from multiple medical schools contributed to a publication. Because its coverage is selective, Web of Science does not track the citation histories of thousands of journals, proceedings, technical reports, and patents. Additionally, the collected Web of Science data include self-citations, which may skew the resulting data; however, other bibliometric studies47,48 have demonstrated that the inclusion of self-citations has little effect on the overall results of macro-level studies. Moreover, all of these potential errors are likely to be distributed randomly across COMs and therefore should not substantially affect the quality of the statistical findings. 
The NIH was the only source of grant-funding data used in this study. Monies from private foundations, nongovernmental organizations, and other government departments were not taken into account, and COMs likely have some base of research funding, such as intramural or foundation grants, that is not associated with the NIH.31 Further, only NIH funding awarded in 2006 was used to represent past NIH funding history. It is possible that considering NIH funding history from different perspectives (eg, the number of continuous years funded, the total number of years funded) could alter the outcomes. 
Because we conducted our analyses at the institutional level, an assortment of bibliometric measures was examined.5,49 Although this approach formed a more comprehensive and accurate picture of research at COMs than previously available, other measures—especially qualitative measures such as opinions or perceptions—might have been included.5 McAllister and Narin31 found a statistically significant relationship between size-dependent measures, such as the number of publications, and faculty perceptions of institutional quality. Further, factors beyond the methodologic scope of this study (eg, the US economy and its impact on NIH budgets and funding decisions) may have mediated the effects of scholarly activity on NIH funding. 
Conclusion
The findings in the current study attest to the need for COMs to improve resources and infrastructure, which are necessary to augment the dissemination of research findings in quality, peer-reviewed journals. This contention aligns with standards ratified by the AOA Commission on Osteopathic College Accreditation, which challenges COMs to contribute more to science and medicine by scaling up their research efforts.50 Improvements in scholarly quality would also address widely held negative opinions of COMs being overly reliant on tuition to finance operating expenses rather than research.51 The osteopathic medical profession, and COMs in particular, could ostensibly lead the way with scientific evidence to establish the safety, efficacy, and effectiveness of spinal manipulation in treating low back pain and other musculoskeletal conditions.51 This would dovetail with the results of the 2007 National Health Interview Survey, which reported that 14.3 million adults use complementary and alternative medicine and 36% of adults had received spinal manipulation within the past year.51 Further, because of the philosophical ideals that form the basis of osteopathic medicine, COMs are in an advantageous position to contribute substantial research-generated knowledge toward preventing chronic diseases, such as childhood obesity.52 
   Financial Disclosures: None reported.
 
References
1. Clearfield MB, Smith-Barbaro P, Guillory VJ, et al. Research funding at colleges of osteopathic medicine: 15 years of growth. J Am Osteopath Assoc. 2007;107(11):469-478. http://www.jaoa.org/content/107/11/469.full. Accessed October 1, 2012.
2. Guillory VJ, Sharp G. Research at US colleges of osteopathic medicine: a decade of growth [published correction in J Am Osteopath Assoc. 2003;103(10):458-459]. J Am Osteopath Assoc. 2003;103(4):176-181. http://www.jaoa.org/content/103/4/176.full.pdf. Accessed October 1, 2012.
3. Research grant and fellowship programs. American Osteopathic Association Web site. http://www.osteopathic.org/inside-aoa/development/quality/research-and-grants/Pages/research-grants-and-fellowships-program.aspx. Accessed October 1, 2012.
4. Rose RC, Prozialeck WC. Productivity outcomes for recent grants and fellowships awarded by the American Osteopathic Association Bureau of Research. J Am Osteopath Assoc. 2003;103(9):435-440. http://www.jaoa.org/content/103/9/435.long. Accessed October 1, 2012.
5. Hendrix D. An analysis of bibliometric indicators, National Institutes of Health funding, and faculty size at Association of American Medical College medical schools, 1997-2007 [published correction in J Med Libr Assoc. 2009;97(2):74]. J Med Libr Assoc. 2008;96(4):324-334. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2568842/pdf/mlab-96-04-324.pdf. Accessed October 1, 2012.
6. Table 2: total NIH awards to each medical school in 2008. Blue Ridge Institute for Medical Research Web site. http://www.brimr.org/NIH_Awards/2008/C/SchoolOfMedicine2008C.xls. Accessed October 1, 2012.
7. Prozialeck WC. Culture drives research funding [letter]. J Am Osteopath Assoc. 2008;108(7):353. http://www.jaoa.org/content/108/7/353.1.full.pdf+html. Accessed October 1, 2012.
8. McGaghie WC, Thompson JA. America's best medical schools: a critique of the U.S. News & World Report rankings. Acad Med. 2001;76(10):985-992.
9. Webster TJ. A principal components analysis of the U.S. News & World Report tier rankings of colleges and universities. Econ Educ Rev. 2001;20(3):235-244. journals.cluteonline.com/index.php/JABR/article/download/2063/2250. Accessed October 1, 2012.
10. Best medical schools: research. U.S. News & World Report Web site. http://grad-schools.usnews.rankingsandreviews.com/best-graduate-schools/top-medical-schools/research-rankings. Accessed October 1, 2012.
11. Morse R, Flanigan S. Methodology: medical school rankings. US News & World Report. March 12, 2012. http://www.usnews.com/education/best-graduate-schools/top-medical-schools/articles/2012/03/12/methodology-medical-school-rankings. Accessed October 1, 2012.
12. Borgman CL, Furner J. Scholarly communication and bibliometrics. In: Cronin B, ed. Annual Review of Information Science Technology. Vol 36. Medford, NJ: Information Today; 2002:3-72.
13. Epstein RJ. Journal impact factors do not equitably reflect academic staff performance in different medical subspecialties. J Investig Med. 2004;52(8):531-536.
14. Maunder RG. Using publication statistics for evaluation in academic psychiatry. Can J Psychiatry. 2007;52(12):790-797.
15. Noyons ECM, Moed HR, Luwel M. Combining mapping and citation analysis for evaluative bibliometric purposes: a bibliometric study. J Am Soc Inf Sci. 1999;50(2):115-131.
16. Garfield E. What citations tell us about Canadian research. Can J Info Libr Sci. 1993;18(4):14-35. http://www.garfield.library.upenn.edu/papers/canadianjinfolibsci18%284%29p14y1993.html. Accessed October 1, 2012.
17. Seglen PO. Why the impact factor of journals should not be used for evaluating research. BMJ. 1997;314(7079):498-502. http://www.bmj.com/content/314/7079/497.1.full. Accessed October 1, 2012.
18. van Raan AFJ. Advanced bibliometric methods as quantitative core of peer review based evaluation and foresight exercises. Scientometrics. 1996;36(3):397-420.
19. Oppenheim C. The correlation between citation counts and the 1992 research assessment exercise ratings for British research in genetics, anatomy, and archaeology. J Documentation. 1997;53(5):477-487.
20. Schoonbaert D, Roelants G. Citation analysis for measuring the value of scientific publications: quality assessment tool or comedy of errors? Trop Med Int Health. 1996;1(6):739-752.
21. Börner K, Ma N, Biberstine JR, et al. Introducing the Science of Science (Sci2) Tool to the Reporting Branch, Office of Extramural Research/Office of the Director, National Institutes of Health. Arlington, VA: National Science Foundation. http://www.nsf.gov/sbe/sosp/social/wagner-borner.pdf. Accessed October 1, 2012.
22. Lewison G, Dawson G. The effect of funding on the outputs of biomedical research. Scientometrics. 1998;41(1):17-27.
23. Lee KP, Schotland M, Bacchetti P, Bero LA. Association of journal quality indicators with methodological quality of clinical research articles. JAMA. 2002;287(21):2805-2808. http://jama.ama-assn.org/content/287/21/2805.full.pdf+html. Accessed October 1, 2012.
24. Levitan T. A Report on a Survey of Osteopathic Medical School Growth: Analysis of the Fall 2009 Survey. Chevy Chase, MD: American Association of Colleges of Osteopathic Medicine; 2009. http://www.aacom.org/resources/bookstore/Documents/GrowthRpt2009.pdf. Accessed October 1, 2012.
25. Web of Knowledge fact sheet. Thomson Reuters Web site. http://thomsonreuters.com/content/science/pdf/Web_of_Knowledge_factsheet.pdf. Accessed October 1, 2012.
26. Molinari JF, Molinari A. A new method for ranking scientific institutions. Scientometrics. 2008;75(1):163-174.
27. Crown WH. Violations of regression assumptions. In: Crown WH, ed. Statistical Models for the Social and Behavioral Sciences: Multiple Regression and Limited-Dependent Variable Models. Westport, CT: Praeger Publishers; 1998:71-98.
28. van Raan AFJ. Bibliometric statistical properties of the 100 largest European research universities: prevalent scaling rules in the science system. J Am Soc Inf Sci Technol. 2008;59(3):461-475. http://arxiv.org/ftp/arxiv/papers/0704/0704.0889.pdf. Accessed October 1, 2012.
29. Chen TW, Chou LF, Chen TJ. World trend of peritoneal dialysis publications. Perit Dial Int. 2007;27(2):173-178. http://www.pdiconnect.com/content/27/2/173.full.pdf. Accessed October 1, 2012.
30. Druss BG, Marcus SC. Tracking publication outcomes of National Institutes of Health grants. Am J Med. 2005;118(6):658-663.
31. McAllister PR, Narin F. Characterization of the research papers of US medical schools. J Am Soc Inf Sci. 1983;34(2):123-131.
32. Porter R. What do grant reviewers really want, anyway? J Res Admin. 2005;34(2):5-13. http://www.srainternational.org/sra03/uploadedFiles/Vol36Issue2.pdf. Accessed October 1, 2012.
33. Murphy PS. Journal quality assessment for performance-based funding. Assess Eval Higher Educ. 1998;23(1):25-31.
34. Ramsden P. Describing and explaining research productivity. Higher Educ. 1994;28(2):207-226.
35. Lipton JA, Kinane DF. Total NIH support to US dental schools, 2005-2009. J Dent Res. 2011;90(3):283-288.
36. Osborn GG. Taking osteopathic distinctiveness seriously: historical and philosophical perspectives [editorial]. J Am Osteopath Assoc. 2005;105(5):241-244. http://www.jaoa.org/cgi/content/full/105/5/241. Accessed October 1, 2012.
37. US medical school faculty, 2010. American Association of Medical Colleges Web site. https://www.aamc.org/data/facultyroster/reports/169876/usmsf10.html. Accessed October 1, 2012.
38. Young RA, DeHaven MJ, Passmore C, Baumer JG, Smith KV. Research funding and mentoring in family medicine residencies. Fam Med. 2007;39(6):410-418. http://www.stfm.org/fmhub/fm2007/June/Richard410.pdf. Accessed October 1, 2012.
39. Frantz JM, Amosun SL. Identifying strategies to improve research publication output in health and rehabilitation sciences: a review of the literature. Afr J Health Prof Educ. 2011;3(1):7-10. http://www.ajol.info/index.php/ajhpe/article/viewFile/69938/58012. Accessed October 1, 2012.
40. Page-Adams D, Cheng LC, Gogineni A, Shen CY. Establishing a group to encourage writing for publication among doctoral students [research notes report]. J Soc Work Educ. 1995;31(3):402-407.
41. Boice R, Jones F. Why academicians don't write. J Higher Educ. 1984;55(5):567-582.
42. Levitan T. AACOM projections for growth through 2012: results of a 2007 survey of US Colleges of Osteopathic Medicine. J Am Osteopath Assoc. 2008;108(3):116-120. http://www.jaoa.org/content/108/3/116.full.pdf+html. Accessed October 1, 2012.
43. Dorsey ER, Van Wuyckhuyse BC, Beck CA, Passalacqua WP, Guzick DS. The economics of new faculty hires in basic sciences. Acad Med. 2009;84(1):26-31. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2746364/pdf/nihms-125698.pdf. Accessed October 1, 2012.
44. Zinner DE, Campbell EG. Life-science research within US academic medical centers. JAMA. 2009;302(9):969-976. http://jama.jamanetwork.com/article.aspx?articleid=184502. Accessed October 1, 2012.
45. The Osteopathic Research Center. Publications. University of North Texas Health Science Center Web site. http://www.hsc.unt.edu/orc/research_publications.aspx. Accessed October 1, 2012.
46. McGrail MR, Rickard CM, Jones R. Publish or perish: a systematic review of interventions to increase academic publication rates. Higher Educ Res Dev. 2006;25(1):19-35.
47. Glänzel W, Debackere K, Thijs B, Schubert A. A concise review on the role of author self-citations in information science, bibliometrics and science policy. Scientometrics. 2006;67(2):263-277.
48. Thijs B, Glanzel W. The influence of author self-citations on bibliometric meso-indicators: the case of European universities. Scientometrics. 2005;66(1):71-80.
49. Lewison G. New bibliometric techniques for the evaluation of medical schools. Scientometrics. 1998;41(1-2):5-16.
50. Commission on Osteopathic College Accreditation. Handbook. Chicago, IL: American Osteopathic Association; May 1, 2011. http://www.osteopathic.org/inside-aoa/accreditation/predoctoral%20accreditation/Documents/coca-handbook.pdf. Accessed October 1, 2012.
51. Licciardone JC. Time for the osteopathic profession to take the lead in musculoskeletal research. Osteopath Med Prim Care. 2009;3:3-6. http://www.om-pc.com/content/3/1/6. Accessed October 1, 2012.
52. Ehrmann PR. Childhood obesity: call to action for America's physicians [letter]. J Am Osteopath Assoc. 2007;107(7):245. http://www.jaoa.org/content/107/7/245.full.pdf+html. Accessed October 1, 2012.