Letters to the Editor  |  October 2014
Wikipedia vs Peer-Reviewed Medical Literature for Information About the 10 Most Costly Medical Conditions–III
The Journal of the American Osteopathic Association, October 2014, Vol. 114, 764-765. doi:10.7556/jaoa.2014.148
To the Editor: 
Using Wikipedia and similar online resources as references seems to have become popular; the article by Hasty et al1 therefore caught our attention. We agree with Hasty et al1 that health care providers should be aware of the limitations of nonrefereed online sources. However, the study could have been better designed and the data more appropriately analyzed. 
The authors assessed the accuracy of Wikipedia articles as a source of health care information, but they did not evaluate the accuracy of the peer-reviewed articles. Although peer-reviewed sources are generally accurate, errors can still exist. Moreover, the accuracy of the peer-reviewed literature cannot simply be classified as correct or incorrect; discrepancies often occur among articles on a given topic, so a measurement of its accuracy would not be binary. Because Hasty et al1 did not identify a specified number of assertions for each condition and did not determine whether Wikipedia and the peer-reviewed literature were each correct or incorrect, their use of the McNemar test to compare Wikipedia vs peer-reviewed medical literature was inappropriate. The McNemar test is used to compare proportions for paired data.2 If, for example, there were 100 items to be examined and we determined that both Wikipedia and peer-reviewed sources were correct in 70 of them, both were incorrect in 5, Wikipedia was correct but peer-reviewed sources were incorrect in 8, and Wikipedia was incorrect but peer-reviewed sources were correct in 17, then the McNemar test would be the appropriate method of comparison. However, this is not the type of data presented by Hasty et al.1 For example, if we take the data that Hasty et al1 presented for osteoarthritis in the article's table 3 and interchange the rows under "dissimilar" (Chen Table), the data remain the same, but the P value for the McNemar test would be .522 instead of .003 (see the sketch after the Chen Table). 
Chen Table. Dissimilar Assertions for Osteoarthritis by Concordance and Discordance

                Did Wikipedia Match Peer-Reviewed Literature?
Reviewer        Yes (Concordant)     No (Discordant)
Reviewer 2      19                   13
Reviewer 1      9                    4
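To make the comparison concrete, the following is a minimal sketch of the continuity-corrected McNemar test applied to the worked 100-item example and to the osteoarthritis counts. How the 4 Chen Table counts map onto the cells of a paired 2 × 2 table is our assumption for illustration, and small differences from the reported P values can arise from exact vs asymptotic variants of the test.

```python
# Minimal sketch of the continuity-corrected McNemar test (Rosner,
# Fundamentals of Biostatistics). Treating the 4 Chen Table counts as
# cells of a paired 2 x 2 table is our assumed mapping, for illustration.
from scipy.stats import chi2

def mcnemar_p(b: int, c: int) -> float:
    """McNemar P value from the two discordant cell counts b and c."""
    stat = (abs(b - c) - 1) ** 2 / (b + c)  # continuity-corrected statistic
    return chi2.sf(stat, df=1)              # chi-square, 1 degree of freedom

# Worked 100-item example: 8 vs 17 discordant items; the 70 and 5
# concordant items never enter the statistic.
print(f"{mcnemar_p(8, 17):.3f}")   # 0.110

# Osteoarthritis counts, original row order (discordant cells 4 and 19)
print(f"{mcnemar_p(4, 19):.3f}")   # ~0.004 here; the reported .003 likely
                                   # reflects an exact variant of the test
# Rows interchanged as in the Chen Table (discordant cells 13 and 9)
print(f"{mcnemar_p(13, 9):.3f}")   # 0.522
```

The statistic depends only on the 2 discordant cells, which is why rearranging the same 4 counts moves the P value from .003 to .522.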
Because the authors assumed that peer-reviewed articles are correct, it is more appropriate to study the percentage of concordance. Reviewer 1 and reviewer 2 in the study may be viewed as 2 randomly selected individuals who examined the assertions for each condition. It is unclear whether the same person reviewed more than 1 article, and we cannot rule out that possibility. Also, according to the data, the 2 reviewers of a given article generally did not review exactly the same assertions, as the differing totals show. As a result, we can use the overall percentage from each reviewer ("Both" in table 3). It is then reasonable to use the average percentage of concordance from the 2 reviewers as if they were independent. The average percentages of concordance for the 10 conditions ranged from 65.9% to 91.0%, with a median of 77.5% (mean [SE], 78.3% [2.8%]; 95% CI, 72.0%-84.6%). 
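As a check on the reported interval, the following sketch reproduces the 95% CI from the stated mean and SE, assuming a t-based interval with 10 - 1 = 9 degrees of freedom for the 10 conditions; the per-condition percentages themselves are not reproduced here.

```python
# Sketch reproducing the reported 95% CI from the stated mean and SE,
# assuming a t-based interval with n - 1 = 9 degrees of freedom
# (our assumption about the method used).
from scipy.stats import t

mean, se, df = 78.3, 2.8, 9
margin = t.ppf(0.975, df) * se  # t(0.975, 9) ~ 2.262
print(f"{mean - margin:.1f}% to {mean + margin:.1f}%")  # 72.0% to 84.6%
```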
If a reviewer found concordance, it is reasonable to assume that the concordance is true. If a reviewer did not find concordance, however, the discordance is not necessarily certain. The percentages based on table 3 are therefore likely underestimates, although we believe the impact would be relatively minor. Another issue is that concordance and correctness are not the same: the data in the article are better suited to revealing concordance with the peer-reviewed medical literature than correctness. 
Because the authors assumed that the peer-reviewed articles were correct, they did not define a hypothesis regarding the accuracy of the Wikipedia articles. Although one may hypothesize that the average concordance percentage of Wikipedia articles is greater than a given value (eg, 70%), this cutoff value may be subjective. Moreover, a point estimate and CI should be provided in addition to a P value. We believe that our analyses are appropriate. A 78% concordance rate (95% CI, 72%-85%) reflects that many experts contributed to the Wikipedia articles. However, Wikipedia is an open-edited online encyclopedia, which may lower its accuracy.3 
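For illustration, a hypothetical one-sample t test of the 70% cutoff mentioned above can be computed from the reported mean and SE alone; this is a sketch under that assumption, not a reanalysis of the underlying per-condition data.

```python
# Hypothetical one-sample t test of the 70% cutoff, using only the
# reported mean and SE across the 10 conditions (an illustration,
# not a reanalysis of the underlying data).
from scipy.stats import t

mean, se, df, cutoff = 78.3, 2.8, 9, 70.0
t_stat = (mean - cutoff) / se    # ~2.96
p_one_sided = t.sf(t_stat, df)   # ~0.008
print(f"t = {t_stat:.2f}, one-sided P = {p_one_sided:.3f}")
```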
From our interpretation of the data presented by Hasty et al,1 Wikipedia is not a bad online source. However, for patient care and for medical research, we agree that Wikipedia articles should not replace peer-reviewed medical literature. 
References
1. Hasty RT, Garbalosa RC, Barbato VA, et al. Wikipedia vs peer-reviewed medical literature for information about the 10 most costly medical conditions. J Am Osteopath Assoc. 2014;114(5):368-373. doi:10.7556/jaoa.2014.035.
2. Rosner B. Fundamentals of Biostatistics. 6th ed. New York, NY: Duxbury Press; 2005.
3. Bohannon J. Who's afraid of peer review? Science. 2013;342(6154):60-65. doi:10.1126/science.342.6154.60.