"Don't use Wikipedia for medical advice," warns The Independent after a survey found factual errors in 9 out of 10 articles about the 10 most common medical conditions.
This story is based on a study that assessed the information in Wikipedia articles on 10 common conditions, including depression, back pain and high blood pressure.
Two researchers compared the information in each article against the peer-reviewed published literature to see if they agreed. In nine of the 10 articles, they found information that did not agree with the peer-reviewed sources.
Wikipedia is a crowd-sourced information website that anyone can contribute to and edit. While the website is one of the most widely used resources online, it is vulnerable to abuse and inaccuracies.
But the study only assessed 10 articles and this may not be representative of all of the site's content. Other studies have found that Wikipedia showed good agreement with peer-reviewed sources.
The most important thing to take from this study is the need for caution when using the internet for medical information. A well-sourced and reliable article on Wikipedia (or elsewhere) will provide footnotes and links to secondary peer-reviewed sources, allowing you to confirm the accuracy of the content.
Where did the story come from?
The study was carried out by researchers from Campbell University and other research centres in the US. No funding was received for the study.
While most of the UK media's coverage of the study was reasonable, the Mail Online's headline overstated the findings. It claimed that, "90% of its medical entries are inaccurate, say experts".
The study only looked at 10 of the medical articles on Wikipedia. Considering that there are around 20,000 health articles on the site, this small sample may not be representative of the accuracy of the entire body of content.
What kind of research was this?
This was a cross-sectional study that compared medical information on Wikipedia against peer-reviewed literature.
Anyone can add to and edit the information on Wikipedia, and editors do not need to have any specialist knowledge or qualifications. However, the website does encourage the use of references to identify the source of information, as well as notes for readers where there is no source provided.
Wikipedia is a very popular site with the public, and studies have suggested that about 50-70% of doctors and medical students have used it as a source of information.
Despite this, there is concern from healthcare professionals that some of the medical information on Wikipedia may not be correct.
Some studies have suggested that Wikipedia is similar in the quality of its content to text books and other online and peer-reviewed information sources, including the Encyclopaedia Britannica.
However, other studies have suggested that it is not a reliable source for information about drugs or liver and digestive system conditions.
This study wanted to look at the medical information available on Wikipedia across a range of important conditions.
What did the research involve?
The researchers looked at Wikipedia entries on the 10 conditions that cost the US the most in terms of public and private healthcare spending.
For each factual statement in the entries, the researchers searched a medical database for peer-reviewed medical literature to see if it agreed with the statement.
Each article was reviewed separately by two researchers, who were junior doctors. Ten junior doctors took part and reviewed two articles each.
The 10 conditions and the corresponding Wikipedia article assessed (in brackets) were:
- heart disease (coronary artery disease)
- cancer (lung cancer)
- mental disorders (major depressive disorder)
- trauma-related disorders (concussion)
- osteoarthritis (osteoarthritis)
- chronic obstructive lung disease/asthma (COPD)
- high blood pressure (hypertension)
- diabetes (diabetes mellitus)
- back problems (back pain)
- high levels of lipids (fat) in the blood (hyperlipidaemia)
The researchers identified factual statements in each article, such as "diabetes is a chronic condition". They then searched a US website called UpToDate for peer-reviewed literature on each statement that had been published or updated in the past five years.
The UpToDate website aims to support doctors to make clinical decisions by providing evidence-based information. The content is based on information in peer-reviewed journals and other sources, and is peer reviewed.
Each reviewer recorded if the peer-reviewed literature they identified agreed with the statement on Wikipedia, or if it was contradicted by a peer-reviewed reference. Two different reviewers then checked whether the findings of the original reviewers agreed with each other.
What were the basic results?
The researchers checked between 28 and 172 statements in each article. The two researchers assessing the article often differed in the number of factual statements they identified.
For each article, between about 55% and 100% of the statements assessed by each reviewer were found to agree with the peer-reviewed literature.
In all of the articles there was at least one statement that one of the researchers felt was not supported by the peer-reviewed literature.
The researchers reported that there were significant differences between the Wikipedia entry and the peer-reviewed literature in 9 out of the 10 articles.
How did the researchers interpret the results?
The researchers concluded that most Wikipedia articles on the 10 most costly conditions in the US contain errors when compared with the peer-reviewed literature on the subject.
They suggest that healthcare professionals and patients "should use caution when using Wikipedia to answer questions about patient care".
Conclusion
This research has found that there are differences between the medical information found in many Wikipedia articles and the peer-reviewed literature.
The authors found significant differences in 9 out of the 10 articles on common conditions that they assessed. Between 55% and 100% of the statements checked in each article agreed with the peer-reviewed literature.
However, there are some issues to bear in mind when interpreting these results:
- The study did not assess whether the Wikipedia articles missed out any important information on the conditions.
- The researchers differed in the number of statements they identified as being factual and the number checked for each article. It may have been more informative for both researchers to check the same statements.
- The researchers only had to identify the statement in one peer-reviewed source, but different peer-reviewed sources may disagree on some issues.
- The researchers may have missed some relevant sources in their searches, which were not described in detail.
- The researchers did not differentiate between statements where they found no related information in the peer-reviewed literature and ones where the information directly conflicted with what was in the peer-reviewed literature.
- The study did not assess how serious the potential impact of the errors might be. For example, a mistake in a report of how a drug should be taken (dose or route) could have serious consequences, while other differences might have less of an impact.
- It was not entirely clear from the study whether the statistical analysis the researchers carried out was appropriate.
The most important thing to take from this study is that we should be cautious when obtaining medical information on the internet.
Trustworthy sources of medical information should be able to show that they have based their information on peer-reviewed literature, and that this is regularly reviewed and updated.
In the UK, The Information Standard has been set up to show readers which medical information sites use reliable processes to produce their medical information.
It is important never to rely on a single source when assessing medical and health information. Reputable and trustworthy sources of information, such as NICE clinical guidelines or articles published in peer-reviewed journals such as the BMJ or The Lancet, will always provide footnotes to supporting evidence.
Authors should also make clear the limitations of the available evidence on a topic. If an article claims to be 100% certain about an issue, it is almost certainly the work of a "quack".