5 major sources of error in medical studies and how VitaminDWiki minimizes them
It Ain't Necessarily So: Why Much of the Medical Literature Is Wrong
Nice study - here are the topics covered along with a few examples
Reverse Causality
The Play of Chance and the DICE Miracle
- Review of 45 highlighted studies in major medical journals found:
- 24% were never replicated
- 16% were contradicted by subsequent research
- 16% were shown to have smaller effect sizes than originally reported
Bias: Coffee, Cellphones, and Chocolate
Confounding
Exaggerated Risk
Example: the risk ratio corresponded to a 7% increase, but the odds ratio (which was what the study reported) suggested a 40% increase
Relative risk of a heart attack after drinking a cup of coffee: 1.5X
- However, the absolute risk of a heart attack after a cup of coffee is about 1 in 2,000,000 cups
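The gap between an odds ratio and a risk ratio can be seen with a small sketch. The counts below are hypothetical (not from the study) and chosen to show how, when the outcome is common, the odds ratio drifts well above the risk ratio:

```python
def risk_ratio(a, b, c, d):
    """2x2 table: a/b = exposed with/without outcome; c/d = unexposed with/without."""
    risk_exposed = a / (a + b)
    risk_unexposed = c / (c + d)
    return risk_exposed / risk_unexposed

def odds_ratio(a, b, c, d):
    """Same table: ratio of the odds of the outcome in exposed vs unexposed."""
    return (a / b) / (c / d)

# Hypothetical counts with a common (50% baseline) outcome
a, b, c, d = 150, 100, 125, 125
print(round(risk_ratio(a, b, c, d), 2))  # 1.2 -> a 20% increase in risk
print(round(odds_ratio(a, b, c, d), 2))  # 1.5 -> looks like a 50% increase
```

Reporting the odds ratio here would make a 20% risk increase read as 50%, the same kind of inflation as the 7% vs 40% example above.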
A Dirty Dozen: Twelve P-Value Misconceptions (2008) is referenced by the article and is attached below
1. If P = .05, the null hypothesis has only a 5% chance of being true.
2. A nonsignificant difference (e.g., P > .05) means there is no difference between groups.
3. A statistically significant finding is clinically important.
4. Studies with P values on opposite sides of .05 are conflicting.
5. Studies with the same P value provide the same evidence against the null hypothesis.
6. P = .05 means that we have observed data that would occur only 5% of the time under the null hypothesis.
7. P = .05 and P ≤ .05 mean the same thing.
8. P values are properly written as inequalities (e.g., "P < .02" when P = .015).
9. P = .05 means that if you reject the null hypothesis, the probability of a type I error is only 5%.
10. With a P = .05 threshold for significance, the chance of a type I error will be 5%.
11. You should use a one-sided P value when you don't care about a result in one direction, or a difference in that direction is impossible.
12. A scientific conclusion or treatment policy should be based on whether or not the P value is significant.
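Misconception #1 can be illustrated with a small simulation. The parameters are assumptions for the sketch (10% of tested hypotheses are real effects, 80% power, alpha = .05): among "significant" results, far more than 5% turn out to be false positives:

```python
import random

random.seed(0)  # deterministic run for the sketch
n_studies = 10_000
true_effect_rate = 0.10   # assumed: only 10% of tested hypotheses are real
power, alpha = 0.80, 0.05

false_pos = true_pos = 0
for _ in range(n_studies):
    real = random.random() < true_effect_rate
    # A real effect is detected with probability = power;
    # a null effect is "detected" with probability = alpha.
    significant = random.random() < (power if real else alpha)
    if significant:
        if real:
            true_pos += 1
        else:
            false_pos += 1

# Fraction of significant findings that are false: roughly 0.36 here, not 0.05
print(false_pos / (false_pos + true_pos))
```

The expected false-discovery fraction is (0.90 × .05) / (0.90 × .05 + 0.10 × .80) ≈ 36%, which is why a significant P value does not mean the null has only a 5% chance of being true.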
VitaminDWiki conclusions minimize the above errors
Emphasize RCT
Minimize use of observational studies unless it is clear which came first (i.e., that the exposure preceded the outcome, avoiding reverse causality)
- If 3+ different studies have the same conclusion ==> FACT
- If the studies disagree ==> lack of consensus (example: 5 studies say yes, 2 say no)
VitaminDWiki also does not report on studies which used too little vitamin D or treated for too short a time
One study used a 40 IU intervention and found no effect
One study gave Vitamin D for only 1 month and found no effect
One study had all patients with very low levels of vitamin D, and found no difference in disease with only a 1 ng difference between the groups
One study raised vitamin D levels by 10 ng and found no improvement: levels went from 10 ng to 20 ng – nowhere near the 40 ng needed to make a difference