Abstract

This paper highlights the lack of consideration given to statistical power in the health and social sciences, a continuing problem for single-study research and, more importantly, for meta-analysis. The power of a study is the probability that it will yield a statistically significant result when the effect under investigation is genuine. By ignoring power, single-study researchers make it difficult for negative results to be published and thereby affect meta-analysis through publication bias. Researchers conducting meta-analyses who also ignore power then compound the problem by including low-powered studies, which, because of publication bias, are disproportionately likely to be those that showed significant effects. A simple means of calculating an easily understood measure of effect size from a contingency table is demonstrated in this paper. A computer programme for determining the power of a study is recommended, and a method of reflecting the adequacy of the power of the included studies in a meta-analysis is suggested. An example of this calculation is provided, drawn from a meta-analytic study of intravenous magnesium that produced inaccurate results. It is demonstrated that incorporating power analysis into this meta-analysis would have prevented misleading conclusions from being reached. Some suggestions are made for changes to the protocol of meta-analytic studies that highlight the importance of power analysis.
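
The abstract does not name the specific effect size measure or the computer programme used, so the following is only a minimal sketch of the general approach it describes: computing an effect size from a 2x2 contingency table (here the phi coefficient, one easily understood choice) and the power of the corresponding chi-square test via the noncentral chi-square distribution. The table values are invented for illustration.

```python
import numpy as np
from scipy.stats import chi2, ncx2, chi2_contingency

def phi_from_2x2(table):
    """Phi coefficient (effect size) for a 2x2 contingency table."""
    chi2_stat, _, _, _ = chi2_contingency(table, correction=False)
    n = table.sum()
    return np.sqrt(chi2_stat / n)

def chi_square_power(effect_size, n, alpha=0.05, df=1):
    """Power of a chi-square test given effect size w and total sample size n."""
    crit = chi2.ppf(1 - alpha, df)      # critical value under the null
    nc = n * effect_size ** 2           # noncentrality parameter under H1
    return ncx2.sf(crit, df, nc)        # P(reject H0 | effect of size w exists)

# Hypothetical trial counts: rows = treatment/control, columns = event/no event
table = np.array([[30, 70],
                  [45, 55]])
w = phi_from_2x2(table)
print(f"effect size (phi) = {w:.3f}")
print(f"power at this sample size = {chi_square_power(w, table.sum()):.3f}")
```

A power value well below the conventional 0.8 threshold for a study's observed effect size would flag it as underpowered, which is the kind of adequacy check the abstract suggests building into a meta-analytic protocol.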

