Abstract

Screening

Two members of the review team will independently screen candidate studies (duplicate decisions). If a large number of studies are involved, the two authors will first double-screen a subset (e.g., 30%) of the abstracts to establish reliability at this stage. If inter-rater reliability for inclusion/exclusion decisions, as indexed by Cohen's kappa, is satisfactory (above .80), the remaining references will be split in half, with each half screened by one of the two coders. If the inter-rater reliability is below .80, the two screeners will review their conflicting decisions and agree on the criteria before continuing screening. If necessary, the coding sheets will be revised.
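
As a minimal illustration of the reliability check described above, the kappa computation could look like the following Python sketch. It assumes scikit-learn's cohen_kappa_score; the decision lists and variable names are hypothetical, and the .80 threshold mirrors the protocol.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical duplicate decisions ("include"/"exclude") from the two
# screeners on the same double-screened subset (e.g., 30% of abstracts).
screener_a = ["include", "exclude", "exclude", "include", "exclude"]
screener_b = ["include", "exclude", "include", "include", "exclude"]

kappa = cohen_kappa_score(screener_a, screener_b)
print(f"Cohen's kappa = {kappa:.2f}")

if kappa > 0.80:
    # Satisfactory reliability: split remaining references between coders.
    print("Split remaining references and screen single-coded.")
else:
    # Below threshold: resolve conflicts and refine criteria first.
    print("Review conflicts and agree on criteria before continuing.")
```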
