Abstract

A study was undertaken to assess the reproducibility of pathological grading for breast cancer and to determine the effect of education on agreement. Three pathologists graded 39 randomly selected cases of breast cancer on three separate occasions. Overall level of agreement and kappa were used to measure intraobserver and interobserver agreement for three grading schemes: (1) standard grade (SG) using a modified Scarff-Bloom-Richardson grading scheme, (2) revised grade (RG) and (3) clinical grade (CG). The intraobserver agreement and kappa for SG were 75% and 0.62, respectively. The interobserver agreement and kappa for SG ranged from 62–65% and 0.4–0.48, respectively. The RG scheme, which condensed the mitotic count into two levels instead of three, performed much better: intraobserver agreement rose to 85% (kappa = 0.74), and interobserver agreement also improved, ranging from 72–79% with kappas of 0.5–0.65. The CG scheme showed no further improvement in agreement over RG. Pathological grading was also found to be highly consistent across the three grading occasions. A formal educational session in fact lowered the level of interobserver agreement. Pathological grading was found to be reproducible and consistent; the best results were obtained with RG and CG. Education was not effective in increasing reproducibility.
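Note on the agreement statistic (added for context; the formula below is the standard definition of Cohen's kappa, not reproduced from the paper):

kappa = (p_o - p_e) / (1 - p_e)

where p_o is the observed proportion of agreement between graders and p_e is the agreement expected by chance from each grader's marginal grade frequencies. On the commonly used Landis–Koch scale, kappas of 0.41–0.60 indicate moderate agreement and 0.61–0.80 substantial agreement, which frames the SG-versus-RG comparison above.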
