Abstract

Automatic cell counting in pathology images is challenging due to blurred boundaries, low contrast, and overlap between cells. In this paper, we train a convolutional neural network (CNN) to predict a two-dimensional direction field map, which is then used to localize individual cells for counting. Specifically, we define the direction field at each pixel in a cell region (obtained by dilating the original center-point annotations) as a two-dimensional unit vector pointing from the pixel to its corresponding cell center. Direction fields of adjacent pixels in different cells point in opposite directions, departing from each other, while those within the same cell region point to a common center. This unique property is used to partition overlapping cells for localization and counting. To handle blurred boundaries and low-contrast cells, we set the direction field of background pixels to zero when generating the ground truth, so that adjacent pixels belonging to a cell and to the background exhibit an obvious difference in the predicted direction field. To further address varying cell density and overlap, we adopt a geometry-adaptive (varying) radius for cells of different densities when generating the ground-truth direction field map, which guides the CNN to separate cells of different densities as well as overlapping cells. Extensive experiments on three widely used datasets (i.e., the VGG Cell, CRCHistoPhenotype2016, and MBM datasets) demonstrate the effectiveness of the proposed approach.
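To make the ground-truth construction concrete, the following is a minimal sketch of how such a direction field map could be generated from center-point annotations. The function name `direction_field` and the fixed per-cell radii are illustrative assumptions; the paper itself uses a geometry-adaptive radius (e.g., tied to the distance between neighboring cells) rather than the fixed values shown here.

```python
import numpy as np

def direction_field(shape, centers, radii):
    """Build a 2-channel unit-vector field pointing from each cell pixel
    to its nearest annotated center; background pixels stay zero."""
    h, w = shape
    field = np.zeros((2, h, w), dtype=np.float32)
    ys, xs = np.mgrid[0:h, 0:w]
    # Distance from every pixel to every center: shape (num_centers, h, w).
    dists = np.stack([np.hypot(ys - cy, xs - cx) for cy, cx in centers])
    nearest = dists.argmin(axis=0)  # index of the closest center per pixel
    for k, (cy, cx) in enumerate(centers):
        # Cell region: pixels within this cell's radius that are closer to
        # this center than to any other; the center pixel itself is skipped
        # to avoid dividing by zero (its direction is left as zero).
        mask = (nearest == k) & (dists[k] <= radii[k]) & (dists[k] > 0)
        norm = dists[k][mask]
        field[0][mask] = (cy - ys[mask]) / norm  # unit vector, y component
        field[1][mask] = (cx - xs[mask]) / norm  # unit vector, x component
    return field

# Example: two nearby cells. Adjacent pixels along their shared border get
# opposite directions, which is the property used to separate overlapping
# cells. Radii are fixed here for brevity (geometry-adaptive in the paper).
centers = [(10, 10), (10, 16)]
radii = [4.0, 4.0]
gt = direction_field((24, 24), centers, radii)
```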
