Abstract

In this paper, we present an accurate and comprehensive semi-automated algorithm for detecting and counting chemically etched tracks on polycarbonate nuclear track detectors. The proposed algorithm consists of three main phases. The first is a preprocessing phase for image preparation and quality improvement, which converts the image to a gray-scale format and applies a contrast enhancement procedure. The contrast enhancement procedure employs the first-order fuzzy moment of the image and the Sugeno class of fuzzy complements to maximize the parametric indices of fuzziness of the image. In the feature extraction phase, two intrinsic characteristics of tracks are defined and formulated: a topological feature (TF) and a chromatic feature (CF). The TF refers to track shape and geometry, which are characterized and quantized by calculating the local Gray Level Spatial Correlation (GLSC) histogram of the tracks; the CF characterizes the color-intensity difference between track borders and track central areas, a difference that is specific to tracks. After the feature extraction phase, the algorithm enters the detection and counting phase, in which fuzzy sets of background, noise (including small objects, notches, etc.) and track are constructed. Detected objects are then classified according to their membership values in these fuzzy sets. The algorithm is designed to be applicable to different forms of tracks and track-counting systems. The accuracy, linearity, sensitivity to input parameters, and comprehensiveness of the method are validated in the simulation section.
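
To make the contrast enhancement step concrete, the fragment below is a minimal sketch rather than the paper's implementation. It assumes a min-max mapping of gray levels to memberships, the standard Sugeno complement c_lambda(a) = (1 - a) / (1 + lambda * a) with lambda > -1, the linear index of fuzziness as the quantity maximized over lambda, and a simple intensification operator centered on the first-order fuzzy moment; the helper names (sugeno_complement, parametric_index_of_fuzziness, enhance_contrast), the parameter search range, and the way the selected lambda and the fuzzy moment are combined are assumptions for illustration only.

```python
import numpy as np

def sugeno_complement(mu, lam):
    # Sugeno class of fuzzy complements: c_lambda(a) = (1 - a) / (1 + lambda * a), lambda > -1.
    return (1.0 - mu) / (1.0 + lam * mu)

def parametric_index_of_fuzziness(mu, lam):
    # Linear index of fuzziness built on the Sugeno complement (assumed form).
    return 2.0 * np.mean(np.minimum(mu, sugeno_complement(mu, lam)))

def enhance_contrast(gray, lams=np.linspace(-0.9, 10.0, 200)):
    # 1) Map gray levels to a membership plane in [0, 1] (min-max normalization, assumed).
    g = gray.astype(np.float64)
    mu = (g - g.min()) / (np.ptp(g) + 1e-12)
    # 2) Choose the Sugeno parameter that maximizes the parametric index of fuzziness,
    #    as the abstract describes; it is returned for inspection, and how the paper
    #    feeds it into the enhancement rule is not reproduced here.
    lam = max(lams, key=lambda l: parametric_index_of_fuzziness(mu, l))
    # 3) First-order fuzzy moment of the image, taken here as the mean membership value.
    m1 = mu.mean()
    # 4) Intensification around the fuzzy moment (illustrative stand-in for the paper's
    #    enhancement rule): push memberships away from the crossover point m1.
    low = mu ** 2 / (m1 + 1e-12)
    high = 1.0 - (1.0 - mu) ** 2 / (1.0 - m1 + 1e-12)
    enhanced = np.where(mu <= m1, low, high)
    # 5) Map the enhanced memberships back to 8-bit gray levels.
    return (enhanced * 255.0).astype(np.uint8), lam
```

A typical call under these assumptions would be `enhanced, lam = enhance_contrast(gray_image)` on an 8-bit gray-scale scan of the detector surface.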
