Abstract

Purpose: In recent years, efforts to apply artificial intelligence (AI) to the medical field have been growing. In general, a large amount of high-quality training data is necessary to build a high-performing AI model, and for tumor-detection AI, annotation quality is especially important. When diagnosing and detecting tumors in ultrasound images, humans use not only the tumor area but also surrounding information, such as the back echo of the tumor. We therefore investigated how detection accuracy changes when the size of the region of interest (ROI, the ground-truth area) relative to the liver tumor is varied in the training data for the detection AI.

Methods: We defined D/L as the ratio of the maximum diameter (D) of the liver tumor to the ROI size (L). We created training data with different D/L values and performed training and testing with YOLOv3.

Results: Detection accuracy was highest when the training data were created with a D/L ratio between 0.8 and 1.0. In other words, accuracy improved when the ground-truth bounding box used to train the detection AI was set to touch the tumor or to be slightly larger than it. We also found that when the D/L ratio was distributed across the training data, the wider the distribution, the lower the detection accuracy.

Conclusions: We therefore recommend training the detector with the D/L value held close to a single value between 0.8 and 1.0 for liver tumor detection from ultrasound images.
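The abstract's D/L definition implies that, for a target ratio r = D/L, the ROI side length is L = D / r, so ratios below 1.0 yield a box slightly larger than the tumor. The following is a minimal sketch of how such a ground-truth box could be generated from a tumor's center and maximum diameter; the function name `roi_box` and the square-box assumption are illustrative, not taken from the paper.

```python
def roi_box(center_x, center_y, diameter, dl_ratio):
    """Return a square ROI (x_min, y_min, x_max, y_max) whose side length
    L = diameter / dl_ratio, so that D/L equals the requested ratio.

    dl_ratio = 1.0 gives a box exactly touching the tumor; values in
    (0.8, 1.0) give a slightly larger box, as the abstract recommends.
    """
    if dl_ratio <= 0:
        raise ValueError("dl_ratio must be positive")
    side = diameter / dl_ratio
    half = side / 2.0
    return (center_x - half, center_y - half,
            center_x + half, center_y + half)


# Example: a 20-pixel tumor centered at (50, 50).
# D/L = 1.0 -> the box touches the tumor boundary (side 20).
print(roi_box(50, 50, 20, 1.0))   # (40.0, 40.0, 60.0, 60.0)
# D/L = 0.8 -> the box is 25 pixels wide, slightly larger than the tumor.
print(roi_box(50, 50, 20, 0.8))   # (37.5, 37.5, 62.5, 62.5)
```

This sketch only captures the geometric relationship; in practice each box would still be converted to the annotation format expected by the detector (e.g., YOLO's normalized center/width/height form).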
