Abstract

The purpose of this research is to demonstrate the feasibility of automating ‘die-polygon-capturing’, an economical yet still labor-intensive technique for circuit extraction during the reverse-engineering of simple integrated circuits. As microchip designs become increasingly diverse with the ongoing trend toward application-specific integrated circuits, and considering the importance of reverse-engineering these components, die-polygon-capturing is set to play a more critical role in the future. Given the apparent absence of prior scientific publications and the limited automation efforts in this area, this paper presents a proof of concept for automating the die-polygon-capturing technique, thereby addressing a notable gap in the existing literature, with the overarching goal of reducing the labor intensity of die-polygon-capturing. Our method consists of training deep neural networks on variations of a dataset and evaluating their ability to segment the various layers and connections of an integrated circuit. Given the limited availability of high-quality labeled datasets, the dataset used for this research consists of two images of a single microchip, the AMD 9085D. We implemented a data augmentation process that expanded this dataset to as many as 4872 images. The experiment’s results prove the feasibility of automation, demonstrating high scores on the Intersection over Union (IoU) and F-beta evaluation metrics. However, the primary conclusion drawn is the need to focus on generalizability to other types of microchips in order to effectively automate this technique for circuit extraction.
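For reference, the evaluation metrics mentioned above follow their standard definitions; the sketch below assumes the usual formulation in terms of predicted and ground-truth segmentation masks $A$ and $B$, precision $P$, and recall $R$:

\[
\mathrm{IoU}(A, B) = \frac{|A \cap B|}{|A \cup B|},
\qquad
F_\beta = \frac{(1 + \beta^2)\, P\, R}{\beta^2 P + R}
\]

Higher values of both metrics indicate closer agreement between the predicted segmentation and the labeled ground truth.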

