Grain boundaries play a key role in governing the mechanical and physical properties of polycrystalline materials, so quantitative analysis of grain structure is an essential prerequisite for establishing structure-property relationships. However, such analysis is hampered by the low contrast and non-uniform illumination of optical microscopy images. Although previous studies have used neural networks to detect grain boundaries in optical micrographs, their practical application has been limited by the labeling cost of supervised learning and by their sensitivity to defects present in grain structure images. In this paper, we cast grain boundary detection as a real-to-virtual (R2V) translation problem, mapping a single real microstructure to virtual microstructures. From this perspective, our framework benefits from two learning schemes: unsupervised learning and physics-inspired learning. Extensive experiments demonstrate the superiority and generality of our framework. We expect that our approach will facilitate the adoption of data-driven techniques in quantitative microstructure analysis.