Abstract

Because of poor ambient light or uneven illumination, traditional methods for acquiring images of ancient architectural decoration tend to produce blurred images. To address this problem, this paper proposes a neural-network-based filtering enhancement method for ancient architectural decoration images that preserves image detail through contrast enhancement, smoothing-based noise reduction, and edge sharpening. Starting from a convolutional neural network composed of an encoder, a decoder, and skip connections, residual connections and dilated (hole) convolutions are introduced to construct a dilated U-Net that fuses pixel feature blocks from different levels. The method enhances image contrast according to the gray-level and frequency histogram, and suppresses noise in the decoration image by replacing the gray value of each pixel to be processed with the median gray value of its neighborhood. Taking into account the image texture characteristics of the wood and the mortise-and-tenon connection of beams and columns in ancient buildings, the paper also analyzes the strength of beam-column joints and calculates the elastic constants of the beams and columns and the stress at their connection. Experimental results show that the proposed method suppresses noise effectively, captures image detail features well, and significantly improves the subjective visual quality of ancient architectural decoration images.
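
The abstract only names the architectural ingredients (residual connections and dilated, or "hole", convolutions inside a U-Net encoder-decoder), so the following is a minimal PyTorch sketch of one residual dilated block of the kind it refers to. The class name, channel count, and dilation rate are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class DilatedResidualBlock(nn.Module):
    """Hypothetical residual block with dilated ("hole") convolutions, as
    might appear in the encoder/decoder stages of a dilated U-Net."""
    def __init__(self, channels: int, dilation: int = 2):
        super().__init__()
        # Dilated 3x3 convolutions enlarge the receptive field without
        # downsampling; padding = dilation keeps the spatial size unchanged.
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3,
                               padding=dilation, dilation=dilation)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3,
                               padding=dilation, dilation=dilation)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The identity (residual) path preserves fine decoration detail while
        # the dilated convolutions aggregate context from a wider neighborhood.
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        return self.relu(out + x)

if __name__ == "__main__":
    block = DilatedResidualBlock(channels=64, dilation=2)
    x = torch.randn(1, 64, 128, 128)  # one 64-channel feature map
    print(block(x).shape)             # torch.Size([1, 64, 128, 128])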
