Abstract
This paper presents a framework for low-light color imaging using a dual-camera system that combines a high spatial resolution monochromatic (HSR-mono) image and a low spatial resolution color (LSR-color) image. We propose a cross-camera synthesis (CCS) module to learn and transfer illumination, color, and resolution attributes across paired HSR-mono and LSR-color images to recover brightness- and color-adjusted high spatial resolution color (HSR-color) images at both camera views. Jointly characterizing various attributes for final synthesis is extremely challenging because of significant domain gaps across cameras. Our proposed CCS method consists of three subtasks: reference-based illumination enhancement (RefIE), reference-based appearance transfer (RefAT), and reference-based super resolution (RefSR), by which we can characterize, transfer, and enhance illumination, color, and resolution at both views. Each subtask is implemented with a deep neural network (DNN); the networks are first trained separately for each subtask and then finetuned jointly. Experiments demonstrate the superior qualitative and quantitative performance of the proposed CCS model on both synthetic content from popular datasets and real-captured scenes. Ablation studies further verify the model's generalization to various exposures and camera baselines. We will make our work accessible at https://njuvision.github.io/CCS for reproducible research.
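The abstract describes a cascade of three reference-based subtasks (RefIE, RefAT, RefSR) operating on an HSR-mono / LSR-color pair. The following PyTorch sketch illustrates one plausible way such a pipeline could be wired together; the module interfaces, placeholder backbones, tensor shapes, and the omission of cross-view alignment are all assumptions for illustration, not the paper's actual network designs.

```python
# Illustrative sketch of a three-stage CCS-style pipeline (RefIE -> RefAT -> RefSR).
# All module internals and interfaces here are assumptions; the real architectures
# are described in the paper, not in this abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleConvBlock(nn.Module):
    """Placeholder backbone standing in for each subtask's DNN."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, out_ch, 3, padding=1),
        )

    def forward(self, x):
        return self.body(x)


class CCSPipeline(nn.Module):
    """Cascade of RefIE, RefAT, and RefSR stand-in modules."""
    def __init__(self):
        super().__init__()
        # RefIE: brighten the low-light LSR-color reference (3-channel in/out).
        self.ref_ie = SimpleConvBlock(3, 3)
        # RefAT: transfer color/appearance onto the HSR-mono view
        # (mono image concatenated with the upsampled enhanced reference).
        self.ref_at = SimpleConvBlock(1 + 3, 3)
        # RefSR: reference-based super resolution at the color-camera view,
        # guided by the HSR-mono image.
        self.ref_sr = SimpleConvBlock(3 + 1, 3)

    def forward(self, hsr_mono, lsr_color):
        # 1) Illumination enhancement at the low-resolution color view.
        lsr_color_enh = self.ref_ie(lsr_color)

        # 2) Appearance transfer: upsample the enhanced color reference to
        #    the mono view's resolution and predict HSR-color at that view.
        ref_up = F.interpolate(lsr_color_enh, size=hsr_mono.shape[-2:],
                               mode='bilinear', align_corners=False)
        hsr_color_mono_view = self.ref_at(torch.cat([hsr_mono, ref_up], dim=1))

        # 3) Reference-based SR: recover HSR-color at the color-camera view.
        #    Cross-view warping/alignment between the two cameras is omitted.
        hsr_color_color_view = self.ref_sr(torch.cat([ref_up, hsr_mono], dim=1))

        return hsr_color_mono_view, hsr_color_color_view


if __name__ == "__main__":
    model = CCSPipeline()
    mono = torch.rand(1, 1, 256, 256)   # HSR-mono input
    color = torch.rand(1, 3, 64, 64)    # LSR-color input
    out_mono_view, out_color_view = model(mono, color)
    print(out_mono_view.shape, out_color_view.shape)
```

Consistent with the training strategy stated above, each stand-in module would be pretrained on its own subtask loss before finetuning the full cascade end to end; those losses and training details are not part of this sketch.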