Abstract
Environmental illumination information is necessary to achieve a consistent integration of virtual objects into a given image. In this paper, we present a gradient-based shadow detection method for estimating the environmental illumination distribution of a given scene, in which a three-dimensional (3-D) augmented reality (AR) marker, a cubic reference object of known size, is employed. The geometric elements (the corners and sides) of the AR marker determine the candidate shadow boundaries, which are obtained on a flat surface according to the relationship between the camera and the candidate light sources. We can then extract the shadow regions by collecting the local features that support the candidate shadow boundary in the image. To further verify the shadows that pass the local feature-based matching, we examine whether significant brightness changes occur in the intersection region between the shadows. Our proposed method can reduce the unwanted effects caused by threshold values in edge-based shadow detection, as well as those caused by the sampling position in point-based illumination estimation.
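To make the marker-to-shadow geometry concrete, here is a minimal sketch (in Python/NumPy; the function and variable names are our own and hypothetical, not from the paper) of how one corner of the cubic marker can be projected onto a flat ground plane along a candidate light direction. Connecting the projected corners yields one candidate shadow boundary.

```python
import numpy as np

def project_shadow_corner(corner, light_dir):
    """Project a 3-D marker corner onto the ground plane (z = 0)
    along a candidate light direction, giving one vertex of the
    candidate shadow boundary.

    corner    : (x, y, z) corner of the cubic AR marker, z > 0
    light_dir : unit vector pointing from the light toward the scene
    """
    corner = np.asarray(corner, dtype=float)
    d = np.asarray(light_dir, dtype=float)
    if abs(d[2]) < 1e-9:
        raise ValueError("light direction is parallel to the ground plane")
    t = -corner[2] / d[2]   # travel along the ray until z = 0
    return corner + t * d   # shadow point on the plane

# Example: a top corner of a 10 cm cube lit by one candidate source.
top_corner = (0.05, 0.05, 0.10)
light = np.array([0.3, 0.2, -1.0])
light /= np.linalg.norm(light)
print(project_shadow_corner(top_corner, light))  # approx. [0.08 0.07 0.  ]
```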
Highlights
Understanding the environmental illumination of a scene is important when rendering virtual objects so that they match the given image; it helps in generating convincing virtual shadows on the real scene [1,2]
The proposed method measures the edge support, which indicates whether the image gradient has the same direction as the cast shadow boundary
We computed the angular difference between the fitted line segments and the candidate shadow boundary, and we computed the corner support, which represents the distance between the intersection point of the fitted line segments and the corner point of the 3-D marker that casts the shadow (a sketch of both support measures follows these highlights)
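As a concrete illustration of these support measures, the following is a minimal NumPy sketch; the function names, the sampling scheme, and the 10° tolerance are our own assumptions, not the paper's implementation. We read "same direction" as the local edge orientation (the image gradient rotated by 90°) aligning with the predicted boundary segment, since the gradient at an intensity edge is perpendicular to that edge.

```python
import numpy as np

def edge_support(gray, p0, p1, n_samples=32, tol_deg=10.0):
    """Fraction of points sampled along the candidate shadow segment
    p0 -> p1 (pixel coordinates (x, y)) whose local edge orientation
    (the image gradient rotated by 90 degrees) aligns with the segment."""
    gy, gx = np.gradient(gray.astype(float))   # image gradients
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    seg = (p1 - p0) / np.linalg.norm(p1 - p0)  # boundary direction
    hits = 0
    for t in np.linspace(0.0, 1.0, n_samples):
        x, y = p0 + t * (p1 - p0)
        g = np.array([gx[int(round(y)), int(round(x))],
                      gy[int(round(y)), int(round(x))]])
        if np.linalg.norm(g) < 1e-6:           # flat region, no edge here
            continue
        g /= np.linalg.norm(g)
        # edge aligned with segment <=> gradient roughly perpendicular to it
        if abs(np.dot(g, seg)) < np.sin(np.deg2rad(tol_deg)):
            hits += 1
    return hits / n_samples

def angular_difference(fitted_seg, candidate_seg):
    """Angle in degrees between a fitted line segment and the
    corresponding candidate shadow boundary segment."""
    v = np.asarray(fitted_seg[1], float) - np.asarray(fitted_seg[0], float)
    w = np.asarray(candidate_seg[1], float) - np.asarray(candidate_seg[0], float)
    c = abs(np.dot(v, w)) / (np.linalg.norm(v) * np.linalg.norm(w))
    return np.degrees(np.arccos(np.clip(c, 0.0, 1.0)))

def corner_support(seg_a, seg_b, predicted_corner):
    """Distance between the intersection of two fitted segments and the
    predicted image position of the marker corner's cast shadow."""
    a0, a1 = (np.asarray(p, float) for p in seg_a)
    b0, b1 = (np.asarray(p, float) for p in seg_b)
    # Solve a0 + s*(a1 - a0) = b0 + u*(b1 - b0) for the parameters (s, u).
    A = np.column_stack([a1 - a0, b0 - b1])
    s, _ = np.linalg.solve(A, b0 - a0)
    intersection = a0 + s * (a1 - a0)
    return np.linalg.norm(intersection - np.asarray(predicted_corner, float))
```

In this reading, a candidate light source is retained when its boundary segments collect high edge support, small angular differences, and small corner-support distances.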
Summary
Understanding the environmental illumination of a scene is important when rendering virtual objects so that they match the given image; it helps in generating convincing virtual shadows on the real scene [1,2]. Many studies have described the generation of realistic images that reflect the environmental illumination of the scene [3,4,5,6,7]. Some of these studies employed additional camera equipment (e.g., a light probe or a fish-eye camera) to estimate real-world illumination conditions [8,9]. One previous study used mobile sensors (e.g., ambient sensors), a global positioning system (GPS), and a weather application programming interface (API) to estimate outdoor illumination for a mobile AR application [10].