Abstract

Continuous luminance monitoring is challenging because high-dynamic-range cameras are expensive, require programming, and are intrusive when placed near the occupant's field of view. A new semi-automated, non-intrusive framework is presented for monitoring occupant-perceived luminance using a low-cost camera sensor and a Structure-from-Motion (SfM)–Multi-View Stereo (MVS) photogrammetry pipeline. From a short video and a few photos taken at the occupant position, the 3D geometry of the space is automatically reconstructed. The retrieved 3D context enables back-projection of the camera-captured luminance distribution into 3D space, which is in turn re-projected to the occupant's field of view (FOV). The framework was tested and validated in a testbed office. The re-projected luminance field showed good agreement with luminance measured at the occupant position. The method can be used for non-intrusive luminance monitoring and integrated with daylighting control applications.
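
The back-projection and re-projection steps can be illustrated with a minimal sketch, assuming a pinhole camera model, a per-pixel depth map from the MVS reconstruction, and known intrinsics/extrinsics for both the sensor and a virtual camera at the occupant position. All names and the nearest-pixel splatting (with no occlusion handling) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def backproject_luminance(luminance, depth, K, R, t):
    """Back-project per-pixel luminance into 3D world points.

    luminance, depth : (H, W) arrays from the calibrated sensor / MVS step
    K                : (3, 3) camera intrinsics
    R, t             : world-to-camera rotation (3, 3) and translation (3,)
    Returns (N, 3) world points and (N,) luminance samples.
    """
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).astype(float)
    # Rays in camera coordinates, scaled by the MVS depth
    cam_pts = (np.linalg.inv(K) @ pix.T) * depth.reshape(1, -1)
    # Camera -> world: X_w = R^T (X_c - t)
    world_pts = R.T @ (cam_pts - t.reshape(3, 1))
    return world_pts.T, luminance.reshape(-1)

def reproject_to_view(world_pts, lum, K_view, R_view, t_view, shape):
    """Re-project 3D luminance samples into a virtual occupant-FOV image."""
    H, W = shape
    cam = R_view @ world_pts.T + t_view.reshape(3, 1)   # world -> occupant camera
    valid = cam[2] > 0                                    # keep points in front of the view
    pix = K_view @ (cam[:, valid] / cam[2, valid])
    u = np.round(pix[0]).astype(int)
    v = np.round(pix[1]).astype(int)
    inside = (u >= 0) & (u < W) & (v >= 0) & (v < H)
    view = np.full((H, W), np.nan)
    view[v[inside], u[inside]] = lum[valid][inside]       # nearest-pixel splat, no occlusion test
    return view
```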
