Abstract

We present a new approach for building a schematic representation of indoor environments to be conveyed as phosphene images. The proposed method combines several convolutional neural networks to extract and convey relevant information about the scene, such as the structurally informative edges of the environment and the silhouettes of segmented objects. Experiments were conducted with normally sighted subjects using a Simulated Prosthetic Vision (SPV) system.
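The core fusion step can be illustrated with a short sketch: a map of structurally informative edges is merged with the silhouettes of segmented objects into a single schematic image. The snippet below is a minimal illustration that uses OpenCV stand-ins (Canny edges and a precomputed segmentation mask) in place of the convolutional networks described in the paper; the function name, thresholds, and mask input are assumptions for illustration only.

```python
import cv2
import numpy as np

def schematic_representation(image_bgr, object_mask):
    """Fuse structural edges and object silhouettes into one schematic map.

    image_bgr   : HxWx3 indoor scene frame.
    object_mask : HxW binary mask of segmented objects (assumed to come from
                  any semantic segmentation network; precomputed here).
    """
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)

    # Stand-in for the structural-edge network: plain Canny edge detection.
    edges = cv2.Canny(gray, 50, 150)

    # Keep only the outline (silhouette) of each segmented object.
    contours, _ = cv2.findContours(object_mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    silhouettes = np.zeros_like(edges)
    cv2.drawContours(silhouettes, contours, -1, color=255, thickness=2)

    # Schematic image = union of structural edges and object silhouettes.
    return np.maximum(edges, silhouettes)
```

In the actual system both maps would come from dedicated networks, but the fusion into a single binary schematic follows the same pattern.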

Highlights

  • Retinal degenerative diseases such as retinitis pigmentosa and age-related macular degeneration cause loss of vision due to the gradual degeneration of the sensory cells in the retina

  • Among current retinal implant technologies, one of the most active lines of research is based on implants paired with a micro-camera that captures external stimuli and a processor that converts the visual information into microstimulation patterns delivered by the implant

  • We evaluate and compare the proposed semantic and structural image segmentation against baseline methods (Edge and Direct) through a Simulated Prosthetic Vision (SPV) experiment, a standard procedure for noninvasive evaluation with normally sighted subjects (a sketch of such a simulator follows this list)

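Simulated Prosthetic Vision converts the processed image into a low-resolution pattern of phosphenes so that normally sighted subjects can be tested noninvasively. The sketch below shows one common way such a simulator is implemented, sampling the schematic image over a regular grid and rendering each sample as a Gaussian spot; the grid size, phosphene spread, and function name are illustrative assumptions rather than the parameters used in the paper.

```python
import numpy as np

def render_phosphenes(schematic, grid=(32, 32), spot_sigma=0.35):
    """Render a simulated phosphene image from a binary/gray schematic map.

    schematic  : HxW array in [0, 255]; brightness drives phosphene intensity.
    grid       : number of phosphenes along (rows, cols); assumed value.
    spot_sigma : Gaussian spread of each phosphene, in cell units; assumed.
    """
    h, w = schematic.shape
    rows, cols = grid
    cell_h, cell_w = h // rows, w // cols
    out = np.zeros((h, w), dtype=np.float32)

    # Precompute one Gaussian spot the size of a grid cell.
    yy, xx = np.mgrid[0:cell_h, 0:cell_w]
    cy, cx = (cell_h - 1) / 2.0, (cell_w - 1) / 2.0
    spot = np.exp(-(((yy - cy) / (spot_sigma * cell_h)) ** 2 +
                    ((xx - cx) / (spot_sigma * cell_w)) ** 2) / 2.0)

    # Each grid cell's mean brightness sets the intensity of its phosphene.
    for r in range(rows):
        for c in range(cols):
            cell = schematic[r * cell_h:(r + 1) * cell_h,
                             c * cell_w:(c + 1) * cell_w]
            intensity = cell.mean() / 255.0
            out[r * cell_h:(r + 1) * cell_h,
                c * cell_w:(c + 1) * cell_w] = intensity * spot

    return (out * 255).astype(np.uint8)
```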

Summary

Introduction

Retinal degenerative diseases such as retinitis pigmentosa and age-related macular degeneration cause loss of vision due to the gradual degeneration of the sensory cells in the retina. Visual prostheses are currently the most promising technology to improve vision in patients with such degenerative diseases. Implanted patients are able to see patterns of spots of light, called phosphenes, that the brain interprets as visual information [1].

