Abstract

This paper describes the architecture and demonstrates the capabilities of SISPO, a newly developed, physically based imaging-simulation environment for small solar system body fly-by and terrestrial planet surface mission simulations. The simulator utilises the open-source 3-D visualisation system Blender and its Cycles rendering engine, which supports physically based rendering and procedural micropolygon displacement-texture generation. The simulator concentrates on realistic surface rendering and includes supplementary models that produce realistic optical models of the dust and gas environments of comets and active asteroids. The framework also includes tools to simulate the most common image aberrations, such as tangential and sagittal astigmatism, internal and external comatic aberration, and simple geometric distortions. The framework's primary objective is to support small-body space mission design by allowing better simulation-based characterisation of imaging-instrument performance, assisting mission planning, and supporting the development of computer-vision algorithms. SISPO allows the simulation of trajectories, lighting parameters and camera intrinsic parameters.
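As an illustration of the simple geometric distortions mentioned in the abstract, a polynomial radial (Brown–Conrady-style) distortion can be applied to normalised image coordinates. The following is a minimal sketch under that standard model, not SISPO's actual implementation; the coefficient values are hypothetical.

```python
def radial_distort(x, y, k1=0.0, k2=0.0):
    """Apply polynomial radial lens distortion to normalised image coordinates.

    Uses the Brown-Conrady-style model:
        x_d = x * (1 + k1*r^2 + k2*r^4), and similarly for y,
    where r^2 = x^2 + y^2. Negative k1 gives barrel distortion,
    positive k1 gives pincushion distortion.
    """
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor

# Barrel distortion (k1 < 0) pulls points toward the optical axis:
xd, yd = radial_distort(0.5, 0.0, k1=-0.1)
print(xd, yd)  # approximately 0.4875 0.0
```

With both coefficients zero the mapping is the identity, so the undistorted camera model is recovered as a special case.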

Highlights

  • A versatile image-simulation environment is required in order to design advanced deep-space missions, to simulate large sets of mission scenarios in parallel, and to develop and validate algorithms for semi-autonomous operations, visual navigation, localisation and image processing

  • This is especially true in the case of Small Solar System Body (SSSB) mission scenarios, where the mission has to be designed either with very limited information about the target or for targets that remain a near-complete mystery until close encounter

  • The produced renderings are of sufficient quality for deep-space mission design and for algorithm development for semi-autonomous operations, visual navigation, localisation and image processing


Introduction

A versatile image-simulation environment is required in order to design advanced deep-space missions, to simulate large sets of mission scenarios in parallel, and to develop and validate algorithms for semi-autonomous operations, visual navigation, localisation and image processing. The Cycles rendering engine supports various shaders, the most relevant for this paper being: (i) the diffuse bidirectional scattering distribution function, which provides access to Lambertian and Oren–Nayar shaders based on surface roughness; (ii) the emission shader, which allows surfaces or volumes to emit light; (iii) subsurface scattering, which supports cubic, Gaussian and Christensen–Burley models; (iv) the glossy shader, which supports Sharp, Beckmann, GGX, Ashikhmin–Shirley and multi-scatter GGX models; and (v) the volume scattering shader, which allows simulating light scattering in volumes [15]. These shaders can be combined with either procedurally generated or pre-existing texture maps to change their parameters and to mix various shaders on and within the scene models. The main rendering discrepancies are induced by the inaccuracy of the 3-D model [14] and by local variations of albedo and roughness.
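For intuition on how surface roughness enters the diffuse shader, the qualitative Oren–Nayar reflectance model can be sketched in a few lines; with roughness σ = 0 it reduces to the Lambertian case. This is an illustrative implementation of the standard published model, not Cycles' internal code.

```python
import math

def oren_nayar(theta_i, theta_r, phi_diff, sigma, albedo=1.0):
    """Qualitative Oren-Nayar diffuse reflectance.

    theta_i, theta_r: incident and reflected zenith angles (radians);
    phi_diff: azimuthal difference between incident and reflected
    directions (radians); sigma: surface roughness parameter.
    Reduces to the Lambertian model (albedo/pi * cos(theta_i))
    when sigma = 0.
    """
    s2 = sigma * sigma
    A = 1.0 - 0.5 * s2 / (s2 + 0.33)
    B = 0.45 * s2 / (s2 + 0.09)
    alpha = max(theta_i, theta_r)
    beta = min(theta_i, theta_r)
    return (albedo / math.pi) * math.cos(theta_i) * (
        A + B * max(0.0, math.cos(phi_diff)) * math.sin(alpha) * math.tan(beta)
    )

# With sigma = 0 the surface behaves as a Lambertian scatterer:
lambert = math.cos(0.3) / math.pi
print(abs(oren_nayar(0.3, 0.6, 0.0, sigma=0.0) - lambert) < 1e-12)  # True
```

Because the A and B coefficients depend only on σ, roughness can be driven per-pixel by a texture map, which is how such shader parameters are typically varied across a scene model.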

