Abstract

The precipitous growth of the Anthropogenic Space Object population demands new approaches to high-volume detection and measurement for orbital analyses and conjunction assessment. Optical sensors, which can be smaller and more cost-effective than alternatives such as radar sensors, have the potential for rapid network expansion, community-driven data acquisition, and space-based sensing. However, traditional detection methods based on source extraction and track filtering require high-fidelity a priori assumptions about the data content and are limited by extensive calibration requirements that cost time, resources, and expert knowledge of the particular detection pipeline. Machine learning methods mitigate a priori assumptions but require extensive training data, which is costly to label manually and restricts the applicable use cases. We instead consider a new approach to optical Space Surveillance and Tracking based on a-contrario analysis, detecting features that cannot be attributed to noise from the hardware, the atmosphere, or the background celestial environment. In this approach, we reduce epistemic uncertainty by recursive conditioning and inference of data structures. The approach is implemented and assessed on data from the ASTRIANet telescope network of The University of Texas at Austin, and we demonstrate the capacity for orbit determination by quantifying detection rates for actively tracked objects, discussing applicability to sidereal tracking and object discovery, and evaluating the resulting orbit estimates.
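
The abstract does not specify the detection pipeline, but the core a-contrario decision rule it alludes to can be sketched. The sketch below is an illustrative reading of the standard a-contrario framework, not the paper's implementation: it assumes a binomial background model in which each pixel of a candidate region independently exceeds a brightness threshold with some probability under noise, and declares a detection when the Number of False Alarms (NFA) falls below a tolerance epsilon. The function name `nfa`, all parameter names, and the numbers in the usage example are hypothetical.

```python
from scipy import stats

def nfa(n_pixels: int, k_exceed: int, p_noise: float, n_tests: int) -> float:
    """Number of False Alarms for one candidate region.

    Under the null hypothesis (pure background noise), each of the
    region's n_pixels independently exceeds the brightness threshold
    with probability p_noise, so the count of exceedances is binomial.
    NFA = n_tests * P(X >= k_exceed); a region is epsilon-meaningful,
    i.e. detected, when NFA < epsilon (commonly epsilon = 1).
    """
    # Binomial survival function at k-1 gives the tail P(X >= k_exceed).
    return n_tests * stats.binom.sf(k_exceed - 1, n_pixels, p_noise)

# Per-pixel exceedance probability for a 3-sigma threshold under a
# calibrated Gaussian background model (an assumed noise model).
p = stats.norm.sf(3.0)  # ~1.35e-3

# Hypothetical candidate streak: 40 of its 60 pixels exceed the threshold,
# tested against every region the detector enumerates (n_tests).
print(nfa(n_pixels=60, k_exceed=40, p_noise=p, n_tests=10**7))  # NFA << 1
```

The appeal of this decision rule for the use case described above is that it needs only a noise model, not labeled training data or object-specific priors: epsilon directly bounds the expected number of detections arising from noise alone.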
