Abstract

Purpose. To substantiate and select a neural network architecture capable of implementing the cognitive functions of the network software that controls a constellation of interacting small spacecraft.

Methods. The work draws on AI-based approaches to controlling a constellation of small spacecraft: adaptive methods and tools that support decision-making similar to the mechanisms of human reasoning. For space communication systems with a heterogeneous structure, AI methods and technologies target two processes: predicting the state of the communication channels between network nodes and automatically reconfiguring the network of devices based on the training of a neural network (NN).

Results. In the training and forecasting mode, the inputs must be time series of parameters and coordinates of specific pairs of small spacecraft with a non-zero line of sight. Recurrent neural networks (RNNs), in particular LSTM, are designed specifically for time series analysis. The idea of RNN operation is to feed into the current forecast not only the state vectors and coordinates of the spacecraft pair, but also the previous value of the communication quality, whether measured or predicted. The paper shows that the onboard computing power of an individual small spacecraft is insufficient to perform forecasting and training on board. A dedicated ground segment for forecasting and monitoring is therefore required: it collects a posteriori information, periodically trains the cognitive model, uses the model to predict communication quality, and transmits the results to the network nodes for building data transmission routes.

Conclusion. The analysis of current solutions and the selection of a neural network architecture for implementing the cognitive functions of the network software controlling the constellation of interacting small spacecraft showed that Transformer networks, which are based on the self-attention mechanism, meet the project requirements most fully. The Transformer architecture can use the entire body of a priori data and offers high training and forecasting speed.
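As a minimal sketch of the recurrent forecasting idea described in the Results (not the authors' actual model), the snippet below feeds each time step with the spacecraft pair's state-vector features together with the previous link-quality value. The feature layout, layer sizes, and the `LinkQualityLSTM` name are illustrative assumptions.

```python
# Sketch only: each input step = state vectors/coordinates of one spacecraft
# pair + the previous (measured or predicted) communication quality.
import torch
import torch.nn as nn

class LinkQualityLSTM(nn.Module):
    def __init__(self, state_dim: int = 12, hidden_dim: int = 64):
        super().__init__()
        # input = pair state features + 1 channel for previous link quality
        self.lstm = nn.LSTM(input_size=state_dim + 1,
                            hidden_size=hidden_dim,
                            batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)  # predicted link quality

    def forward(self, states: torch.Tensor, prev_quality: torch.Tensor) -> torch.Tensor:
        # states: (batch, time, state_dim); prev_quality: (batch, time, 1)
        x = torch.cat([states, prev_quality], dim=-1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])  # forecast for the next step

# Example with random stand-in data: 30-step history for one spacecraft pair.
model = LinkQualityLSTM()
states = torch.randn(1, 30, 12)     # orbital state vectors / coordinates
prev_q = torch.rand(1, 30, 1)       # previously observed link quality
print(model(states, prev_q).shape)  # -> torch.Size([1, 1])
```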

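For contrast with the architecture chosen in the Conclusion, the following equally hypothetical sketch applies a Transformer encoder to the same per-pair time series: self-attention attends to the whole a priori history at once and trains in parallel across time steps, which is the property the abstract cites in favor of this architecture. All hyperparameters are assumptions, not values from the paper.

```python
# Sketch only: Transformer encoder over the same (states + previous quality) series.
import torch
import torch.nn as nn

class LinkQualityTransformer(nn.Module):
    def __init__(self, state_dim: int = 12, d_model: int = 64,
                 nhead: int = 4, num_layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(state_dim + 1, d_model)  # states + previous quality
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)

    def forward(self, states: torch.Tensor, prev_quality: torch.Tensor) -> torch.Tensor:
        x = self.embed(torch.cat([states, prev_quality], dim=-1))
        h = self.encoder(x)         # self-attention over the full history
        return self.head(h[:, -1])  # link-quality forecast for the next step

model = LinkQualityTransformer()
print(model(torch.randn(1, 30, 12), torch.rand(1, 30, 1)).shape)  # torch.Size([1, 1])
```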