Abstract

The memory restrictions that arise when single distribution resampling is used on particular computing devices create difficulties for developers, because of the additional time and effort needed to develop a particle filter. A new sequential resampling algorithm is therefore needed that is flexible enough to be used with a range of computing devices. This paper formulates such an algorithm: adaptive memory size-based single distribution resampling (AMSSDR). AMSSDR integrates traditional variation resampling and traditional resampling in a single architecture, switching between resampling algorithms according to the memory available on the computing device. This lets developers formulate a particle filter without having to reconsider the device's memory utilisation for every new filter they develop. At the start of the operational process, the AMSSDR selector chooses an appropriate resampling algorithm (for example, rounding copy resampling or systematic resampling) based on the computing device's current physical memory. If systematic resampling is chosen, the resampler draws one sample per particle in every cycle; if rounding copy resampling is chosen, it can draw more than one copy of a particle per cycle. This illustrates that the proposed method (AMSSDR) can switch resampling algorithms to suit different physical memory requirements. The authors aim to extend this research in the future by applying the proposed method in emerging applications such as real-time locator systems and medical applications.
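The contrast between the two resampling modes the abstract names can be sketched as follows. This is a minimal illustration, not code from the paper: the function names, the memory threshold, and the selector signature are assumptions made for the example, and normalised weights are assumed throughout.

```python
import random

def systematic_resampling(weights):
    """Systematic resampling: a single uniform draw places n evenly
    spaced positions over the cumulative weights, so exactly one
    sample is drawn per particle slot in every cycle."""
    n = len(weights)
    u = random.random() / n          # single uniform draw in [0, 1/n)
    cum, total = [], 0.0
    for w in weights:                # cumulative sum of the weights
        total += w
        cum.append(total)
    indices, j = [], 0
    for i in range(n):
        p = u + i / n                # evenly spaced sampling positions
        while j < n - 1 and cum[j] < p:
            j += 1
        indices.append(j)
    return indices

def rounding_copy_resampling(weights):
    """Rounding copy resampling: particle i is copied round(n * w_i)
    times, so a single cycle can emit more than one copy of a particle."""
    n = len(weights)
    indices = []
    for i, w in enumerate(weights):
        indices.extend([i] * round(n * w))
    return indices

def amssdr_select(weights, free_memory_bytes, threshold_bytes):
    """Hypothetical AMSSDR-style selector: picks the resampling variant
    from the device's currently available physical memory. The threshold
    is illustrative, not a value taken from the paper."""
    if free_memory_bytes >= threshold_bytes:
        return systematic_resampling(weights)
    return rounding_copy_resampling(weights)
```

The selector simply dispatches on available memory, which matches the abstract's description of choosing an algorithm at the start of the operational process; the actual AMSSDR selection criteria are given in the paper itself.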

Highlights

  • Over the past two decades, particle filtering has emerged as a procedure for sequential signal processing (Refs. [1,2,3,4] present reviews)

  • Memory-related restrictions on using single distribution resampling with particular computing devices cause difficulties for developers, because of the additional time and effort needed to develop a particle filter

Introduction

Over the past two decades, particle filtering has emerged as a procedure for sequential signal processing (Refs. [1,2,3,4] present reviews). One such algorithm is residual systematic resampling. It accumulates the fractional contribution of each particle until a sample can be generated, in a manner similar to the accumulation idea used in the systematic resampling method. Restricting the use of single distribution resampling because of the memory of particular computing devices has proven difficult for developers, as it requires additional time and effort during particle filter development. A new sequential resampling algorithm was therefore needed. This section presents the objective of the current paper.

