People-following in a leader-follower scheme is a major contemporary research problem in the domain of human-robot coexistence. The problem essentially requires human detection and tracking, and, in recent times, vision-based shoe detection has emerged as an effective mechanism for human detection. The problem becomes more challenging when the environment is affected by photometric conditions such as varying illumination, shadows, and specularities. Recently, a state-of-the-art fast image template matching algorithm, called photometric-invariant CFAsT-match (PICFAsT-match), was specifically proposed for shoe-detection-based people-following in such challenging environments. In this approach, the best shoe matching result for a frame is determined by evaluating every transformation contained in a number of specified grids of general affine transformations. However, to achieve a high-speed real-life implementation of PICFAsT-match, each transformation is assessed using only a small number of pixels sampled randomly from the images. Investigation beyond PICFAsT-match reveals that the matching accuracy relies significantly on the quality of this random sampling. In this paper, we propose an improved variant of PICFAsT-match, called chaos-based PICFAsT-match (CBPICFAsT-match), in which a novel strategy for the random sampling of pixels is developed. This strategy is based on the state-time histories of multiple fractional-order chaotic systems. Experimental results demonstrate the superiority of the proposed algorithm over competing state-of-the-art algorithms employed for shoe detection in human-robot coexisting environments.
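To illustrate the core idea of replacing pseudo-random pixel sampling with a deterministic chaotic sequence, the following sketch uses a classical logistic map as the chaotic generator. This is a simplified stand-in: the proposed method draws on state-time histories of multiple fractional-order chaotic systems, and the function name and parameters below are hypothetical, not taken from the paper.

```python
def chaotic_pixel_sample(n_pixels, img_h, img_w, x0=0.37, r=3.99, burn_in=100):
    """Sample (row, col) pixel coordinates by iterating a logistic map.

    Simplified illustration only: the paper's CBPICFAsT-match uses
    state-time histories of multiple fractional-order chaotic systems;
    here a single classical logistic map x -> r*x*(1-x) stands in as
    the chaotic sequence generator.
    """
    x = x0
    # Discard transient iterations so the orbit settles onto the attractor.
    for _ in range(burn_in):
        x = r * x * (1.0 - x)
    coords = []
    for _ in range(n_pixels):
        # Map one chaotic iterate to a row index, the next to a column index.
        x = r * x * (1.0 - x)
        row = int(x * img_h) % img_h
        x = r * x * (1.0 - x)
        col = int(x * img_w) % img_w
        coords.append((row, col))
    return coords
```

Because the sequence is fully determined by the initial condition and map parameter, the same pixel set can be reproduced across frames or runs, while the chaotic orbit still spreads samples broadly over the image, which is the property the sampling quality of template matching depends on.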