Recent advances in visual motion detection and interpretation have enabled the emergence of new robotic systems for autonomous and active surveillance. In this line of research, the current work addresses motion perception by proposing a novel technique that analyzes dense flow fields and distinguishes several regions governed by distinct motion models. The method, called Wise Optical Flow Clustering (WOFC), extracts the moving objects by performing two consecutive operations: evaluation and resetting. Motion properties of the flow field are retrieved and described in the evaluation phase, which provides high-level information about the spatial segmentation of the flow field. During the resetting operation, these properties are combined and used to feed a guided segmentation approach. Since WOFC requires the number of motion models as input, this paper also introduces a model selection method based on a Bayesian approach that balances model fitness against complexity. It combines the correlation of a histogram-based analysis with the decay ratio of the normalized entropy criterion. This approach interprets the flow field and provides an estimate of the number of moving objects. Experiments conducted in a realistic environment show that WOFC presents several advantages that meet the requirements of common robotic and surveillance applications: it is computationally efficient and provides a pixel-wise segmentation, in contrast to other state-of-the-art methods.
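
The abstract only sketches the model selection step, so the following is a minimal, purely illustrative Python sketch of the general idea: estimating the number of motion models by fitting mixture models to the flow vectors and scoring each candidate with a normalized entropy criterion. This is the standard NEC for Gaussian mixtures applied to flow vectors, not the authors' implementation; the function name `estimate_num_models`, the use of scikit-learn's `GaussianMixture`, and the candidate range `k_max` are assumptions made for illustration.

```python
# Illustrative sketch (assumptions, not the paper's method): estimate the number
# of motion models in a dense flow field with a normalized entropy criterion.
import numpy as np
from sklearn.mixture import GaussianMixture


def estimate_num_models(flow, k_max=6):
    """Return an estimate of the number of motion models in a dense flow field.

    flow: (H, W, 2) array of per-pixel (u, v) displacements.
    Candidate mixtures with k = 2..k_max components are fitted to the flow
    vectors, and the k minimizing the normalized entropy criterion
    NEC(k) = E(k) / (L(k) - L(1)) is selected, where E(k) is the entropy of the
    posterior assignments and L(k) the total log-likelihood.
    """
    vectors = flow.reshape(-1, 2)
    n = len(vectors)

    # Total log-likelihood of the single-model (k = 1) fit, used as the baseline.
    ll1 = GaussianMixture(n_components=1, random_state=0).fit(vectors).score(vectors) * n

    best_k, best_nec = 1, np.inf
    for k in range(2, k_max + 1):
        gmm = GaussianMixture(n_components=k, random_state=0).fit(vectors)
        llk = gmm.score(vectors) * n
        if llk <= ll1:
            # No likelihood gain over a single model; skip this candidate.
            continue
        resp = np.clip(gmm.predict_proba(vectors), 1e-12, 1.0)
        entropy = -(resp * np.log(resp)).sum()  # classification entropy E(k)
        nec = entropy / (llk - ll1)
        if nec < best_nec:
            best_nec, best_k = nec, k
    return best_k
```

As a usage example, a synthetic flow field containing two translating regions, e.g. `flow = np.concatenate([np.full((60, 80, 2), 3.0), np.full((60, 80, 2), -2.0)], axis=0)` with a little added noise, should yield an estimate of two motion models; the paper's criterion additionally incorporates a histogram-based correlation term, which is omitted here.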