Accurately and reliably obtaining the three-dimensional motion data of individuals in fish schools is valuable not only for fish behaviour analysis and hydrodynamics studies but also for areas such as bio-inspired robot design. Video tracking is the most effective way to obtain quantitative motion data of continuously moving objects. In this paper we propose a method for obtaining the quantitative three-dimensional trajectories of individuals in fish schools. The proposed method works on videos captured by multiple synchronized cameras, arranged according to the suggested three-camera imaging system. The method follows the master-slave paradigm: it first tracks fish in the master view with the help of a convolutional neural network, and then associates each tracked fish in the master view with detections in the slave view by formulating the cross-view data association as a series of moment-wise linear assignment problems. Experiments were conducted on public datasets to comprehensively evaluate the performance, and the proposed method outperforms other state-of-the-art methods.
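To illustrate the moment-wise (per-frame) linear assignment idea mentioned above, the following minimal Python sketch poses cross-view association as a linear assignment problem and solves it with SciPy. It is not the authors' implementation: the epipolar-distance cost, the fundamental matrix `F`, the gating threshold `max_cost`, and the helper names are all illustrative assumptions.

```python
# Illustrative sketch (not the authors' code): per-frame cross-view association
# posed as a linear assignment problem, with an assumed cost based on the
# distance from each slave-view detection to the epipolar line of each
# master-view fish.
import numpy as np
from scipy.optimize import linear_sum_assignment


def epipolar_cost(master_pts, slave_pts, F):
    """Cost matrix: distance of each slave-view point to the epipolar line
    induced by each master-view point through the fundamental matrix F."""
    # Homogeneous coordinates for the master-view points.
    m_h = np.hstack([master_pts, np.ones((len(master_pts), 1))])
    lines = m_h @ F.T  # epipolar lines (a, b, c) in the slave view
    cost = np.empty((len(master_pts), len(slave_pts)))
    for i, (a, b, c) in enumerate(lines):
        norm = np.hypot(a, b) + 1e-12
        # Point-to-line distance |ax + by + c| / sqrt(a^2 + b^2).
        cost[i] = np.abs(slave_pts @ np.array([a, b]) + c) / norm
    return cost


def associate_frame(master_pts, slave_pts, F, max_cost=20.0):
    """Solve one moment-wise assignment; return matched (master, slave) pairs."""
    cost = epipolar_cost(master_pts, slave_pts, F)
    rows, cols = linear_sum_assignment(cost)
    # Gate out pairs whose cost exceeds the (assumed) threshold.
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_cost]
```

Solving one such assignment per frame keeps every master-view fish matched to at most one slave-view detection at that moment, which is the essential constraint behind the moment-wise formulation.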