A new fast method for pupil detection and real-time eye tracking is being developed. It is based on studying the boundary (edge) model of a grayscale image produced by the Laplacian of Gaussian operator and on a newly proposed accumulated-differences descriptor (a point identifier) that expresses how equidistant a point is from the boundaries of a relatively monotonic region (for example, the pupil of the eye). The descriptor rests on the assumption that the pupil is the most rounded monotonic region in the frame with a sharp brightness difference at its border, and that the region's pixels have intensity below a predetermined threshold (although the pupil need not be the darkest region in the image). By taking all of these characteristics of the pupil into account, the descriptor achieves high accuracy in detecting the pupil's center and size, in contrast to threshold-segmentation methods that assume the pupil is the darkest area, morphological methods (recursive morphological erosion), correlation methods, and methods that examine only the edge model of the image (the Hough transform and its variants with two- and three-dimensional parameter spaces, the Starburst algorithm, the Swirski method, RANSAC, ElSe). The possibility of posing the pupil-tracking problem as a multidimensional unconstrained optimization problem, solved by the derivative-free Hooke-Jeeves method with the descriptor as the objective function, is investigated. In this formulation there is no need to compute the descriptor at every point of the image (i.e., to build a dedicated accumulator function), which significantly speeds up the method. The proposed descriptor and method were analyzed, and a software package was developed in Python 3 (visualization) and C++ (tracking kernel) at the laboratory of the Faculty of Physics and Mathematics of Vitus Bering Kamchatka State University; it illustrates the operation of the method and tracks the pupil in real time.
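The abstract does not give the descriptor's exact formula, so the following is a minimal sketch of one plausible reading of it: rays are cast from a candidate point to the nearest strong Laplacian of Gaussian boundary response, and the spread of the resulting ray lengths serves as the measure of equidistance (zero for a point perfectly centered in a circular region). The ray-casting scheme, the function names log_edge_map and descriptor, and the thresholds edge_t (boundary strength) and int_t (maximum pupil intensity) are all illustrative assumptions; SciPy's gaussian_laplace stands in for the boundary-step model of the grayscale frame.

import numpy as np
from scipy.ndimage import gaussian_laplace

def log_edge_map(gray, sigma=2.0):
    """Boundary model: magnitude of the Laplacian of Gaussian response."""
    return np.abs(gaussian_laplace(gray.astype(float), sigma=sigma))

def descriptor(gray, log_map, x, y, n_rays=16, max_r=60,
               edge_t=4.0, int_t=80):
    """Accumulated difference of boundary distances around point (x, y).

    Smaller values mean the point is more equidistant from the
    boundaries of the surrounding monotonic region (pupil-like).
    """
    h, w = gray.shape
    # pupil pixels are assumed darker than int_t, though the pupil
    # need not be the darkest region in the image
    if not (0 <= x < w and 0 <= y < h) or gray[y, x] >= int_t:
        return np.inf
    dists = []
    for k in range(n_rays):
        a = 2.0 * np.pi * k / n_rays
        dx, dy = np.cos(a), np.sin(a)
        r = max_r  # default if no boundary is met within max_r pixels
        for t in range(1, max_r):
            px = int(round(x + t * dx))
            py = int(round(y + t * dy))
            # stop at the image border or the first strong LoG response
            if not (0 <= px < w and 0 <= py < h) or log_map[py, px] > edge_t:
                r = t
                break
        dists.append(r)
    d = np.asarray(dists, dtype=float)
    # sum of absolute deviations of the ray lengths from their mean:
    # zero for a point centered in a circular monotonic region
    return float(np.sum(np.abs(d - d.mean())))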
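The Hooke-Jeeves pattern search itself is a standard derivative-free method; below is a minimal sketch of how it could drive the tracker under the assumptions above, with the descriptor as the objective function so that only a handful of points are evaluated per frame rather than an image-wide accumulator. The step size, shrink factor, stopping tolerance, and the names hooke_jeeves and prev_center are illustrative, not taken from the paper.

import numpy as np

def hooke_jeeves(f, x0, step=8.0, shrink=0.5, tol=1.0):
    """Minimize f over pixel coordinates without using gradients."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    while step >= tol:
        # exploratory moves: probe +/- step along each coordinate
        xe, fe = x.copy(), fx
        for i in range(len(x)):
            for d in (step, -step):
                trial = xe.copy()
                trial[i] += d
                ft = f(trial)
                if ft < fe:
                    xe, fe = trial, ft
                    break
        if fe < fx:
            # pattern move: jump further along the successful direction
            xp = xe + (xe - x)
            fp = f(xp)
            x, fx = (xp, fp) if fp < fe else (xe, fe)
        else:
            step *= shrink  # no improvement: refine the step size
    return x, fx

# usage sketch: start from the previous frame's pupil center
# (prev_center is a hypothetical name for that state)
# obj = lambda p: descriptor(gray, log_map, int(p[0]), int(p[1]))
# center, score = hooke_jeeves(obj, prev_center)

Seeding the search with the previous frame's center keeps the number of descriptor evaluations per frame small and roughly constant, which is consistent with the abstract's claim that avoiding a full accumulator function significantly speeds up the method.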