SUMMARY

Exhaustive searches in regular grids are a traditional and effective method for inversion, that is, for the numerical solution of systems of non-linear equations which cannot be solved using formal algebraic techniques. However, this technique is effective only for very few (3–4) variables and is slow. Recently, the first limitation was to a large degree overcome by the new TOPological INVersion (TOPINV) algorithm, which has been used to invert systems with up to 18 or even more unknown variables. The novelty of this algorithm is that it is not based on the principle of the minimum mean misfit (cost function) between observations and model predictions used by most inversion techniques. Instead, the new algorithm tests, for each gridpoint, whether the misfit of each observation lies within a specified uncertainty interval, and stores clusters of 'successful' gridpoints in matrix form. These clusters (ensembles, sets) of gridpoints are then tested against certain criteria and used to compute one or more optimal statistical solutions. The new algorithm is efficient for highly non-linear problems with high measurement uncertainties (low signal-to-noise ratio, SNR) and poor distributions of observations, that is, problems leading to complicated 3-D mean-misfit surfaces without dominant peaks, but it is slow when running on common computers. To overcome this limitation, we used GPUs, which permit parallel processing on common computers, but faced another computational problem: GPU parallel processing supports grids of only up to three dimensions.

To solve this problem, we used CUDA programming and optimized the distribution of the computational load across all GPU cores. This leads to speedups of up to 100x relative to common CPU processing, as derived from comparative tests with synthetic data for two typical geophysical inversion problems with up to 18 unknown variables: Mogi magma source modeling and elastic dislocation modeling of seismic faults. This impressive speedup makes the GPU/CUDA implementation of TOPINV practical even for the low-latency solution of certain geophysical problems.

This speedup in calculations also permitted us to investigate the performance of the new algorithm in relation to the density of the adopted grids. We focused on a typical elastic dislocation problem under unfavorable conditions (poor observation geometry, data with low SNR) and on synthetic observations with noise, so that the difference of each solution from the 'true'/reference value was known (an accuracy-based approach). Application of the algorithm revealed stable, accurate and precise solutions, with quality increasing with grid density. Solution defects (bias), produced mainly by very coarse grids, can be identified through specific diagnostic criteria that dictate finer search grids.
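To make the two key ideas concrete, the minimal CUDA sketch below illustrates (i) how a search grid of arbitrary dimension can be flattened to a 1-D thread index, sidestepping the three-dimensional limit of CUDA launch grids, and (ii) TOPINV's per-gridpoint acceptance test, in which a gridpoint is kept only if every observation's misfit lies within its uncertainty interval. This is an illustrative sketch, not the authors' code: all names (NDIM, NOBS, forward_model, decode_gridpoint, topinv_kernel) are hypothetical, and a dummy forward model stands in for the Mogi and dislocation formulas.

```cuda
#include <cstdio>
#include <cstdint>
#include <cuda_runtime.h>

#define NDIM 6   // unknowns in this demo; the same flattening scales to 18
#define NOBS 8   // number of observations (illustrative)

// Search-grid geometry in constant memory: per-dimension origin, step, count.
__constant__ double d_origin[NDIM];
__constant__ double d_step[NDIM];
__constant__ int    d_npts[NDIM];

// Observations and their uncertainty half-widths (tolerance intervals).
__constant__ double d_obs[NOBS];
__constant__ double d_tol[NOBS];

// Dummy forward model predicting observation i for model vector m; in the
// paper's problems this would be a Mogi source or elastic dislocation model.
__device__ double forward_model(const double *m, int i)
{
    double p = 0.0;
    for (int k = 0; k < NDIM; ++k)
        p += m[k] * (k + 1) / (i + 1.0);  // placeholder arithmetic only
    return p;
}

// Decode a flat 1-D index into NDIM mixed-radix grid coordinates and then
// into parameter values: the trick that sidesteps CUDA's 3-D launch limit.
__device__ void decode_gridpoint(uint64_t idx, double *m)
{
    for (int k = 0; k < NDIM; ++k) {
        int c = (int)(idx % (uint64_t)d_npts[k]);
        idx  /= (uint64_t)d_npts[k];
        m[k]  = d_origin[k] + c * d_step[k];
    }
}

// One thread = one gridpoint; a point is 'successful' only if the misfit of
// EVERY observation lies within its uncertainty interval. Flags are later
// gathered into clusters on the host.
__global__ void topinv_kernel(uint64_t total, uint8_t *accepted)
{
    uint64_t idx = (uint64_t)blockIdx.x * blockDim.x + threadIdx.x;
    if (idx >= total) return;

    double m[NDIM];
    decode_gridpoint(idx, m);

    bool ok = true;
    for (int i = 0; i < NOBS && ok; ++i)
        ok = fabs(forward_model(m, i) - d_obs[i]) <= d_tol[i];

    accepted[idx] = ok ? 1 : 0;
}

int main()
{
    double origin[NDIM], step[NDIM], obs[NOBS], tol[NOBS];
    int npts[NDIM];
    uint64_t total = 1;
    for (int k = 0; k < NDIM; ++k) {
        origin[k] = 0.0; step[k] = 0.1; npts[k] = 8;
        total *= (uint64_t)npts[k];       // 8^6 = 262,144 gridpoints
    }
    for (int i = 0; i < NOBS; ++i) { obs[i] = 1.0; tol[i] = 0.5; }

    cudaMemcpyToSymbol(d_origin, origin, sizeof(origin));
    cudaMemcpyToSymbol(d_step,   step,   sizeof(step));
    cudaMemcpyToSymbol(d_npts,   npts,   sizeof(npts));
    cudaMemcpyToSymbol(d_obs,    obs,    sizeof(obs));
    cudaMemcpyToSymbol(d_tol,    tol,    sizeof(tol));

    uint8_t *d_accepted;
    cudaMalloc(&d_accepted, total);

    const int threads = 256;
    const unsigned blocks = (unsigned)((total + threads - 1) / threads);
    topinv_kernel<<<blocks, threads>>>(total, d_accepted);
    cudaDeviceSynchronize();

    // Count accepted gridpoints (a GPU reduction would normally do this).
    uint8_t *h = new uint8_t[total];
    cudaMemcpy(h, d_accepted, total, cudaMemcpyDeviceToHost);
    uint64_t n = 0;
    for (uint64_t i = 0; i < total; ++i) n += h[i];
    printf("accepted gridpoints: %llu of %llu\n",
           (unsigned long long)n, (unsigned long long)total);

    delete[] h;
    cudaFree(d_accepted);
    return 0;
}
```

For grids too large for a single launch or for host-side counting, a grid-stride loop and an on-device reduction would be natural refinements; the key point is that the per-gridpoint test is fully independent, which is what lets the exhaustive search map onto thousands of GPU cores.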