Abstract An estimator in the linear model is defined by minimizing an objective function whose derivative is a signed rank statistic. The scores are generated from a function h+ : (0, 1) → [0, ∞), which is not necessarily nondecreasing, as is usually assumed. It is shown that this estimator can be chosen with a maximal breakdown point of 0.5. Moreover, strong consistency and asymptotic normality (with convergence rate n^{-1/2}) of the proposed estimator are proved under various regularity conditions. Because the objective function generally is not convex in the regression parameters, the usual proofs of asymptotic normality do not carry over. Instead, the proof is based on an asymptotic linearity result, similar to that obtained by Huber for M estimates, and some moment estimates for signed rank statistics. Numerical examples illustrate the behavior of the estimator.
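For concreteness, one standard form of such an objective (an illustrative sketch consistent with the abstract, not quoted from the paper; the precise definition appears in the body of the article) can be written as:

```latex
% Illustrative sketch (assumed form): residuals r_i(\beta) = y_i - x_i^{\mathsf T}\beta
% in the linear model, with scores a_n(i) = h^{+}\bigl(i/(n+1)\bigr)
% generated by the score function h^{+} : (0,1) \to [0,\infty).
D_n(\beta) \;=\; \sum_{i=1}^{n} a_n\bigl(R_{in}(\beta)\bigr)\,\lvert r_i(\beta)\rvert,
\qquad R_{in}(\beta) \;=\; \text{rank of } \lvert r_i(\beta)\rvert
\text{ among } \lvert r_1(\beta)\rvert,\dots,\lvert r_n(\beta)\rvert .
% Minimizing D_n over \beta defines the estimator; its (directional)
% derivative in \beta is a signed rank statistic, as stated in the abstract.
% Because h^{+} need not be nondecreasing, D_n is generally nonconvex in \beta,
% which is why the usual convexity-based asymptotic arguments do not apply.
```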