This paper presents a globally convergent model algorithm for the minimization of a locally Lipschitzian function. The algorithm is built on an iteration function of two arguments, and the convergence theory is developed in parallel with analogous results for the problem of solving systems of locally Lipschitzian equations. Application of the theory to a wide range of nonsmooth optimization problems is discussed. These include the minimax problem, the composite optimization problem, the implicit programming problem, and others. A recently developed nonmonotone linesearch technique is shown to be applicable in this nonsmooth context, and an extension to constrained problems is also presented.