A new type of method for unconstrained optimization, called a tensor method, is introduced. It is related in its basic philosophy to the tensor methods for nonlinear equations of Schnabel and Frank [SIAM J. Numer. Anal., 21 (1984), pp. 815–843], but beyond that the methods have significant differences. The tensor method for unconstrained optimization bases each iteration upon a fourth order model of the objective function. This model consists of the quadratic portion of the Taylor series, plus low-rank third and fourth order terms that cause the model to interpolate already calculated function and gradient values from one or more previous iterates. This paper also shows that the costs of forming, storing, and solving the tensor model are not significantly greater than the corresponding costs for a standard method based upon a quadratic Taylor series model. Test results are presented for sets of problems where the Hessian at the minimizer is nonsingular, and where it is singular. On all the test sets, the tensor method solves considerably more problems than a comparable standard method. On problems solved by both methods, the tensor method requires, on average, about half as many iterations and half as many function and derivative evaluations as the standard method.
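The fourth order model described above can be sketched as follows. This is a hedged reconstruction from the abstract alone: the symbols $T_c$ and $V_c$ for the low-rank third and fourth order tensor terms are assumed notation, not taken from this text.

```latex
% Fourth order tensor model about the current iterate x_c, step d.
% The first three terms are the quadratic portion of the Taylor series;
% T_c and V_c (assumed notation) are low-rank third and fourth order
% tensors chosen so that the model interpolates function and gradient
% values at one or more previous iterates.
M_T(x_c + d) \;=\; f(x_c) \;+\; \nabla f(x_c)^T d
  \;+\; \tfrac{1}{2}\, d^T \nabla^2 f(x_c)\, d
  \;+\; \tfrac{1}{6}\, T_c\, d^3
  \;+\; \tfrac{1}{24}\, V_c\, d^4
```

Because the higher order terms are low rank, forming, storing, and minimizing this model costs little more than working with the quadratic model alone, which is the efficiency claim made in the abstract.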