This paper investigates the minimization of the diameter of wire-wound pressure vessels. The study is conducted in two stages. First, the stresses arising from winding the wire onto the vessel and from the application of internal pressure are derived on the basis of the distortion energy (DE) theory. It is assumed that, under internal pressure, the equivalent stress in each layer of the winding region equals the allowable stress of the wire, while in the cylinder region it does not exceed the allowable stress of the cylinder. The required number of layers and the wire tension during winding (two essential unknowns in the industrial winding process) are determined so that these assumptions are satisfied. In the second stage, the diameter of the vessel is optimized using the DE theory. It is found that reducing the cylinder diameter reduces the overall vessel diameter, and that the onset of buckling or yielding in the cylinder is the limiting constraint on this reduction. For validation, the results obtained from the derived equations are compared with the equations given in the ASME code and with finite element results, in which the wire winding around the cylinder is simulated using the element birth-and-death technique. The agreement among the derived equations, the code equations, and the finite element results is excellent. In the case study examined, using the DE theory leads to a 24 % reduction in wire consumption compared with the maximum shear stress (MSS) theory. In the optimization stage, the vessel weight is further reduced by 24 %. Compared with a monobloc shell, the weight of the optimized wire-wound vessel based on the DE theory is reduced by 77 %.
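For reference, the distortion energy (von Mises) criterion invoked in both stages takes the standard form below, written here in terms of principal stresses; the specific stress components for the cylinder and the winding layers are derived in the body of the paper.

$$
\sigma_{\mathrm{eq}} \;=\; \sqrt{\tfrac{1}{2}\!\left[(\sigma_1-\sigma_2)^2 + (\sigma_2-\sigma_3)^2 + (\sigma_3-\sigma_1)^2\right]} \;\le\; \sigma_{\mathrm{allow}}
$$

In the winding region the equality $\sigma_{\mathrm{eq}} = \sigma_{\mathrm{allow,wire}}$ is imposed layer by layer, whereas in the cylinder only the inequality $\sigma_{\mathrm{eq}} \le \sigma_{\mathrm{allow,cyl}}$ is required.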