Convex delay models, such as the Elmore model, the related Logical Effort model, and posynomial and generalized posynomial models, have long been favored by researchers, since convexity guarantees a priori that any local optimum is a global one. However, model accuracy may be sacrificed in the quest for convexity. In this paper, we investigate the use of signomial delay modeling for area/delay optimization. We present a procedure that automatically generates signomial gate delay models by nonlinear least-squares fitting. Compared with posynomial models, signomial models achieve better fits to SPICE-generated data. Signomials, however, are not convex in general; nevertheless, we show via duality arguments that our solutions are near-optimum (within 1%). Our optimization handles beta-ratio constraints, minimum and maximum size constraints for n- and p-transistors, rise/fall delays, and edge rates. Using the IBM 130-nm process, the gate sizes for the fastest-delay solution of a 44,000-cell design can be computed in about 16 min of CPU time on a PC, and a 21-point area-delay tradeoff curve can be generated in about 2 h of CPU time. To the best of our knowledge, this is the first report of a true signomial delay model and its application to optimum gate sizing. In addition, we report performance details of the automatic data fitting for an 11-function library of static CMOS gates.
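As an illustration (not taken from the paper), the following is a minimal sketch of fitting a signomial gate-delay template by nonlinear least squares. The two-term-plus-offset template, the variable names `w` (driver width) and `cl` (load), and the synthetic data are all assumptions for demonstration; the distinguishing feature of a signomial is that monomial terms may carry negative coefficients, which generally breaks convexity.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical signomial delay template (illustrative, not the paper's model):
#   d(w, cl) = k1 * cl * w**(-p) + k2 * w**q + k3
# Unlike a posynomial, a signomial term may have a negative coefficient
# (here the true offset k3 is negative).
def model(params, w, cl):
    k1, p, k2, q, k3 = params
    return k1 * cl * w ** (-p) + k2 * w ** q + k3

# Synthetic "SPICE-like" samples over driver width w and load cl.
rng = np.random.default_rng(0)
w = rng.uniform(1.0, 4.0, 50)
cl = rng.uniform(0.5, 2.0, 50)
true_params = np.array([2.0, 1.0, 0.5, 1.0, -0.3])
d_obs = model(true_params, w, cl)

# Nonlinear least squares fits coefficients and exponents jointly,
# minimizing the residual against the sampled delays.
def residuals(params):
    return model(params, w, cl) - d_obs

fit = least_squares(residuals, x0=[1.0, 1.0, 1.0, 1.0, 0.0])
```

With noise-free synthetic data the fit drives the residual to (near) zero, and the fitted constant term comes out negative, the signature of a true signomial rather than a posynomial.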