This paper describes modifications to two multistart algorithms for global optimization that enable them to find feasible solutions to a system of nonlinear constraints more efficiently. The multistart algorithms, called OptQuest-NLP (OQNLP) and Multistart-NLP (MSNLP), start a local NLP solver from a set of starting points and return the best solution found. Candidate starting points are generated either by a scatter search heuristic or by a randomized process, and two adaptive filters choose a small subset of the candidate points as starting points. The modifications that facilitate feasibility seeking include replacing the exact penalty function used to measure the merit of a candidate point with the sum of infeasibilities, and terminating as soon as a feasible solution is found. We describe experimental results on a large and diverse set of smooth nonlinear nonconvex problems coded in the GAMS modeling language. These problems are chosen so that a single run of a selected solver from the user-specified starting point terminates infeasible, although all of the problems have feasible solutions. Our results show that MSNLP's feasibility mode finds feasible solutions to almost all of these problems. It is moderately faster than MSNLP without feasibility mode, and somewhat better at finding feasible solutions when they exist. Feasibility mode is now an option within MSNLP, and can be invoked by inserting an appropriate record into the options file.
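The feasibility-seeking scheme described above can be illustrated with a minimal sketch. This is not the authors' MSNLP/OQNLP implementation: the candidate generator, the "local solver" (a toy coordinate search), and all names (`sum_infeasibilities`, `local_solve`, `multistart_feasibility`) are illustrative assumptions. It shows the three ingredients from the abstract: scoring candidate points by the sum of infeasibilities, starting the local solver only from the most promising candidates, and stopping at the first feasible point.

```python
import random

def sum_infeasibilities(x, ineqs):
    """Merit value of a point: total violation of constraints g_i(x) <= 0."""
    return sum(max(0.0, g(x)) for g in ineqs)

def local_solve(x, ineqs, step=0.5, iters=200):
    """Toy stand-in for a local NLP solver: coordinate search with step
    halving that reduces the sum of infeasibilities."""
    x = list(x)
    best = sum_infeasibilities(x, ineqs)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = list(x)
                trial[i] += d
                v = sum_infeasibilities(trial, ineqs)
                if v < best:
                    x, best = trial, v
                    improved = True
        if not improved:
            step *= 0.5            # shrink the step when no move helps
            if step < 1e-8:
                break
    return x, best

def multistart_feasibility(ineqs, n_candidates=200, n_starts=10,
                           seed=0, tol=1e-6):
    rng = random.Random(seed)
    # Randomized candidate generation (stand-in for scatter search).
    cands = [[rng.uniform(-5, 5), rng.uniform(-5, 5)]
             for _ in range(n_candidates)]
    # Filter: keep the candidates with the smallest sum of infeasibilities.
    cands.sort(key=lambda p: sum_infeasibilities(p, ineqs))
    for x0 in cands[:n_starts]:
        x, viol = local_solve(x0, ineqs)
        if viol <= tol:            # terminate at the first feasible point
            return x, viol
    return None, float("inf")

# Example system: two inequality constraints g(x) <= 0 with a
# nonempty feasible region (e.g. x = (1, 1) is feasible).
ineqs = [
    lambda x: x[0]**2 + x[1]**2 - 4.0,   # inside a circle of radius 2
    lambda x: 1.0 - x[0] - x[1],         # on or above the line x0 + x1 = 1
]
x, viol = multistart_feasibility(ineqs)
```

Because the run stops at the first feasible point rather than continuing to optimize an objective, the filtered multistart typically needs far fewer local-solver calls than a full optimization run, which is the efficiency gain the feasibility mode aims for.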