The authors present a method for model-based programming and control of a sensing and manipulation system for assembly tasks. The system comprises an off-line subsystem and a runtime subsystem. The off-line subsystem is used to describe the desired task, specify the accompanying workpieces, and compute the object features that the sensor systems can expect to detect. The runtime subsystem corrects the nominal manipulator paths using sensor data. The purpose of the sensor systems is to establish a correspondence between the off-line program and the actual situation. Several different sensor systems are required because each sensor returns only partial information about the task state. Redundant sensor information allows task execution to proceed despite the failure of an individual sensor system, while combining multiple measurements from the sensor systems reduces the effect of noise in the individual sensors. In this article, vision and range sensors are used to demonstrate the method.
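The two benefits attributed to redundant sensing above — tolerating the failure of one sensor and reducing noise by combining measurements — can be illustrated with a standard inverse-variance weighted fusion of scalar readings. This is a minimal sketch of the general idea, not the article's actual runtime algorithm; the `fuse` function, the sensor values, and the convention that a failed sensor reports `None` are all assumptions made for illustration.

```python
def fuse(measurements):
    """Inverse-variance weighted fusion of redundant 1-D sensor readings.

    measurements: list of (value, variance) pairs; a failed sensor is
    represented by None and is simply skipped, so execution proceeds
    as long as at least one sensor still reports.
    Returns the fused (value, variance).
    """
    valid = [m for m in measurements if m is not None]
    if not valid:
        raise ValueError("all sensors failed")
    weights = [1.0 / var for _, var in valid]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, valid)) / total
    # The fused variance 1/total is never larger than the smallest
    # individual variance, which is the noise-reduction benefit.
    return value, 1.0 / total

# Hypothetical example: vision and range sensors both estimate the same
# workpiece offset (in metres), with different noise levels.
vision = (0.012, 0.004)   # noisier estimate
laser = (0.010, 0.001)    # more precise estimate

both = fuse([vision, laser])          # fused estimate, lower variance
vision_failed = fuse([None, laser])   # task still proceeds on one sensor
```

The fused estimate is pulled toward the more precise sensor, and dropping a failed sensor merely widens the fused variance rather than halting the task, matching the redundancy argument in the text.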