Abstract

Indoor farming is expected to have an increasing impact on future food and agricultural systems. It can produce crops year-round without relying on weather patterns, bringing crop production systems closer to advanced manufacturing systems for industrial and consumer goods. Indoor farming also provides an ideal platform for intelligence-driven agricultural systems that can autonomously gather and process data, build a knowledge base, provide management decision support, and execute crop cultural tasks. Towards that goal, a mobile robotics platform (MRP) has been under development to provide new capabilities for indoor farming cyber-physical systems. The ultimate purpose is to equip the MRP with systematically integrated capabilities of perception, reasoning, learning, communication, and task planning and execution for use in precision indoor farming systems. This presentation reports the current state of the MRP's development. The MRP can monitor the growing status of individual strawberry plants and fruit with a perception system and nondestructively harvest individual ripe strawberries with an onboard robot. The robotic system comprises a stereo camera, a six-degree-of-freedom robotic manipulator, and a pneumatically actuated soft gripper (i.e., end-effector) mounted on an autonomous mobile base; these subsystems were selected from commercially available products. An open-loop fruit positioning and handling algorithm, implemented in software, performs the tasks of fruit detection, localization, grasping, detachment, and placement. The location and ripeness of each fruit are obtained simultaneously through deep learning techniques; on a custom indoor strawberry data set, the detector achieved a mean average precision of 96.4% for ripe strawberries.
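The simultaneous detection-and-ripeness step described above can be sketched as a post-processing pass over a detector's outputs. The class names, confidence threshold, and data layout below are illustrative assumptions, not the authors' implementation:

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Detection:
    """One detector output: ripeness class, confidence, and pixel bounding box."""
    label: str                          # e.g., "ripe" or "unripe" (hypothetical class names)
    score: float                        # detector confidence in [0, 1]
    bbox: Tuple[int, int, int, int]     # (x_min, y_min, x_max, y_max) in pixels


def select_harvest_targets(detections: List[Detection],
                           min_score: float = 0.5) -> List[Detection]:
    """Keep only confidently detected ripe fruit, most confident first."""
    ripe = [d for d in detections if d.label == "ripe" and d.score >= min_score]
    return sorted(ripe, key=lambda d: d.score, reverse=True)
```

In a pipeline like the one described, each selected detection would then be passed, together with its depth reading, to the localization and motion-planning stages.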
Three-dimensional (3D) locations of individual strawberries are fed to the motion planner to guide the movements of the mobile platform, manipulator, and end-effector. The robot autonomously classifies each strawberry fruit's growth scene by fusing the detection results with the corresponding depth information, to determine whether to perform the harvesting action. The proposed growth scene categorization approach achieves an accuracy of 89.1% and a weighted-average F1 score of 0.7. The soft gripper can approach and gently hold a target strawberry fruit. A novel fruit detachment approach separates the fruit's sepal from the stalk with a drag-and-rotate motion. Experiments conducted in a commercial indoor strawberry farm showed an overall success rate of 78% in handling harvestable strawberries, with a damage rate of 23% and an average harvesting cycle time of 10.5 s. Poor fruit visibility was the main factor reducing the success rates of target fruit detection and grasping, while the grasping and detachment tasks were also affected by fruit size and stalk length; together, these factors determined the overall harvesting performance of the robotic system. The robotic system performed best when harvesting small fruit (width less than 31 mm and height less than 33 mm) with a medium-length stalk (between 81 and 101 mm). The MRP's functions are expected to be transferable and expandable to other crop production monitoring and cultural tasks.
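A common way to obtain the 3D fruit locations described above is to back-project each detected pixel, together with its stereo depth measurement, through the pinhole camera model. This is a minimal sketch of that step; the intrinsic parameters shown are illustrative, and the authors' exact localization pipeline may differ:

```python
from typing import Tuple


def pixel_to_3d(u: float, v: float, depth: float,
                fx: float, fy: float,
                cx: float, cy: float) -> Tuple[float, float, float]:
    """Back-project pixel (u, v) with measured depth (metres) into a 3D
    point (x, y, z) in the camera frame using the pinhole camera model."""
    x = (u - cx) * depth / fx   # horizontal offset from optical axis
    y = (v - cy) * depth / fy   # vertical offset from optical axis
    return (x, y, depth)


# Illustrative intrinsics for a 640x480 sensor (assumed values):
# focal lengths fx = fy = 600 px, principal point (cx, cy) = (320, 240).
point = pixel_to_3d(380.0, 240.0, 0.5, 600.0, 600.0, 320.0, 240.0)
```

The resulting camera-frame point would then be transformed into the manipulator's base frame before being handed to the motion planner.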