To guarantee maximum service and cost performance, future vehicle sharing platforms will be unmanned, autonomous, and electric, and methods are needed to control the vehicles in such systems. Platforms sharing autonomous electric vehicles must decide which vehicle should pick up which type of customer, and when vehicles should be recharged, based on remaining battery level and proximity to a charging station. We model and analyze this problem. To evaluate vehicle control policies, we first model the system as a semi-open queueing network (SOQN) with multiple synchronization stations that match customer battery-demand classes with vehicles holding sufficient remaining battery capacity. If a vehicle's battery level drops below a threshold, the vehicle is routed to a nearby charging station for partial or full charging. We solve the SOQN model analytically and obtain approximate system performance under known vehicle routings. We then embed the SOQN in a Markov decision process (MDP) to minimize total cost and to derive good heuristic policies. Simulation results indicate that the approximate SOQN model is accurate under given vehicle routings. We then test the performance of different policies on a small-scale network; the experiments show that a state-dependent policy is near optimal. The policies are also tested in a real case with shared vehicles. Partial charging can effectively increase customer throughput when serving large demand with a limited number of vehicles, and dynamic vehicle allocation and charging can reduce system cost substantially. An interesting finding of our research is that reserving some idle vehicles for future short-distance customers, even while long-distance customers are waiting for vehicles, can be beneficial.
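The control logic described above can be sketched as a simple dispatch rule. This is a minimal illustrative sketch, not the paper's actual policy: the threshold value, the battery requirements per demand class, the reservation level, and all names (`Vehicle`, `dispatch`, `CHARGE_THRESHOLD`, `RESERVE_SHORT`) are assumptions introduced here for exposition.

```python
from dataclasses import dataclass

CHARGE_THRESHOLD = 0.2   # assumed: route to charging below 20% battery
RESERVE_SHORT = 1        # assumed: idle vehicles held back for short trips

@dataclass
class Vehicle:
    battery: float  # remaining battery as a fraction of full capacity

def dispatch(vehicles, demand_class):
    """Decide what an idle fleet should do for one waiting customer.

    demand_class: 'short' or 'long'; long trips need more remaining charge.
    Returns ('charge', vehicle), ('serve', vehicle), or ('wait', None).
    """
    need = {"short": 0.3, "long": 0.7}[demand_class]  # assumed requirements

    # Threshold rule: any low-battery vehicle is routed to charging first.
    low = [v for v in vehicles if v.battery < CHARGE_THRESHOLD]
    if low:
        return ("charge", min(low, key=lambda v: v.battery))

    # Synchronization station: match the customer's battery-demand class
    # with a vehicle holding sufficient remaining battery capacity.
    feasible = [v for v in vehicles if v.battery >= need]
    if not feasible:
        return ("wait", None)

    # State-dependent reservation: keep some idle vehicles for future
    # short-distance customers even while a long-distance customer waits.
    if demand_class == "long" and len(feasible) <= RESERVE_SHORT:
        return ("wait", None)

    # Serve with the tightest feasible match to preserve high-battery
    # vehicles for long-distance demand.
    return ("serve", min(feasible, key=lambda v: v.battery))
```

For example, with a fleet `[Vehicle(0.8)]`, a long-distance customer is made to wait because the single feasible vehicle is reserved for possible short-distance demand, reflecting the reservation effect reported in the abstract.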