People use their hands for intricate tasks like playing musical instruments, employing myriad touch sensations to inform motor control. In contrast, current prosthetic hands lack comprehensive haptic feedback and offer only rudimentary multitasking functionality. Limited research has explored whether upper limb amputees can feel, perceive, and respond to multiple channels of simultaneously activated haptic feedback to concurrently control the individual fingers of dexterous prosthetic hands. This study introduces a novel control architecture that enables three amputees and nine additional subjects to concurrently control individual fingers of an artificial hand using two channels of context-specific haptic feedback. Artificial neural networks (ANNs) recognize the subjects’ electromyogram (EMG) patterns that govern the artificial hand controller. ANNs also classify the directions in which objects slip across tactile sensors on the robotic fingertips, and these slip directions are encoded via the vibration frequency of wearable vibrotactile actuators. Subjects implement control strategies with each finger simultaneously to prevent or permit slip as desired, achieving a 94.49% ± 8.79% overall success rate. Although no statistically significant difference exists between amputees’ and non-amputees’ success rates, amputees require more time to respond to simultaneous haptic feedback signals, suggesting a higher cognitive load. Nevertheless, amputees can accurately interpret multiple channels of nuanced haptic feedback to concurrently control individual robotic fingers, addressing the challenge of multitasking with dexterous prosthetic hands.
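The abstract describes mapping a classified slip direction to the vibration frequency of a wearable actuator. The following is a minimal sketch of such an encoding step; the class labels, frequency values, and function names are illustrative assumptions, not the study's actual parameters.

```python
# Hypothetical sketch: encode a slip-direction class (as output by an ANN
# classifier over fingertip tactile-sensor data) as a vibrotactile
# frequency command. All labels and Hz values below are assumptions.

SLIP_FREQ_HZ = {
    "distal": 80,     # object slipping out of the grasp
    "proximal": 160,  # object slipping toward the palm
    "none": 0,        # stable grasp: actuator off
}

def encode_slip(direction: str) -> int:
    """Map a slip-direction class label to a vibration frequency in Hz."""
    if direction not in SLIP_FREQ_HZ:
        raise ValueError(f"unknown slip class: {direction}")
    return SLIP_FREQ_HZ[direction]

print(encode_slip("distal"))  # → 80
```

In a multi-finger setting, one such encoder would run per feedback channel, letting the subject distinguish slip events on different fingers by frequency.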