The sense of touch allows humans to interact delicately with their physical environment. This article reports on a technological advancement in intuitive human-robot interaction that enables an intrinsic robotic sense of touch without artificial skin or tactile instrumentation. On the basis of high-resolution joint-force-torque sensing in a redundant arrangement, the robot can sensitively perceive its surrounding environment and accurately localize, in space and time, touch trajectories that a human applies to its surface. Through an intertwined combination of manifold learning techniques and artificial neural networks, the robot identified and interpreted those touch trajectories as machine-readable letters, symbols, or numbers. This opens up unexplored opportunities for intuitive and flexible interaction between human and robot. Furthermore, we showed that our concept of so-called virtual buttons can be used to straightforwardly implement a tactile communication link, including switches and slider bars, complementary to speech, hardware buttons, and control panels. These interaction elements can be freely placed, moved, and configured at arbitrary locations on the robot structure. The intrinsic sense of touch proposed in this work can serve as the basis for an advanced category of physical human-robot interaction that has not been possible until now, enabling a shift from conventional modalities toward adaptability, flexibility, and intuitive handling.
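The abstract does not disclose the specific algorithms, so the following is only an illustrative sketch of the general idea of classifying touch trajectories via dimensionality reduction followed by a learned classifier. The stroke generator, the choice of linear PCA as the manifold step, and the single-layer softmax classifier are all hypothetical stand-ins, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_stroke(kind, n_points=20, noise=0.02):
    """Synthetic 2D touch trajectory (hypothetical stand-in for contact paths
    reconstructed from joint-torque sensing)."""
    t = np.linspace(0.0, 1.0, n_points)
    if kind == 0:      # horizontal stroke
        xy = np.stack([t, np.zeros_like(t)], axis=1)
    elif kind == 1:    # vertical stroke
        xy = np.stack([np.zeros_like(t), t], axis=1)
    else:              # circular stroke
        xy = np.stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)], axis=1)
    return (xy + rng.normal(0.0, noise, xy.shape)).ravel()

# Labeled dataset of flattened trajectories: three symbol classes, 100 samples each.
X = np.array([make_stroke(k) for k in range(3) for _ in range(100)])
y = np.repeat(np.arange(3), 100)

# Dimensionality-reduction step: linear PCA via SVD (a simple proxy for
# the manifold learning mentioned in the abstract).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:5].T          # project onto the top 5 principal components

# Minimal neural classifier: a single softmax layer trained by gradient descent.
W = np.zeros((5, 3))
b = np.zeros(3)
Y = np.eye(3)[y]           # one-hot labels
for _ in range(300):
    logits = Z @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = (p - Y) / len(Z)  # gradient of mean cross-entropy w.r.t. logits
    W -= 1.0 * Z.T @ grad
    b -= 1.0 * grad.sum(axis=0)

accuracy = float((np.argmax(Z @ W + b, axis=1) == y).mean())
print(f"training accuracy: {accuracy:.2f}")
```

On these cleanly separable synthetic strokes, the projected features are close to linearly separable, so even this minimal pipeline classifies the three classes reliably; real touch trajectories on a robot surface would be noisier and would motivate the richer nonlinear techniques the article refers to.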