In this paper, we describe our own stance on a research area called Humatronics, which aims at establishing a (more) symmetric interaction relationship between humans and computer systems. In particular, we advocate a novel approach to understanding humans that is based on largely involuntary and unconscious physiological information and gaze behavior rather than on purposeful and conscious actions. Understanding humans here refers to assessing users' states related to emotion and affect, attention and interest, and possibly even their intentions. A key feature of our approach is that it provides insight into a person's cognitive-motivational state without relying on cognitive judgements, such as answers to dedicated queries. Lifelike interface agents are endowed with synthetic bodies and faces, and can be considered prime candidates for redressing the asymmetric relationship in current human-computer interaction. As example applications, we report on two recent studies that employed lifelike agents as presenters or as interaction partners of users. The resulting interactions can be conceived as initial steps toward symmetric multimodality in user interfaces.