Social relationships are constructed by and through the relational communication that people exchange. Relational messages are implicit nonverbal and verbal messages that signal how people regard one another and define their interpersonal relationships—equal or unequal, affectionate or hostile, inclusive or exclusive, similar or dissimilar, and so forth. Such signals can now be measured automatically with machine learning tools and combined into meaningful factors that represent the socioemotional expressions constituting relational messages between people. Relational messages operate continuously on a parallel track with verbal communication, implicitly telling interactants the current state of their relationship and how to interpret the verbal messages being exchanged. We report an investigation of how group members signal these implicit messages through multimodal behaviors captured by sensor data and linked to the socioemotional cognitions interpreted as relational messages. Using a modified Brunswikian lens model, we predicted perceived relational messages of dominance, affection, involvement, composure, similarity and trust from automatically measured kinesic, vocalic and linguistic indicators. The relational messages in turn predicted the veracity of group members. The Brunswikian lens model offers a way to connect objective behaviors exhibited by social actors to the emotions and cognitions perceived by other interactants and to link those perceptions to social outcomes. This method can be used to ascertain which behaviors and perceptions are associated with judgments of an actor's veracity. Computerized measurement of behaviors and perceptions can replace manual measurement, significantly expediting analysis and enabling micro-level measurement at a granularity that was previously unattainable.
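
To make the two-stage lens-model logic concrete, the sketch below simulates the analysis pipeline on synthetic data: sensor-derived cues predict perceived relational messages (cue utilization), and those perceptions in turn predict a distal outcome such as veracity. All variable names, cue counts, and data in this sketch are illustrative assumptions, not the study's actual features, ratings, or models.

```python
# Hypothetical sketch of a two-stage lens-model analysis (synthetic data only).
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Stage 0: simulated sensor-derived cues (stand-ins for kinesic, vocalic,
# and linguistic indicators) for 200 speaker turns.
n_turns, n_cues = 200, 6
cues = rng.normal(size=(n_turns, n_cues))

# Simulated observer ratings of the six relational message themes.
themes = ["dominance", "affection", "involvement", "composure", "similarity", "trust"]
true_weights = rng.normal(size=(n_cues, len(themes)))
perceptions = cues @ true_weights + rng.normal(scale=0.5, size=(n_turns, len(themes)))

# Stage 1 (cue utilization): regress each perceived relational message on the cues.
for j, theme in enumerate(themes):
    model = LinearRegression().fit(cues, perceptions[:, j])
    print(f"{theme:12s} R^2 = {model.score(cues, perceptions[:, j]):.2f}")

# Simulated ground-truth veracity (1 = truthful, 0 = deceptive), driven partly
# by the perceptions so the second stage has signal to recover.
logits = 0.5 * (perceptions @ rng.normal(size=len(themes)))
veracity = (logits + rng.normal(size=n_turns) > 0).astype(int)

# Stage 2 (distal outcome): predict veracity from the perceived relational messages.
clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, perceptions, veracity, cv=5).mean()
print(f"Cross-validated veracity accuracy from relational messages: {acc:.2f}")
```

In the actual study, the cue matrix would come from automated behavioral measurement and the perception ratings from human observers; the simulation only illustrates how the two regression stages connect behaviors to perceptions and perceptions to veracity judgments.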