Life-like characters play a vital role in social computing by making human-computer interaction easier and more spontaneous. The use of such characters for interaction in online virtual environments has gained immense popularity. In this paper, we propose a framework for a text-based chat system embodied with a life-like virtual agent that aims at natural communication between users. To achieve this, we developed an agent that performs nonverbal communication, such as generating facial expressions and motions, by analyzing the users' text messages. More specifically, the agent is capable of generating facial expressions for the six basic emotions, namely happiness, sadness, fear, anger, surprise, and disgust, along with two additional emotions, irony and determination. To make the interaction between users more realistic and lively, we also added motions such as eye blinks and head movements. We evaluated the proposed system from several aspects and found the results satisfactory, which leads us to believe that such a system can play a significant role in making an interaction episode more natural, effective, and interesting. Experimental evaluation reveals that the proposed agent can display emotive expressions correctly 93% of the time by analyzing the users' text input.
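To make the text-to-emotion step described above concrete, the following is a minimal keyword-lookup sketch in Python; it is not the classifier used in this work, and the keyword lists, function name, and scoring scheme are illustrative assumptions only.

    # Hypothetical sketch of keyword-based emotion detection from chat text.
    # The emotion labels match those named in the abstract; the keyword
    # lists and overall approach are assumptions, not the paper's method.

    EMOTION_KEYWORDS = {
        "happy":      {"happy", "glad", "great", "yay", ":)"},
        "sad":        {"sad", "unhappy", "miss", "cry", ":("},
        "fear":       {"afraid", "scared", "worried", "terrified"},
        "angry":      {"angry", "furious", "hate", "annoyed"},
        "surprise":   {"wow", "really", "unbelievable", "!?"},
        "disgust":    {"gross", "disgusting", "yuck", "ew"},
        "irony":      {"yeah right", "sure you did", "how original"},
        "determined": {"will do", "no matter what", "definitely"},
    }

    def detect_emotion(message: str) -> str:
        """Return the emotion whose keywords best match the message,
        or 'neutral' when nothing matches."""
        text = message.lower()
        scores = {
            emotion: sum(1 for kw in keywords if kw in text)
            for emotion, keywords in EMOTION_KEYWORDS.items()
        }
        best = max(scores, key=scores.get)
        return best if scores[best] > 0 else "neutral"

    if __name__ == "__main__":
        print(detect_emotion("Wow, I can't believe you did that!"))  # surprise
        print(detect_emotion("I'm so glad to see you :)"))           # happy

In a full system of the kind outlined here, the detected label would then be mapped to the corresponding facial expression and accompanied by idle motions such as eye blinks and head movements.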