Large language models (LLMs) exhibit outstanding reasoning abilities when robots perform tasks, but they are limited to passively receiving information and lack multi-feature perception of real-world objects. Humans, by contrast, obtain complementary and coupled information from interactions with objects through the skin. Here, a multimodal tactile sensory system is reported that assists LLM-equipped robots in actively perceiving and understanding multiple features of objects, including material, compliance, and thermal insulation. A multi-parametric tactile sensing array is constructed to acquire the multi-parametric tactile signals; each sensor in the array comprises a triboelectric unit, a decoupled pressure-temperature dual-mode sensing unit, and a heating unit. The pressure-temperature sensing unit is fabricated from the developed aerogel/sponge nested composite structure, which exhibits stable compression resilience at compressive strains above 90%, impressive fatigue resistance with stable responses after 10,000 compression cycles, and linear thermoelectric responses over wide temperature gradients. The multi-parametric tactile signals are analyzed with a Mamba time-series network model, yielding high-precision clustering of multiple features (accuracy up to 98.3%). Supported by a multimodal interactive perception framework, the LLM achieves multi-feature understanding of objects. With large-scale sample datasets and multi-task training, LLM-equipped robots powered by the multimodal tactile sensory system are expected to realize embodied intelligence.