Wearable systems and supervised learning-based medical image analysis have proven useful as monitoring and diagnostic tools for long COVID-19 cases. Despite their respective benefits, both technical roadmaps have notable drawbacks. Wearable systems allow only real-time monitoring of physiological parameters (heart rate, temperature, and blood oxygen saturation (SpO2)); they are therefore unable to conduct in-depth investigations or to differentiate COVID-19 from other illnesses that share similar symptoms. Supervised learning-based medical image analysis can support in-depth analyses and provide precise diagnostic decision support, but such methods are rarely usable for real-time monitoring. We therefore present an intelligent garment that combines the precision of supervised learning-based models with the real-time monitoring capabilities of wearable systems. Given the relevance of electrocardiogram (ECG) signals to long COVID-19 symptom severity, an explainable data fusion strategy based on multiple machine learning models combines heart rate, temperature, SpO2, and ECG signal analysis to accurately assess the patient's health status. Experiments show that the proposed intelligent garment achieves an accuracy of 97.5%, outperforming most existing wearable systems. Furthermore, the two physiological indicators most significantly affected by long COVID-19 were confirmed to be SpO2 and the ST intervals of ECG signals.
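One way to picture such an explainable fusion is as a weighted combination of normalized indicator scores, with SpO2 and ECG ST deviation weighted most heavily (the two indicators the study found most affected). This is a minimal illustrative sketch only: the thresholds, weights, and function names below are hypothetical placeholders, not the paper's trained models.

```python
def normalize(value, low, high):
    """Map a raw reading to a 0-1 abnormality score (0 = normal, 1 = extreme)."""
    return min(max((value - low) / (high - low), 0.0), 1.0)

def fuse_risk(heart_rate, temperature, spo2, st_deviation_mv):
    """Hypothetical weighted fusion of four indicators into a risk score in [0, 1].

    Returns the overall score plus per-indicator contributions, so the
    decision can be explained in terms of which signals drove it.
    """
    scores = {
        "heart_rate": normalize(heart_rate, 90, 130),       # bpm above resting range
        "temperature": normalize(temperature, 37.5, 39.5),  # degrees Celsius
        "spo2": normalize(95 - spo2, 0, 8),                 # drop below 95 % saturation
        "st": normalize(abs(st_deviation_mv), 0.05, 0.3),   # ST-segment shift in mV
    }
    # Illustrative weights: SpO2 and ST deviation dominate, per the study's finding.
    weights = {"heart_rate": 0.15, "temperature": 0.15, "spo2": 0.35, "st": 0.35}
    contributions = {k: weights[k] * scores[k] for k in scores}
    return sum(contributions.values()), contributions

# A healthy reading scores near zero; an abnormal one is driven mainly by SpO2/ST.
risk, contrib = fuse_risk(heart_rate=110, temperature=38.2, spo2=91, st_deviation_mv=0.2)
```

In a deployed system the weights would come from the trained models rather than being hand-set, but the per-indicator contribution breakdown is what makes the fused decision explainable to a clinician.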