Chest X-ray (CXR) images are commonly used to identify the causes of patients' symptoms, including classes of lung and heart disorders. In visual examination, CXR imaging in the anterior–posterior (A–P) view is a preliminary screening method used by clinicians and radiologists to diagnose possible lung abnormalities, such as pneumothorax (Pt), emphysema (E), infiltration (In), lung cancer (M), pneumonia (P), pulmonary fibrosis (F), and pleural effusion (Ef). However, identifying the causes of multiple abnormalities associated with coexisting conditions is challenging. To rule out a suspected lung disease, the signs and symptoms of the patient's physical condition must be identified before a definitive diagnosis can be made. In addition, low-contrast CXR images and reliance on manual inspection limit automated screening applications. Hence, this study proposes an iterated function system (IFS) and a multilayer fractional-order machine learning classifier to rapidly screen the possible classes of lung diseases within regions of interest on CXR images and to improve screening accuracy. For digital image processing, a two-dimensional (2D) fractional-order convolution is used to enhance symptomatic features. An IFS with nonlinear interpolation functions is then used to reconstruct the 2D feature patterns. The reconstructed patterns are self-affine within a class and thus help distinguish normal subjects from those with lung diseases, improving the accuracy rate. Pooling is performed to reduce the dimensions of the feature patterns and to speed up complex computations. A gray relational analysis (GRA)-based classifier is used to identify the possible classes of the signs and symptoms of lung diseases. For digital CXR images in the A–P view, the proposed multilayer machine learning classifier with k-fold cross-validation shows promising results in screening lung diseases, improving the screening accuracy rate relative to traditional methods.
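As a rough illustration of the fractional-order enhancement step, the sketch below applies a Grünwald–Letnikov fractional difference along image rows and columns, a common realization of 2D fractional-order convolution. The fractional order `v` and window length `n` are illustrative assumptions; the paper's exact mask and parameters are not specified here.

```python
import numpy as np

def gl_coeffs(v, n):
    """Grünwald–Letnikov coefficients c_k = (-1)^k * C(v, k), via the
    recurrence c_k = c_{k-1} * (k - 1 - v) / k (illustrative helper)."""
    c = np.empty(n)
    c[0] = 1.0
    for k in range(1, n):
        c[k] = c[k - 1] * (k - 1 - v) / k
    return c

def frac_enhance(img, v=0.5, n=3):
    """Sketch of 2D fractional-order feature enhancement: backward
    fractional differences along rows and columns, summed. For v = 1
    this reduces to the ordinary first difference (edge detection)."""
    img = np.asarray(img, dtype=float)
    c = gl_coeffs(v, n)
    out = np.zeros_like(img)
    for k in range(n):
        # shift-and-accumulate along columns (horizontal difference)
        out[:, k:] += c[k] * img[:, :img.shape[1] - k or None]
        # shift-and-accumulate along rows (vertical difference)
        out[k:, :] += c[k] * img[:img.shape[0] - k or None, :]
    return out
```

For a fractional order of 0.5, the first three coefficients are 1, -0.5, and -0.125, so the response retains a weighted "memory" of neighboring pixels rather than a pure two-point difference, which is what lets fractional-order masks enhance texture while preserving low-frequency content.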
The proposed classifier achieves a recall of 99.6%, a precision of 87.78%, an accuracy of 88.88%, and an F1 score of 0.9334.
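The gray relational analysis-based classification step can be sketched with Deng's classical relational grade: a candidate feature pattern is compared against one reference pattern per class, and the class with the highest grade wins. The per-class reference vectors and the distinguishing coefficient ζ = 0.5 below are standard illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def gra_classify(x, class_refs, zeta=0.5):
    """Deng-style gray relational classification sketch.

    x          : 1D candidate feature pattern.
    class_refs : dict mapping class label -> 1D reference pattern
                 (hypothetical per-class templates for illustration).
    zeta       : distinguishing coefficient, conventionally 0.5.
    """
    x = np.asarray(x, dtype=float)
    # absolute deviation series between the candidate and each reference
    deltas = {c: np.abs(np.asarray(r, dtype=float) - x)
              for c, r in class_refs.items()}
    # global min/max deviations across all comparison series
    dmin = min(d.min() for d in deltas.values())
    dmax = max(d.max() for d in deltas.values())
    if dmax == 0.0:                      # candidate matches every reference exactly
        return next(iter(class_refs))
    # gray relational grade = mean of the relational coefficients
    grades = {c: ((dmin + zeta * dmax) / (d + zeta * dmax)).mean()
              for c, d in deltas.items()}
    return max(grades, key=grades.get)
```

For example, with references `{"normal": [0, 0, 0], "Pt": [1, 1, 1]}`, a candidate `[0.9, 1.0, 0.8]` is assigned to "Pt", since its deviation series from the "Pt" template is uniformly smaller.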