Examination requests and imaging reports are the most important communication instruments between clinicians and radiologists. An accurate and clear report helps referring physicians make care decisions for their patients. The aim of this study was to evaluate the contents of initial and re-reported chest reports, to assess inter-observer agreement, and to evaluate the clarity of the report contents from the viewpoint of the referring physicians. The content and agreement of the reports were analyzed by comparing the initial reports with re-reports prepared by a chest radiologist. The referring physicians evaluated the contents of 50 reports with respect to their medical facts, clarity, and intelligibility. The results were analyzed using cross-tabulation tables, the Pearson chi-square test, and kappa statistics. The radiologists mostly addressed the questions posed by the referring physicians. General radiologists included separate conclusions in their reports more frequently (22%) than the chest radiologist did in her re-reports. Reports prepared by the chest radiologist contained nearly 50% more findings than the general radiologists' reports. Inter-observer agreement between the initial reports and the specialist's re-reports was 66%, but the kappa value was only 0.31. The referring physicians considered 68% of the initial reports by the general radiologists, and 94% of the re-reports by the chest radiologist, to be clear and intelligible. Radiology report quality was rather high, although report contents varied depending on the radiologist. Inter-observer agreement on the chest radiographs was low because the non-structured reports contained different quantities of information, which complicated the comparison. Referring physicians considered both short and long radiology reports to be clear.
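The gap between the 66% raw agreement and the kappa value of 0.31 reflects how Cohen's kappa discounts agreement expected by chance. A minimal sketch of the calculation, using a hypothetical 2×2 initial-report versus re-report agreement table (the study's actual tabulations are not reproduced here; the matrix below is chosen only so that raw agreement comes out to 66%):

```python
# Cohen's kappa: observed agreement corrected for chance agreement.
# kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement
# and p_e is the agreement expected by chance from the marginal totals.

def cohen_kappa(matrix):
    """Compute Cohen's kappa from a square confusion matrix (list of rows)."""
    total = sum(sum(row) for row in matrix)
    p_o = sum(matrix[i][i] for i in range(len(matrix))) / total
    row_sums = [sum(row) for row in matrix]
    col_sums = [sum(col) for col in zip(*matrix)]
    p_e = sum(r * c for r, c in zip(row_sums, col_sums)) / total ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical table for 50 reports (e.g. finding reported / not reported
# by the initial reader vs. the re-reader); real study data may differ.
table = [[20, 5],
         [12, 13]]

print(cohen_kappa(table))  # raw agreement is 0.66, but kappa is much lower
```

With these numbers the chance-expected agreement is 0.50, so a raw agreement of 66% yields a kappa of only about 0.32, illustrating why the study's 66%/0.31 pair indicates fairly weak agreement once chance is accounted for.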