Establishing a quality assurance baseline for radiological protection of patients undergoing diagnostic radiology



Introduction
The high rate of poverty, illiteracy and disease in most developing countries results in high demand for effective healthcare. Expensive diagnostic equipment, compounded by poor economies, low technical resource capabilities, inadequate policies and costly control systems, has hindered the implementation of quality assurance. Additionally, the International Commission on Radiological Protection (ICRP) has in its Publication 60 recommended the adoption of elaborate quantitative analysis of quality control tests as well as optimisation of radiological examinations.1 This recommendation poses a challenge to the majority of diagnostic X-ray departments, which are slowly replacing standard-speed with high-speed film/screen combinations, and rarely with computed or digital radiography.
The widespread use of X-rays in the diagnosis and management of patients has led to increased exposure to this man-made radiation. Although the clinical use of X-rays is governed by justification, optimisation and the as-low-as-reasonably-achievable (ALARA) principle, more aggressive methods have been proposed. According to Hart et al., the 97/43/Euratom Council directive, which promoted the establishment and use of diagnostic reference levels (DRLs), also gave them legislative status.2 Essentially, the use of DRLs should form a core part of good imaging practice. Imaging professionals should develop clearly defined guidelines that promote quality assurance in accordance with the latest technical knowledge of the equipment concerned. This professional approach will promote the due process of developing technical specifications, standards and quality management. According to the American College of Radiology (ACR), the use of DRLs assists imaging professionals in managing radiation exposure by exercising good practice based on current knowledge, obtainable resources and the specific needs of patients, in a safe and cost-effective medical care environment.3 This system falls within the recommendations of the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR)4 and the International Commission on Radiological Protection,5 and is supported by research conducted by the International Atomic Energy Agency.6 At the departmental level, local DRLs support quality assurance by specifying investigation levels for unusually high radiation exposure.7
The measurement of these dose quantities can be achieved by direct measurement using thermoluminescent dosimeters (TLDs), or by indirect measurement using physical test phantoms8 and kerma-area product (KAP) meters. European Commission (EC) Report 16262 recommends the use of widely accepted, simple methods of measurement that clearly define diagnostic reference levels, expressed as dose quantities that give meaningful indications of patient exposure and take account of the clinical imaging technique.7 Our study was undertaken to establish baseline data for radiographic image processes and image quality in order to optimise radiological protection of patients.

Materials and methods
The study was done over 1 month in one representative X-ray room at each of 4 hospitals in Kenya, all using a 400-speed film/screen system. The examination frequencies were obtained from patient records at each hospital. The film processors shown in Table I were assessed for performance according to base plus fog, speed index and contrast index. The optical density was measured using an X-Rite model 341C densitometer (X-Rite Inc., USA) on an aluminium 11-step wedge image produced using adult chest PA exposure factors. During the study period, film rejects from each room were collected, counted and grouped according to size, type and cause of rejection by a senior imaging technologist.
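The three sensitometric indices named above can be computed directly from the step-wedge densities. The sketch below assumes the common processor-QC definitions (base plus fog as the density of the unexposed step, speed index as the step density nearest 1.0 above base plus fog, contrast index as the difference between a high- and a low-density step); the step choices and density values are illustrative, not taken from the study.

```python
# Sensitometric indices from an 11-step wedge image (illustrative definitions).

def base_plus_fog(densities):
    """Optical density of the unexposed step (first step)."""
    return densities[0]

def speed_index(densities):
    """Density of the step whose net density (above base+fog) is closest to 1.0."""
    bf = base_plus_fog(densities)
    return min(densities, key=lambda d: abs((d - bf) - 1.0))

def contrast_index(densities, low_step=3, high_step=7):
    """Difference between a high- and a low-density step (steps chosen for illustration)."""
    return densities[high_step] - densities[low_step]

# Hypothetical optical densities for the 11 steps:
dens = [0.18, 0.25, 0.40, 0.70, 1.10, 1.55, 2.00, 2.40, 2.70, 2.90, 3.00]
print(base_plus_fog(dens))              # 0.18
print(speed_index(dens))                # 1.1
print(round(contrast_index(dens), 2))   # 1.7
```

Plotting these three values daily against baseline limits is the usual way such processor tests are trended.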
A calibration validation was done on the Harshaw 4500 TLD system (Erlangen, Germany) used to read the TLD cards for patient dose. Four sets of 10 dosimeters each, together with 3 control dosimeters, were sent to the Secondary Standards Dosimetry Laboratory (SSDL) in Arusha, Tanzania, for exposure to known doses. The TLD sets and the 3 control dosimeters were then read using the TLD system and the results provided to the SSDL for comparison. The validated system was used to read the TLD cards for measuring entrance skin dose (ESD) on patients. Throughout the study, 2 non-irradiated TLDs were included in each batch to evaluate the contribution of background radiation. While measuring doses on adult patients, the following parameters were recorded: exposure factors, focus-to-film distance, and the patient's age, weight and thickness. An open TLD card was taped to the patient's skin at the centre of the beam. The measured ESD was then compared with diagnostic reference levels (DRLs), and appropriate dose reduction measures were issued to each participating X-ray facility.
The image quality of the radiographs was assessed by radiologists, who awarded grade A, B or C based on the EC quality criteria.7 Grade A meant features detected and fully reproduced, with details visible and clearly defined; grade B meant features just visible, with details just visible but not clearly defined; and grade C meant features invisible, with details invisible and undefined.
At the end of the study period, quality control tests were performed using standard methods on each X-ray machine. The tests performed included kVp accuracy, reproducibility of exposure, timer accuracy, mA and exposure time linearity, radiation output, light/radiation beam alignment and total filtration (mm Al). The tests were considered 'Passed' or 'Failed' according to the New South Wales Environment Protection Authority Methods and Standards.9
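Two of the tests listed above reduce to simple pass/fail arithmetic. The sketch below uses typical tolerance limits (measured kVp within ±5% of the set value; coefficient of variation below 0.05 for repeated exposures); the actual limits in the NSW EPA standard may differ, so these thresholds are assumptions for illustration.

```python
# Pass/fail logic for two common X-ray QC tests, using assumed typical tolerances.
from statistics import mean, stdev

def kvp_accuracy_ok(set_kvp, measured_kvp, tolerance=0.05):
    """kVp accuracy: measured value within ±5% of the set value."""
    return abs(measured_kvp - set_kvp) / set_kvp <= tolerance

def reproducibility_ok(outputs_mgy, max_cv=0.05):
    """Exposure reproducibility: coefficient of variation of repeated outputs."""
    cv = stdev(outputs_mgy) / mean(outputs_mgy)
    return cv <= max_cv

print(kvp_accuracy_ok(80, 82.5))                      # True (3.1% deviation)
print(reproducibility_ok([1.02, 0.98, 1.01, 0.99]))   # True (CV about 0.018)
```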

Results

Examination frequencies
Fig. 1 indicates the percentage distribution of annual examination frequencies. In this study, the other examinations performed included postnasal space (PNS), skull, thoracic spine, thoracolumbar spine, cervical spine, abdomen and intravenous urogram (IVU).

Rejects analysis
Fig. 2 indicates the trends and causes of film rejects. The leading causes were related to equipment (21%), positioning (7%), human error (3%) and other causes (8%). In this study, rejects owing to image blur, processor failures, film storage and cassette clips were all grouped as other causes.

Processor performance
The sensitometry results in Fig. 3 indicate the performance level at each hospital. Only hospital 4 had good darkroom conditions, as indicated by its good base plus fog test results.

TLD reader system validation
At the SSDL, the 4 sets of 10 dosimeters each were exposed to a Cs-137 source at the following known doses: 0.2 mGy, 0.5 mGy, 1.0 mGy and 2.0 mGy. The results obtained from reading the TLD cards are shown in Table II. There was an evenly increasing deviation, from 8% at the lowest absorbed dose of 0.2 mGy to 15% at 2 mGy. These results validated the performance of the TLD system as credible; the ESD results reported in this study are therefore accurate and representative of patient dose.
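The validation comparison is a percentage deviation of each mean TLD reading from the delivered dose. In the sketch below, only the delivered doses (0.2-2.0 mGy) come from the study; the reading values are hypothetical, chosen to show the kind of evenly increasing deviation reported.

```python
# Percentage deviation of TLD readings from known delivered doses.

known_mgy = [0.2, 0.5, 1.0, 2.0]           # doses delivered at the SSDL
measured_mgy = [0.216, 0.55, 1.12, 2.30]   # hypothetical mean TLD readings

deviations = [round(100.0 * (m - k) / k) for k, m in zip(known_mgy, measured_mgy)]
print(deviations)  # [8, 10, 12, 15]
```

A deviation that grows smoothly with dose, as here, points to a calibration-factor issue rather than random reader noise, and is straightforward to correct.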

X-ray equipment performance
The results of the quality control assessment of the X-ray equipment are presented in Table III. Analysis of the measured quality control results indicated that X-ray equipment performance in the 4 hospitals can be ranked as follows: hospital 4, 90%; hospital 1, 80%; hospital 2, 70%; and hospital 3, 67%.

Radiographic technique and patient dose
Table IV indicates the exposure factors and patient parameters used at each hospital. There was a general tendency in most hospitals not to use high-kVp radiographic techniques.

Image quality assessment
Image quality assessment results for the radiographs for which doses were measured are shown in Fig. 4.

Discussion
The annual average number of patients for hospitals 1, 2, 3 and 4 was 72 000, 9 600, 12 000 and 14 400 patients respectively. Hospital 1 saw 5 times more patients than any other participating hospital.
The examination frequency distribution in Fig. 1 did not exhibit notable variations between the hospitals. A high rate of lumbar spine examinations was noted at hospital 1, which is a referral hospital. The UNSCEAR 1993 report did not find significant gender differences except for pelvis and hip examinations.10 The results in this study found a distribution of 43% male and 57% female patients, of whom 18% were children.
The film reject rate per hospital correlated with equipment age and state of maintenance (Tables I and III). The highest reject rate came from hospital 2 (13%); hospitals 1 and 3 each produced 11%, and hospital 4 produced 6%. The largest proportion of rejects due to positioning came from hospitals 3 and 4; training should reduce these film reject rates. Although the number of film rejects was comparable with IAEA reported values,10 there is adequate scope for dose reduction that would tend to improve image quality, patient dose and use of resources. The annual estimate of film rejects in these 4 hospitals implied that 15 048 radiographic films were wasted and the same number of persons exposed unnecessarily to radiation. The amount of rejects could be reduced through proper choice of film processor, standard radiographic techniques, QC tests, and appropriate education and training of imaging staff.
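An annual reject estimate of this kind follows from each hospital's workload, its reject rate and the average number of films per patient. The sketch below combines the workloads and reject rates reported in this study with an assumed films-per-patient factor of 1.2; that factor is purely illustrative (the factor behind the study's 15 048 total is not stated), so the printed figure is not the study's own.

```python
# Annual film-reject estimate from workload and reject rate.
# films_per_patient is an assumed illustrative value, not from the study.

annual_patients = {1: 72_000, 2: 9_600, 3: 12_000, 4: 14_400}
reject_rate = {1: 0.11, 2: 0.13, 3: 0.11, 4: 0.06}
films_per_patient = 1.2  # assumption for illustration

total_rejects = sum(n * films_per_patient * reject_rate[h]
                    for h, n in annual_patients.items())
print(round(total_rejects))  # 13622
```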
The film processors passed the speed index and developer temperature tests. However, there were deviations from the expected contrast index values: hospital 4's contrast index was 32% lower than expected, while the other 3 hospitals deviated by more than 30%. The overall sensitometry performance per hospital, in descending order, was hospital 4, hospital 1, hospital 3 and hospital 2. Performance correlated with equipment age, film processor maintenance and the level of QC in the hospital. Hospital 1 used an established processor maintenance programme but showed a low level of QC test performance, as did the other participating hospitals. The film processor in hospital 2 was a refurbished unit that was not given any QC tests. In this study, contrast index was the most sensitive film processor test; routine measurement and daily plotting of the values obtained would be a good processor performance indicator. In addition, there was a need to expand QC to cover storage, retrieval and changing of films.

The accuracy of exposure factors is essential for consistently high-quality diagnostic clinical images. The equipment performance results in Table III, based on 10 equally weighted quality control tests, showed a correlation with equipment age. The best performing equipment was at hospital 4 (2 years old), and the worst at hospital 3 (25 years old). The kVp accuracy and consistency test indicated good generator performance in most hospitals, except hospital 3. Timer accuracy was likewise good except at hospital 3, which used fixed mA or falling-load operation that made it difficult to assess exposure time. The overall X-ray equipment QC results obtained in this study indicated proper functioning of the tube voltage, tube voltage ripple, tube current and total filtration.6,12 Beam quality results were also consistent with the American Association of Physicists in Medicine (AAPM) 2002 recommendation of beam quality for all tube settings on any focal spot size and kVp.13

The average performance in collimator tests for all the X-ray machines showed shifts of the mirror position, or of the collimator position in the tube head. There was therefore a possibility of scatter radiation affecting image contrast. X-ray equipment performance can be improved by training, by establishing a QA programme, and by not relying solely on QC tests performed by service and maintenance engineers.

Table IV. Mean values of radiographic technique and patient parameters at each hospital, with EC-recommended radiographic technique parameters
The TLD reader must perform within permissible limits to ensure accurate and reliable results. The ESD range in this study was 0.41 mGy - 28.6 mGy, with corresponding error limits of 0.04% and 1.5% respectively (see Table II). Although the minimum detectable dose of <0.05 mGy was not achieved for the TLD reader used, the results were comparable with the IAEA (2004) dosimetry system requirements of 5% standard deviation per TLD batch and <30% for dose measurements at 0.1 mGy.6 The chest examination was the least compliant with international DRLs. The non-compliance was attributed to imaging technique and poor equipment performance. The lumbar spine and pelvis examinations were compliant with international DRLs, but they constituted the highest proportion of patient dose. This reveals a new perspective of optimisation that can be exploited. The use of a high-kVp radiographic technique and a standard focus-to-film distance can help to attain the optimum imaging range at all the hospitals concerned. This optimisation process can be enhanced by maintaining optimal equipment performance, promoting the use of local DRLs, continuous image quality assessment, and sound selection of X-ray equipment during procurement.
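Compliance checking of this kind is a straightforward comparison of each examination's mean ESD against its DRL. In the sketch below, the DRL values are the EC reference doses commonly quoted for a standard adult, and the mean ESDs are hypothetical (only the 0.41 mGy minimum echoes the study's ESD range); both should be replaced with local values in practice.

```python
# Flagging mean entrance skin doses (ESDs) against DRLs.
# DRLs: commonly quoted EC adult reference doses; ESDs: hypothetical means.

drl_mgy = {"chest PA": 0.3, "lumbar spine AP": 10.0, "pelvis AP": 10.0}
mean_esd_mgy = {"chest PA": 0.41, "lumbar spine AP": 8.2, "pelvis AP": 6.5}

status = {exam: ("investigate" if mean_esd_mgy[exam] > drl else "compliant")
          for exam, drl in drl_mgy.items()}

for exam, result in status.items():
    print(f"{exam}: {mean_esd_mgy[exam]} mGy vs DRL {drl_mgy[exam]} mGy -> {result}")
```

A DRL is an investigation level, not a dose limit: an "investigate" flag triggers a review of technique and equipment rather than an automatic rejection of the practice.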
To facilitate such an optimisation process, documentation and analysis of patient data and technical factors (as shown in Table IV) are necessary. The average estimated body depths of 22 cm for chest PA, 23 cm for lumbar spine AP, and 25 cm for lumbar spine LAT did not have a significant effect on patient dose. However, the average body weights of 70 kg at hospitals 1 and 4, 71 kg at hospital 2, and 82 kg at hospital 3 did affect the patient dose measurements. The method employed to measure patient dose was therefore fundamental in developing standard imaging techniques. It was necessary to extensively collect and analyse exposure and patient parameters to develop a quality management system commensurate with a specific radiological facility. This process would facilitate comparisons between diagnostic facilities nationally and internationally, including against the EC radiographic technical factors.7 It might be hampered by the absence of integrated KAP meters in the X-ray equipment; inbuilt dosimeters would allow the setting of dose action levels and DRLs, and the validation of optimum image quality.
The relationship between radiation exposure and image quality is essential for radiologists to institute corrective optimisation measures without any loss of clinical information. The results in Fig. 4 show that, of the 542 radiographs, 63% scored grade A, 31% grade B and 6% grade C. The radiologists noted the additional work owing to the detailed quality criteria for clinical images, and observed the variation owing to radiographic and processing techniques. Sensitisation of imaging technologists to radiological image quality factors is essential and could be adopted by hospitals as part of the quality improvement process. A nationwide study could provide results that could be adopted by imaging professionals for accrediting diagnostic departments.
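The grade distribution above is a simple tally over the 542 assessed radiographs. In the sketch below the per-grade counts are back-calculated from the reported percentages (so they are hypothetical reconstructions, not published counts), but they round back to the 63/31/6 split:

```python
# Grade distribution over the assessed radiographs.
# Counts are back-calculated from the reported percentages (hypothetical).

grades = {"A": 342, "B": 168, "C": 32}
total = sum(grades.values())
shares = {g: round(100 * n / total) for g, n in grades.items()}
print(total, shares)  # 542 {'A': 63, 'B': 31, 'C': 6}
```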

Conclusions and recommendations
The magnitude of patient dose due to rejects, poor equipment performance, poor radiographic techniques and equipment age can be significant. High film reject rates result in unnecessary costs that can be avoided if effective quality assurance measures are in place. Quality improvement processes within radiological facilities could be enhanced through accreditation of diagnostic facilities, audits and surveillance programmes.
The results from the present study showed that aged X-ray equipment coupled with poor or no maintenance can have significant effects on radiographic image quality. Good equipment selection, an effective QA programme and dosimeters can help ensure patient radiation protection. The use of high-speed film/screen combinations contributes to compliance with internationally acceptable DRLs without compromising the diagnostic value of images. There is a need to use anatomical image quality assessment, such as the EC image quality criteria, during the adoption of a new imaging technique.
Radiological protection of patients should be an integral part of a radiological facility's QA programme. X-ray equipment should therefore be installed with KAP meters to facilitate routine patient dose measurements. Developing countries such as Kenya, with a minimal number of radiology experts, can achieve quality assurance through collaboration between regional hospitals and a national referral hospital where the radiology experts are based. The imaging professionals can then perform the necessary QC tests, assess the level of quality improvement, and do acceptance tests on imaging equipment.

Fig. 3. Mean values of the quality control tests for each film processor.

Table III. Quality control test results
* = test not done; - = no comment; HVL measurements done at 80 kVp.