The number of computed tomography (CT) examinations performed each year has been rising since the introduction of this imaging modality, with more than 80 million performed in the United States in 2014. Alongside this growing use, concern about radiation exposure during CT studies has also increased, not only among clinicians but also among patients. Recent technical improvements, such as automated tube current and tube voltage modulation, together with the use of iterative reconstruction (IR), have allowed significant reductions in radiation exposure without compromising image quality.
Perceived image quality in CT is determined by the interaction of several parameters: image contrast, spatial resolution, artifacts, and image noise. Knowledge of the physics behind these determinants has permitted the design of low-dose CT protocols through modification of the scanning parameters (tube voltage, tube current, pitch, and rotation time). Some of these modifications increase image noise, because fewer photons reach the detector and quantum noise rises roughly with the inverse square root of the dose; good-quality images can nonetheless be obtained through the use of IR.
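As a rough numerical illustration of the dose-noise trade-off just described, quantum noise at a fixed tube voltage scales approximately with the inverse square root of the tube current-time product (mAs). The sketch below is a simplified model under that assumption; the baseline value of 200 mAs is illustrative, not a recommended protocol setting:

```python
import math

def relative_noise(mas: float, baseline_mas: float = 200.0) -> float:
    """Approximate quantum noise relative to a baseline acquisition.

    Assumes noise scales as 1/sqrt(mAs) at fixed tube voltage, so
    halving the mAs raises noise by a factor of sqrt(2) (~41%).
    """
    return math.sqrt(baseline_mas / mas)

# Halving the tube current-time product halves the dose
# but increases image noise by about 41%.
print(round(relative_noise(100.0), 2))  # -> 1.41
print(relative_noise(50.0))             # -> 2.0 (quarter dose, double noise)
```

This square-root relationship is why aggressive dose reductions produce disproportionately noisy images and why noise-reducing reconstruction such as IR becomes essential at low dose.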
IR algorithms are software tools that reduce image noise in low-dose CT examinations while preserving image quality. Commercially available IR algorithms include statistical IR and model-based (or full) IR. These tools achieve higher image quality than traditional filtered back projection (FBP) and therefore have great potential for radiation dose reduction. The mathematical details of the IR calculation process are complex and beyond the scope of this review. Each scanner vendor has developed its own algorithm, with specific strength levels for noise reduction; in general, image noise decreases gradually as the IR strength level increases. Unfortunately, some of the newer model-based IR techniques require considerably longer reconstruction times than standard FBP.
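The general principle behind IR can be sketched with a toy example: the algorithm repeatedly forward-projects a current image estimate, compares it with the measured projection data, and back-projects the discrepancy to refine the estimate. The snippet below uses a generic Landweber-style update on a tiny random linear system; it is only a conceptual illustration, since commercial IR algorithms are proprietary and incorporate far more sophisticated statistical and system models:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((8, 4))            # toy system matrix (forward projector)
x_true = np.array([1.0, 0.5, 2.0, 1.5])
b = A @ x_true                    # noiseless "sinogram" measurements

x = np.zeros(4)                   # initial image estimate
step = 1.0 / np.linalg.norm(A, 2) ** 2   # step size ensuring convergence
for _ in range(20000):
    # forward-project the estimate, compute the data discrepancy,
    # and back-project it to correct the image
    x = x + step * A.T @ (b - A @ x)

print(np.allclose(x, x_true, atol=1e-2))
```

The iteration cost of this correct-and-repeat loop, multiplied over thousands of updates and realistic image sizes, is one intuitive reason why model-based IR demands much more computation than the single-pass FBP reconstruction.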