1) What Is Digital Image Processing?
An image may be defined as a two-dimensional function, f(x,y), where x and y are spatial (plane)
coordinates, and the amplitude of f at any pair of coordinates (x, y) is called the intensity or gray
level of the image at that point. When x, y, and the intensity values of f are all finite, discrete
quantities, we call the image a digital image.
A digital image is composed of a finite number of elements, each of which has a particular location
and value. These elements are called picture elements, image elements, pels, and pixels. Pixel is the
term used most widely to denote the elements of a digital image.
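The definition above can be illustrated with a tiny array: a digital image is just a finite 2-D grid of discrete intensity values, so f(x, y) becomes an array lookup. The 4×4 image and its gray levels below are made-up values for illustration:

```python
import numpy as np

# A hypothetical 4x4 8-bit gray-scale digital image: a finite number of
# elements (pixels), each with a particular location and value.
f = np.array([
    [  0,  64, 128, 255],
    [ 32,  96, 160, 224],
    [ 16,  80, 144, 208],
    [  8,  72, 136, 200],
], dtype=np.uint8)

x, y = 1, 2                 # a pair of spatial coordinates
print(f.shape)              # (4, 4) -> the image has 16 pixels
print(f[x, y])              # 160 -> the intensity (gray level) at (1, 2)
```

Here x, y, and the intensity values are all finite, discrete quantities, which is exactly what makes this a digital image.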
Electromagnetic waves
Unlike humans, who are limited to the visual band of the electromagnetic (EM) spectrum, imaging
machines cover almost the entire EM spectrum, ranging from gamma to radio waves.
The principal energy source for images in use today is the electromagnetic energy spectrum. Other
important sources of energy include acoustic, ultrasonic, and electronic (in the form of electron beams
used in electron microscopy). Synthetic images, used for modeling and visualization, are generated by
computer.
Electromagnetic waves can be conceptualized as propagating sinusoidal waves of varying
wavelengths, or they can be thought of as a stream of massless particles, each traveling in a wavelike
pattern and moving at the speed of light. Each massless particle contains a certain amount (or bundle)
of energy. Each bundle of energy is called a photon. If spectral bands are grouped according to energy
per photon, we obtain the spectrum shown in Fig. 1.5, ranging from gamma rays (highest energy) at
one end to radio waves (lowest energy) at the other. The bands are shown shaded to convey the fact
that bands of the EM spectrum are not distinct but rather transition smoothly from one to the other.
• Gamma Rays: major uses include nuclear medicine and astronomical observations.
• X-rays: the best-known use of X-rays is medical diagnostics, but they are also used
extensively in industry and in other areas, such as astronomy.
• Ultraviolet Band: lithography, industrial inspection, microscopy, lasers, biological imaging, and
astronomical observations
• Visible & Infrared Bands: the infrared band is often used in conjunction with visual imaging.
One example is light microscopy. Another major area of visual processing is remote sensing,
which usually includes several bands in the visible and infrared regions of the spectrum.
• Microwaves: The dominant application of imaging in the microwave band is radar. The unique
feature of imaging radar is its ability to collect data over virtually any region at any time,
regardless of weather or ambient lighting conditions.
• Radio Waves: in medicine, radio waves are used in magnetic resonance imaging (MRI). This technique places a
patient in a powerful magnet and passes radio waves through his or her body in short pulses. Each
pulse causes a responding pulse of radio waves to be emitted by the patient’s tissues. The location from
which these signals originate and their strength are determined by a computer, which produces a two-
dimensional picture of a section of the patient. MRI can produce pictures in any plane.
Imaging using “sound” finds application in geological exploration, industry, and medicine.
Fundamental Steps in Digital Image Processing
• Image acquisition is the first step; it may involve preprocessing, such as scaling.
• Image enhancement is the process of manipulating an image so that the result is more suitable
than the original for a specific application. The word specific is important here, because it
establishes at the outset that enhancement techniques are problem oriented.
• Image restoration is an area that also deals with improving the appearance of an image.
However, unlike enhancement, which is subjective, image restoration is objective, in the sense
that restoration techniques tend to be based on mathematical or probabilistic models of image
degradation. Enhancement, on the other hand, is based on human subjective preferences
regarding what constitutes a “good” enhancement result.
• Color image processing is an area that has been gaining in importance because of the
significant increase in the use of digital images over the Internet.
• Wavelets are the foundation for representing images in various degrees of resolution.
• Compression, as the name implies, deals with techniques for reducing the storage required to
save an image, or the bandwidth required to transmit it.
• Morphological processing deals with tools for extracting image components that are useful in
the representation and description of shape.
• Segmentation procedures partition an image into its constituent parts or objects. In general,
autonomous segmentation is one of the most difficult tasks in digital image processing. A
rugged segmentation procedure brings the process a long way toward successful solution of
imaging problems that require objects to be identified individually
• Representation and description almost always follow the output of a segmentation stage, which
usually is raw pixel data, constituting either the boundary of a region (i.e., the set of pixels
separating one image region from another) or all the points in the region itself. In either case,
converting the data to a form suitable for computer processing is necessary. The first decision
that must be made is whether the data should be represented as a boundary or as a complete
region. Boundary representation is appropriate when the focus is on external shape
characteristics, such as corners and inflections. Regional representation is appropriate when
the focus is on internal properties, such as texture or skeletal shape.
• Recognition is the process that assigns a label (e.g., “vehicle”) to an object based on its
descriptors.
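As a concrete illustration of the enhancement step described above, here is a minimal sketch of linear contrast stretching, one common enhancement technique (the function name and the sample image are our own, not from the text):

```python
import numpy as np

def contrast_stretch(img):
    """Linearly map the image's intensity range onto the full [0, 255] scale."""
    lo, hi = int(img.min()), int(img.max())
    if hi == lo:                      # flat image: nothing to stretch
        return img.copy()
    stretched = (img.astype(float) - lo) * 255.0 / (hi - lo)
    return stretched.astype(np.uint8)

# A low-contrast image whose gray levels span only [50, 80]:
dark = np.array([[50, 60], [70, 80]], dtype=np.uint8)
print(contrast_stretch(dark))        # intensities now span [0, 255]
```

Note how this matches the chapter's point: whether the stretched result is "better" is a subjective, problem-oriented judgment (enhancement), unlike restoration, which would start from a model of the degradation.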
2) Digital Image Fundamentals
Light and the Electromagnetic Spectrum
The energy of the various components of the electromagnetic spectrum is given by the expression
E = hν,   (2.2-2)
where h is Planck’s constant and ν is the frequency of the wave.
Electromagnetic waves can be visualized as propagating sinusoidal waves of varying wavelength, or
they can be thought of as a stream of massless particles, each traveling in a wavelike pattern
and moving at the speed of light. Each massless particle contains a certain amount (or bundle) of
energy. Each bundle of energy is called a photon. We see from Eq. (2.2-2) that energy is proportional
to frequency, so the higher-frequency (shorter wavelength) electromagnetic phenomena carry more
energy per photon. Thus, radio waves have photons with low energies, microwaves have more energy
than radio waves, infrared still more, then visible, ultraviolet, X-rays, and finally gamma rays, the most
energetic of all. This is the reason why gamma rays are so dangerous to living organisms.
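The energy ordering described above can be checked numerically from Planck's relation E = hν = hc/λ (the constants are rounded and the variable names are illustrative):

```python
# Photon energy E = h * nu = h * c / wavelength (Planck's relation).
h = 6.626e-34      # Planck's constant, J*s
c = 3.0e8          # speed of light, m/s

def photon_energy_joules(wavelength_m):
    return h * c / wavelength_m

# Shorter wavelength (higher frequency) -> more energy per photon:
radio = photon_energy_joules(1.0)       # 1 m radio wave
green = photon_energy_joules(550e-9)    # 550 nm visible (green) light
gamma = photon_energy_joules(1e-12)     # 1 pm gamma ray
print(radio < green < gamma)            # True
```

This is why the bands order from radio (lowest energy per photon) up to gamma rays (highest), and why gamma rays are so dangerous to living organisms.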
The colors that humans perceive in an object are determined by the nature of the light reflected from
the object. A body that reflects light relatively balanced in all visible wavelengths appears white to the
observer. However, a body that favors reflectance in a limited range of the visible spectrum exhibits
some shades of color. For example, green objects reflect light with wavelengths primarily in the 500 to
570 nm range while absorbing most of the energy at other wavelengths.
Light that is devoid of color is called monochromatic (or achromatic) light. The only attribute of
monochromatic light is its intensity or amount. Because the intensity of monochromatic light is perceived
to vary from black to grays and finally to white, the term gray level is used commonly to denote
monochromatic intensity. We use the terms intensity and gray level interchangeably in subsequent
discussions. The range of measured values of monochromatic light from black to white is usually called
the gray scale, and monochromatic images are frequently referred to as gray-scale images.
In addition to frequency, three basic quantities are used to describe the quality of a chromatic light
source:
• Radiance is the total amount of energy that flows from the light source, and it is usually
measured in watts (W).
• Luminance, measured in lumens (lm), gives a measure of the amount of energy an observer
perceives from a light source. For example, light emitted from a source operating in the far
infrared region of the spectrum could have significant energy (radiance), but an observer would
hardly perceive it; its luminance would be almost zero.
• Finally, brightness is a subjective descriptor of light perception that is practically impossible to
measure. It embodies the achromatic notion of intensity and is one of the key factors in
describing color sensation.
Image Sampling and Quantization
To create a digital image, we need to convert the continuous sensed data into digital form. This involves
two processes: sampling and quantization.
The basic idea behind sampling and quantization is illustrated by a continuous image f that we want to
convert to digital form. An image may be continuous with respect to the x- and y-coordinates, and also in
amplitude (intensity level). To convert it to digital form, we have to sample the function in both coordinates and in
amplitude. Digitizing the coordinate values is called sampling. Digitizing the amplitude values is called
quantization.
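The two processes can be sketched on a one-dimensional scan line of a continuous function (the signal, the sample count N, and the level count L below are arbitrary choices for illustration):

```python
import numpy as np

# A "continuous" signal along one scan line: f(x) = sin(x), x in [0, 2*pi].
# Sampling: digitize the coordinate values by evaluating f only at N
# equally spaced positions.
N = 8
x = np.linspace(0, 2 * np.pi, N)
samples = np.sin(x)                     # still continuous in amplitude

# Quantization: digitize the amplitude values by mapping each sample onto
# one of L discrete intensity levels.
L = 4
levels = np.round((samples + 1) / 2 * (L - 1)).astype(int)
print(levels)                           # every value is now in {0, 1, 2, 3}
```

Applying the same two steps along both the x- and y-coordinates of a continuous image yields a digital image: finite coordinates from sampling, finite gray levels from quantization.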