Focal Facts.



Digital photography touches each of our lives in some way today. Digital cameras sell like hot cakes, but do we know the ingredients behind the perfect image? Here's a look at how it's done…

Tracing breakthroughs in technology has been both one of our favourite pastimes and our solemn duty as harbingers of technology-related information. The era of digitisation is both the present and the future, and it has touched our everyday lives too, as fluctuating waves (the hallmark of the analog era) have given way to zeroes and ones. Vinyl gave way to CDs and later MP3s, video cassettes to VCDs and DVDs, and film, too, eventually took the digital plunge, a path from which there is no return. And why should there be? The digital photography revolution drastically reduced the cost of taking up photography as a hobby, while shortening the learning curve and lowering the cost of making mistakes along the way. Sure, there were teething issues in the beginning; sceptics wrote off digital photography as an amateur's hobby, and professionals stuck with film. It's been years since then, and today film is all but dead, restricted to museums and a very few niche professionals who still don't believe.

In what follows, we'll look at one of the most remarkable breakthrough products of modern times: the digital camera. A product that has revolutionised a profession, changed the way people create and share memories, and found its way into nearly every one of our homes.

It’s All About Pixels, Really

The first thing about digital photography is its complete compatibility with anything digital, be it a computer, a cell phone, another digital camera or even a portable multimedia player. This is because a digital image is nothing but a long sequence of zeroes and ones, making up the bits and bytes that digital devices recognise. A digital camera is ultimately tasked with creating a digital image, which is made up of pixels: the smallest unit of any image, whether still or motion picture. An image, at its simplest, is a record of light, its intensity and its colour, across a scene. A digital camera, therefore, is tasked with capturing a scene (also called a subject) and converting that captured scene into pixel values. Of course, capturing the scene is the big part of the job. Just like a traditional camera, a digital camera uses a set of lenses that focus light to create an image of a scene. While a film camera focuses this light directly onto a piece of chemically reactive film, the lenses of a digital camera focus it onto a semiconductor device (popularly called an image sensor) that records light electronically.
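
To make "an image is just numbers" concrete, here's a minimal sketch; the array values below are invented purely for illustration:

```python
# A digital image is just a grid of numbers. Here is a tiny 2x3
# greyscale "image": each value is one pixel's brightness,
# 0 = pure black, 255 = pure white (invented values, for illustration).
image = [
    [0, 128, 255],
    [64, 192, 32],
]

height = len(image)
width = len(image[0])
print(f"{width}x{height} image, {width * height} pixels")
for row in image:
    print(row)
```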

Cool Fact
The image sensor's ability to capture detail is measured in pixels, or megapixels. A 7.1-megapixel camera actually captures 3072×2304 pixels, which works out to 7,077,888 pixels, a little short of the full 7,100,000 the rating suggests. This loss of a few pixels is attributed to the CCD's circuitry; CMOS sensors don't suffer this particular loss.
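
You can verify the arithmetic in the Cool Fact above in a couple of lines:

```python
# Verifying the Cool Fact's arithmetic.
width, height = 3072, 2304
pixels = width * height
print(pixels)                        # 7077888
print(round(pixels / 1_000_000, 1))  # 7.1 -- the marketed megapixel count
```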

 

Now that's all there is to a digital camera. Did that sound easy? No? Good, because it's not. Let's look at the significant individual parts of a digital camera one at a time, and list out their role (no pun intended) in creating a digital image from light.

The Image Sensor: Photophobic And Loving It

We've already mentioned the role of the image sensor in capturing an image. Image sensors are basically silicon chips (like your PC's CPU), around the size of your fingernail.

An Image Sensor

The surface consists of millions of photosensitive diodes, each capable of capturing a single pixel of the entire image (for ease of understanding you may assume each diode to be a pixel; the terms are used interchangeably here). These diodes convert the light that falls on them into electrons, i.e., an electrical charge. When an image is clicked, the shutter opens briefly, allowing light in. This light falls on each of the photosensitive diodes (with varying intensities, according to the subject of the photograph) and is converted into electrons. The more light that hits a diode, the greater the charge it records. For example, pixels capturing light from bright areas of a scene (bulbs, shiny objects, sunlight and so on) will store a larger charge, while those capturing light from dimmer areas (dark objects, shadows, etc.) will hold a lower charge. After the shutter closes to end the exposure, the charge at each pixel is measured and stored. From these numbers results a digital image.
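
Here's a toy model of that light-to-charge step. All the numbers (the full-well limit, photon rates, quantum efficiency) are invented for illustration, not specific to any real sensor:

```python
# A toy model of exposure: each photodiode accumulates charge in
# proportion to the light hitting it, up to a saturation limit
# (the "full well"). All numbers are invented for illustration.
FULL_WELL = 10_000  # maximum charge (in electrons) one diode can hold

def expose(photon_rate, exposure_time, quantum_efficiency=0.5):
    """Charge recorded by one diode: more light, more charge."""
    charge = photon_rate * exposure_time * quantum_efficiency
    return min(charge, FULL_WELL)  # very bright areas saturate (clip)

# A dim shadow versus a bright bulb, with the same exposure time:
print(expose(photon_rate=2_000, exposure_time=0.01))      # 10.0 (dim)
print(expose(photon_rate=4_000_000, exposure_time=0.01))  # 10000 (saturated)
```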

What you may (surprisingly) not know is that each diode can only capture brightness, not colour. Each pixel therefore records an intensity of shade, in a series of tones ranging from pure white to pure black. This is what we call greyscale. Wait now… where are all those gorgeous colours you see? That's the job of the next bad boy we're going to meet: the Image Processor. But for now let's look at the two types of image sensors: CMOS (Complementary Metal Oxide Semiconductor) and CCD (Charge Coupled Device).

CCD sensors are generally smaller, which is why they're found mostly in compact digital cameras. In a CCD sensor, every pixel's charge is transported across the chip to a single output node (modern cameras may have more than one such node), where it is converted to an analog voltage. This voltage is then buffered and sent off the chip as an analog signal, and an ADC (Analog to Digital Converter) turns each pixel's analog value into a digital value by measuring the amount of charge recorded at that pixel.
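
A sketch of what that last ADC step amounts to; the reference voltage and bit depth here are illustrative choices, not any particular camera's specification:

```python
# The ADC's job: map an analog voltage (0 to V_REF) onto discrete
# digital levels. An 8-bit ADC distinguishes 2**8 = 256 levels.
V_REF = 1.0  # reference voltage (illustrative)
BITS = 8

def adc(voltage):
    """Quantise an analog voltage into an 8-bit digital value."""
    level = int(voltage / V_REF * (2**BITS - 1))
    return max(0, min(level, 2**BITS - 1))

print(adc(0.0))  # 0   -> darkest pixel value
print(adc(0.5))  # 127 -> mid grey
print(adc(1.0))  # 255 -> brightest pixel value
```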

CMOS sensors tend to be larger, which accounts for their prevalent use in DSLR (Digital Single Lens Reflex), a.k.a. professional, cameras, even though until very recently they were considered inferior to CCDs in image quality. On a CMOS sensor, each pixel has its own set of transistors that amplify, convert and move the charge. CMOS sensors often also carry additional circuitry for noise correction and digitisation, so that the sensor itself outputs digital bits rather than analog signals, doing away with the need for a separate ADC in many cases.

How the filter works

These different technologies have a number of advantages and disadvantages that follow from their functional peculiarities. Since each pixel on a CMOS sensor has a number of transistors neighbouring it, light sensitivity is reduced: at the same exposure, some light inevitably hits these dead zones, i.e., the transistors. A CCD doesn't have this problem, but CCDs consume a lot more power, in the range of 50 to 100 times more than CMOS sensors.

The Image Processor

For each image you click, there are millions of calculations taking place inside the camera. It's these calculations that allow the camera to interpolate, preview, capture, compress, filter, store and transfer the image, as well as adjust settings and allow manual tuning before taking a shot. The latest image processors are programmable, which allows them to perform advanced functions like in-camera photo editing, red-eye removal, picture borders, panorama views and stitching, and even removing the blur caused by a shaky hand.

We spoke about each diode working in greyscale only. The next big step towards an image is adding colour to it, and this is where the image processor and the image sensor work together. To get colour into the image, the sensor uses filtering to break the captured light into the three primary colours. Once these three colours have been recorded, they are combined with each other to create all the other colours. The three primary colours are also called RGB, where R is Red, G is Green and B is Blue. Nearly all cameras use what is called a Bayer filter. This filter consists of rows of alternating red and green blocks interleaved with rows of alternating blue and green blocks, something like a set of mosaic tiles. Note that the number of blocks of each colour isn't identical: there are roughly twice as many green blocks as red or blue, simply because the eye is most sensitive to green.

Each pixel under the filter can capture only one colour, the colour of the filter block placed directly above it. Such a colour filter sits over every pixel-capturing photosite, and the image processor estimates the true colour of a pixel by combining the one colour it actually captured with the other two colours captured by the pixels surrounding it. If the raw output of an image sensor with a Bayer filter were viewed directly, it would appear as a grid of red, green and blue pixels of varying intensities. A process called demosaicing then converts this mosaic of single-colour readings into an image in which every pixel carries a full, true colour. One advantage of the scheme is that the colour value captured at one pixel can be reused by its neighbouring pixels during this interpolation.
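
Real cameras use far more sophisticated (and often proprietary) demosaicing algorithms, but the simplest approach, averaging neighbouring photosites, can be sketched in a few lines. The Bayer layout and sensor readings below are invented for illustration:

```python
# A toy demosaic: estimate the full RGB colour of one pixel on a
# Bayer mosaic by averaging the neighbours that captured the two
# missing colours (simple bilinear interpolation).

# Bayer pattern of captured values (invented numbers); layout:
#   G R G R
#   B G B G
pattern = [["G", "R", "G", "R"],
           ["B", "G", "B", "G"]]
values  = [[100, 200, 110, 210],
           [ 50, 105,  55, 115]]

def neighbours(y, x):
    """Yield coordinates of the up-to-8 pixels surrounding (y, x)."""
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            ny, nx = y + dy, x + dx
            if (dy or dx) and 0 <= ny < len(values) and 0 <= nx < len(values[0]):
                yield ny, nx

def demosaic_pixel(y, x):
    """Full RGB at (y, x): keep the measured colour, interpolate the rest."""
    rgb = {pattern[y][x]: values[y][x]}        # the colour actually captured
    for colour in "RGB":
        if colour not in rgb:
            samples = [values[ny][nx] for ny, nx in neighbours(y, x)
                       if pattern[ny][nx] == colour]
            rgb[colour] = sum(samples) // len(samples)
    return rgb["R"], rgb["G"], rgb["B"]

print(demosaic_pixel(1, 1))  # the green photosite at row 1, col 1 -> (200, 105, 52)
```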

Exposure and Focusing: Camera Optics

A digital camera has to control the amount of light reaching the image sensor, just as a film camera has to control the exposure of its film. Two components work together to control the amount of light captured: the aperture and the shutter speed. The aperture is, very simply, the size of the opening in the camera, the opening that lets light inside and onto the image sensor. Most digital cameras set the aperture automatically, although a manual setting is provided so enthusiasts and professionals can have more control over the final image. The shutter speed controls the duration for which light is allowed to pass through the aperture. Digital cameras can have mechanical shutters, electronic shutters, or both.
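
Photographers combine the two into a single number, the Exposure Value, using a standard formula (a photographic convention, not anything specific to one camera). It shows why different aperture/shutter pairs can produce the same exposure:

```python
import math

# EV = log2(N**2 / t), where N is the f-number (aperture) and
# t is the shutter time in seconds.
def exposure_value(f_number, shutter_seconds):
    return math.log2(f_number**2 / shutter_seconds)

# Two different aperture/shutter combinations, nearly the same exposure:
print(exposure_value(8.0, 1/125))  # ~13.0
print(exposure_value(5.6, 1/250))  # ~12.9
```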

The Bayer filter

The lens of a digital camera is very similar to the lens on a conventional film camera. Its role, too, remains the same: to control how light is focussed on the image sensor, which it does by physically moving. Virtually all digital cameras offer autofocus, a feature whereby the lens automatically adjusts to bring the subject into focus. Where there is more than one subject, more advanced autofocus methods may be used; manual focus is also available on digital SLR cameras.
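
One common autofocus method, contrast detection, is easy to sketch: the camera nudges the lens, measures image contrast at each position, and keeps the position where contrast peaks, since a sharply focused image has the strongest edges. The brightness values below are invented:

```python
def contrast(pixels):
    """Sum of brightness differences between neighbours: higher = sharper."""
    return sum(abs(a - b) for a, b in zip(pixels, pixels[1:]))

# Simulated 1-D image strips at three lens positions (invented values):
strips = {
    "too close": [90, 100, 110, 120, 110, 100],
    "in focus":  [10, 200, 20, 220, 15, 210],
    "too far":   [80, 105, 95, 115, 100, 105],
}
best = max(strips, key=lambda pos: contrast(strips[pos]))
print(best)  # "in focus" -- the position with the strongest edges wins
```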

Cool Fact
In 1861, James Clerk Maxwell created a colour photograph by using a black-and-white camera to take three photos of a tartan ribbon, each photo taken with a single colour filter in place (red, green and blue respectively). The three black-and-white images were then projected onto a screen using three separate projectors. When brought into alignment, they formed a full-colour photograph, and the principle forms the basis of colour photography to this day.
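
Maxwell's trick in miniature: three greyscale exposures, one per colour filter, zipped together into full-colour pixels (the values here are invented):

```python
# One brightness reading per pixel from each filtered exposure:
red_shot   = [200,  30,  60]
green_shot = [ 40, 180,  70]
blue_shot  = [ 10,  20, 190]

# Combine the three channels into (R, G, B) pixels:
colour_image = list(zip(red_shot, green_shot, blue_shot))
print(colour_image)  # [(200, 40, 10), (30, 180, 20), (60, 70, 190)]
```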


The last part of the camera optics is the focal length, defined as the distance between the lens and the surface of the sensor. Whereas the film in a film camera is of a fixed size (35 mm being the standard), the size of a digital camera's image sensor varies from manufacturer to manufacturer and model to model. In most digital cameras this sensor is smaller than a 35-mm film frame, so in order to project the image onto the sensor, the focal length also has to be reduced in proportion. With a zoom lens (optical zoom), the focal length is actually changed so that the subject appears much closer.
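
This size difference is usually expressed as a crop factor, the ratio of the film frame's diagonal to the sensor's diagonal, which converts a lens's real focal length into its 35 mm equivalent. The APS-C sensor dimensions below are approximate, for illustration:

```python
import math

# A 35 mm film frame measures 36 x 24 mm.
FILM_DIAGONAL = math.hypot(36.0, 24.0)

def equivalent_focal_length(focal_mm, sensor_w_mm, sensor_h_mm):
    """35 mm equivalent focal length = real focal length x crop factor."""
    crop = FILM_DIAGONAL / math.hypot(sensor_w_mm, sensor_h_mm)
    return focal_mm * crop

# A 50 mm lens on an APS-C sized sensor (roughly 23.6 x 15.7 mm):
print(round(equivalent_focal_length(50, 23.6, 15.7)))  # ~76 mm
```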

In Conclusion

Think of your digital camera as a miniature computer. As camera resolutions have increased and image sensors have evolved, digital photography has become good enough to be a household commodity, and the focus has now shifted to features and usability. Features like face recognition, on-the-fly red-eye removal and even in-camera colour and tone adjustment grant more control over image settings, even as they complicate things (unnecessarily, for some) beyond the default Auto mode. With the latest consumer cameras upping megapixel ratings, optical zooms and a host of other specifications, we've come a long way since black and white gave way to colour.

Mr Maxwell would be pleased…
 

 

Michael Browne
Digit.in