The digital camera is one of the most remarkable instances of this shift because it is so fundamentally different from its predecessor. Conventional cameras depend entirely on chemical and mechanical processes — you don’t even need electricity to operate them. Digital cameras, by contrast, all have a built-in computer, and all of them record images electronically.
The new approach has been enormously successful. Because film still provides better picture quality, digital cameras have not completely replaced conventional cameras. But as digital imaging technology has improved, digital cameras have rapidly become more popular.
Digital Camera Basics
Let’s say you want to take a picture and e-mail it to a friend. To do this, you need the image to be represented in the language that computers recognize — bits and bytes. Essentially, a digital image is just a long string of 1s and 0s that represent all the tiny colored dots — or pixels — that collectively make up the image. (For information on sampling and digital representations of data, see this explanation of the digitization of sound waves. Digitizing light waves works in a similar way.)
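To make the "long string of 1s and 0s" idea concrete, here is a minimal Python sketch. The tiny 2x2 grayscale image and its pixel values are invented for illustration; each pixel's brightness is simply written out as an 8-bit binary number:

```python
# A hypothetical 2x2 grayscale image: each pixel is a brightness value 0-255.
pixels = [
    [0, 255],
    [128, 64],
]

# Flatten the grid and write each pixel as an 8-bit binary string --
# the "long string of 1s and 0s" a computer actually stores.
bit_string = "".join(f"{value:08b}" for row in pixels for value in row)

print(bit_string)       # 32 bits in total: 8 bits per pixel
print(len(bit_string))
```

A real image file adds headers and compression on top of this, but at bottom it is still just such a sequence of bits.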
If you want to get a picture into this form, you have two options:
- You can take a photograph using a conventional film camera, process the film chemically, print it onto photographic paper and then use a digital scanner to sample the print (record the pattern of light as a series of pixel values).
- You can directly sample the original light that bounces off your subject, immediately breaking that light pattern down into a series of pixel values — in other words, you can use a digital camera.
At its most basic level, this is all there is to a digital camera. Just like a conventional camera, it has a series of lenses that focus light to create an image of a scene. But instead of focusing this light onto a piece of film, it focuses it onto a semiconductor device that records light electronically. A computer then breaks this electronic information down into digital data. All the fun and interesting features of digital cameras come as a direct result of this process.
CCD and CMOS: Filmless Cameras
Instead of film, a digital camera has a sensor that converts light into electrical charges.
The image sensor employed by most digital cameras is a charge coupled device (CCD). Some cameras use complementary metal oxide semiconductor (CMOS) technology instead. Both CCD and CMOS image sensors convert light into electrons. If you’ve read How Solar Cells Work, you already understand one of the pieces of technology used to perform the conversion. A simplified way to think about these sensors is to think of a 2-D array of thousands or millions of tiny solar cells.
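That "2-D array of tiny solar cells" picture can be sketched in a few lines of Python. The photon counts and the electrons-per-photon ratio below are illustrative assumptions, not real sensor figures:

```python
# Illustrative only: model the sensor as a 2-D grid of photosites.
# Each photosite accumulates charge roughly in proportion to the
# light (photon count) that falls on it during the exposure.

# Hypothetical incident light, in photons per photosite.
light = [
    [1000, 4000],
    [2500,  500],
]

ELECTRONS_PER_PHOTON = 0.5  # assumed conversion efficiency, made up here

charge = [
    [photons * ELECTRONS_PER_PHOTON for photons in row]
    for row in light
]

print(charge)  # accumulated electrons at each photosite
```

Brighter parts of the scene leave more charge behind, which is all the sensor itself records — the rest of the camera's job is reading those charges out.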
Once the sensor converts the light into electrons, it reads the value (accumulated charge) of each cell in the image. This is where the differences between the two main sensor types kick in:
- A CCD transports the charge across the chip and reads it at one corner of the array. An analog-to-digital converter (ADC) then measures the amount of charge at each photosite and converts that measurement to binary form, producing a digital value for each pixel.
- CMOS devices use several transistors at each pixel to amplify and move the charge using more traditional wires.
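The two readout styles above can be contrasted in a short Python sketch. Everything here is a simplification: the full-well capacity, the `adc` helper, and the charge values are invented for illustration, and real readout electronics are analog circuits, not lists of numbers:

```python
FULL_WELL = 4000.0  # assumed maximum charge (electrons) a photosite can hold

def adc(charge, bits=8):
    """Quantize an analog charge into a digital value (shared helper)."""
    levels = 2 ** bits - 1
    return round(min(charge, FULL_WELL) / FULL_WELL * levels)

# Hypothetical accumulated charges at four photosites.
charges = [
    [500.0, 2000.0],
    [1250.0, 250.0],
]

def read_ccd(charges):
    # CCD-style: shift every charge, row by row, to one corner of the
    # chip, where a single off-pixel ADC digitizes the serial stream.
    serial_stream = [c for row in charges for c in row]
    return [adc(c) for c in serial_stream]

def read_cmos(charges):
    # CMOS-style: transistors at each pixel amplify the charge and put
    # it on ordinary wires, so each value can be converted in place.
    return [[adc(c) for c in row] for row in charges]

print(read_ccd(charges))   # one flat stream from the corner ADC
print(read_cmos(charges))  # digital values produced per pixel
```

Both schemes end up with the same digital values; what differs is where on the chip the conversion happens and how the charge travels to get there.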
Differences between the two types of sensors lead to a number of pros and cons.
For more detail: How Digital Cameras Work