
Finally, these values are passed on to a digital image processor. We’ll talk a bit more about the image processing below. With this, we can already start to understand why the number of megapixels doesn’t matter much for image quality. Or rather: what’s very important for the quality of the image is the size of the individual pixels. These pixel sensors are tiny.
On an iPhone 6, they’re 1.5 µm (microns, or micrometers) on each side. On prosumer DSLR cameras, they can be as large as 9 µm on each side. Two things happen as the size increases. First, the larger the pixel, the more light will hit it, and the more charge will build up. The more charge we have, the better the signal stands out from the noise in the readout.
Imagine you’re listening to music next to a busy street. If all you have is the built-in speaker of your phone, you may barely be able to make out the music. If you have a large stereo set, the noise of the street will disappear. The same goes for the charge in the pixels versus the noise of the readout: the more charge we collect, the better the signal-to-noise ratio.
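To put some rough numbers on this, here’s a back-of-the-envelope sketch in Swift. It assumes that the number of photons captured scales with the light-sensitive area of the pixel, and that the dominant noise source is photon shot noise, which follows Poisson statistics, so the signal-to-noise ratio is roughly the square root of the photon count. The exposure level is a made-up number, purely for illustration:

```swift
import Foundation

// Photons captured scale with the pixel's light-sensitive area;
// the shot-noise-limited SNR is roughly sqrt(N) for N captured photons.
// (Illustrative only: real sensors also have readout noise, fill-factor
// losses, microlenses, etc.)
func photonCount(pixelSideMicrons: Double, photonsPerSquareMicron: Double) -> Double {
    return pixelSideMicrons * pixelSideMicrons * photonsPerSquareMicron
}

let exposure = 100.0 // hypothetical photons per square micron for this scene

for side in [1.5, 9.0] {
    let n = photonCount(pixelSideMicrons: side, photonsPerSquareMicron: exposure)
    let snr = sqrt(n) // shot-noise-limited signal-to-noise ratio
    print(String(format: "%.1f µm pixel: %.0f photons, SNR ≈ %.0f", side, n, snr))
}
// 1.5 µm pixel: 225 photons, SNR ≈ 15
// 9.0 µm pixel: 8100 photons, SNR ≈ 90
```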
Larger pixels are good. A 9 µm image pixel will capture 36 times as many photons as a 1.5 µm pixel. The second thing is that larger pixels are less affected by bleed. The image sensor is a semiconductor made out of silicon, just like the CPU and RAM. As light hits the sensor, it will, to some extent, bleed into neighboring pixels in a similar fashion to light hitting frosted glass.
As pixels get smaller and smaller, more of the light bleeds into neighboring pixels: the value of each pixel is increasingly affected by light that is actually hitting its neighbors but bleeds over into it.

Film cameras use a mechanical shutter, a delicate mechanism that opens in front of the film and then closes again after the time specified by the shutter speed has expired. Larger digital cameras still use mechanical shutters, but smartphones and other small digital cameras use an electronic shutter.
Many of these, including iOS devices, use a so-called rolling shutter, which reads out the image data line by line. Since the lines are not read out at the same moment, but one after another, this can lead to odd artifacts in photos of fast-moving objects.
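To get a feel for why fast motion gets skewed, here’s a small, purely illustrative Swift sketch (a toy model, not how the hardware actually works): it “photographs” a vertical bar that moves to the right while the sensor reads out one row after another, so each row sees the bar at a slightly later moment.

```swift
// Toy rolling-shutter simulation: the sensor reads one row at a time,
// so each row captures the scene at a slightly later point in time.
let width = 24            // columns in the toy sensor
let height = 8            // rows, read out from top to bottom
let rowReadoutTime = 1.0  // time needed to read a single row (arbitrary units)
let barSpeed = 1.5        // columns the bar moves per time unit

var rows = [String]()
for row in 0..<height {
    let captureTime = Double(row) * rowReadoutTime
    // Where the (vertically straight) bar is at the moment this row is read out.
    let barColumn = Int(2.0 + barSpeed * captureTime)
    rows.append((0..<width).map { $0 == barColumn ? "#" : "." }.joined())
}
print(rows.joined(separator: "\n"))
// The bar is perfectly vertical in the real scene, but it comes out as a
// diagonal streak, because every row saw it at a different point in time.
```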
Some of these artifacts can be quite funny.

Now we know how the iPhone measures how much light hits each pixel. But that alone would only result in a black-and-white photo. Color photos require additional technologies. Before we dive into those, let’s take a look at what color actually is; we’ll sidetrack a bit to scratch the surface of what is known as color science. It may seem obvious that a deep green forest is deep green, and that a bright yellow bike is bright yellow. But what is this thing called “color”?

When working with computers, we might be tempted to answer that a particular color is just a combination of certain amounts of red, green, and blue. But in reality, things are more complicated. Color is “the attribute of visual perception consisting of any combination of chromatic and achromatic content.”
This definition is somewhat recursive: it defines color by referring to color itself (“chromatic” is just another word for colored). The important takeaway from the above is this: color is a visual perception. Someone has to be looking at the thing for there to be color; color doesn’t exist outside our perception. You need a light source and something that reflects this light.

