How Your iPhone Camera Works
You took a photo with your iPhone, and your camera just turned photons into bits. Let’s say you’re outside, looking around you. The sun, 150 million kilometers away, is emitting photons. These travel from the sun to our cozy planet in about eight minutes. Some of these photons are reflected off the things around you and hit the retina inside your eyes, which triggers your brain to create an image, a visual representation of your surroundings. Photography is about capturing that image.

Photography was invented some 200 years ago. And even before then, for thousands upon thousands of years, humans tried to capture what they saw by making drawings. Today, most of us carry a camera with us almost every minute of the day; smartphones are among the most used cameras. Before the digital era, photography recorded light onto paper or film.

Today, photography turns light into bits and bytes. This article will go through some of what makes this happen — how a smartphone camera works. But before we dive into the magic of turning photons into a JPEG file, we will take a look at some general concepts of how a photo comes to life.

These concepts are as true today as they were back in the days of film photography. Not so long ago, almost all photography was done with film: a photochemical process triggered by light captured the image, whereas today we use a digital image sensor. But everything else about taking pictures is based on the same principles, so a lot of what was true for bulky film-based cameras still applies when we’re shooting images with an iPhone. The process of capturing a single image is sometimes referred to as an exposure.

Exposure also refers to the amount of light per unit area. This amount of light needs to be within a certain range. If we don’t capture enough light, the image will be underexposed — the image is drowning in the inherent noise floor of the image sensor or film. If we capture too much light, the image will be overexposed — the image sensor/film is too saturated and can no longer differentiate between different amounts of light, meaning all areas will seem to have the same exposure.
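
To make the noise floor and saturation concrete, here is a toy model of a single pixel’s response, not a real sensor pipeline; the threshold values are invented purely for illustration:

```swift
import Foundation

// Toy model: map incoming light to a digital pixel value. Readings below
// the noise floor drown in sensor noise; readings above the saturation
// point all clip to the same value. Both thresholds are made up.
func pixelValue(light: Double) -> Double {
    let noiseFloor = 0.02  // below this, signal is indistinguishable from noise
    let saturation = 1.0   // above this, the sensor can no longer differentiate
    guard light >= noiseFloor else { return 0.0 }  // underexposed: detail lost
    return min(light, saturation)                  // overexposed: detail clipped
}

pixelValue(light: 0.01)  // 0.0: drowned in the noise floor
pixelValue(light: 0.5)   // 0.5: within range, detail preserved
pixelValue(light: 3.0)   // 1.0: clipped, like any other very bright value
```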

When taking a photo, we must adjust the camera so that the amount of light stays within this range. Here are samples of the same scene, underexposed and overexposed. The right-hand side shows the same image with the exposure adjusted in Pixelmator. In the underexposed image, even after trying to make it brighter, the dark regions are “stuck” at black, and there is no way to make out that the pens in the image actually have different colors.

The overexposed image has large regions that are stuck at the same level of white/gray. Note how the pattern on the fabric band and the coins is completely lost. Three things affect the amount of light in an exposure: shutter speed, ISO value, and aperture. We will go through these in a bit.

The tricky part is that all three (shutter speed, ISO, and aperture) also affect other aspects of the exposure, and there are countless combinations of these three parameters that result in the same amount of light, as the sketch below illustrates.
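
One common way to see this equivalence is the exposure value, EV = log₂(N²/t) for f-number N and shutter speed t: combinations with the same EV admit the same amount of light, and a difference of one EV is one stop. A quick sketch (the specific apertures and speeds are just example numbers, not taken from the article):

```swift
import Foundation

// Exposure value: EV = log2(N^2 / t), where N is the f-number and t the
// shutter speed in seconds. Same EV means the same amount of light.
func exposureValue(fNumber: Double, shutterSpeed: Double) -> Double {
    log2(fNumber * fNumber / shutterSpeed)
}

exposureValue(fNumber: 2.2, shutterSpeed: 1.0 / 50.0)  // ≈ 7.92
exposureValue(fNumber: 4.4, shutterSpeed: 1.0 / 12.5)  // ≈ 7.92: same light,
// because a two-stops-smaller aperture is offset by a two-stops-longer shutter
```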

Let’s take a closer look at each of these, starting with the shutter speed. When we capture an image, the image sensor captures light for a specific amount of time. This duration is called the shutter speed, because it describes how fast the shutter opens and closes. A shutter speed of 1/50 s means that the image sensor captures light for one fiftieth of a second (0.02 s or 20 ms). If we change the shutter speed to 1/25 s (40 ms), the image sensor will capture light for twice as long, and it will capture twice the number of photons, i.e. twice the amount of light. A so-called stop for the shutter speed either doubles or halves the shutter speed.

Going from 1/50 s to 1/25 s is an adjustment by one stop. The iPhone 6 can adjust the shutter speed from 1/8000 s up to 1/2 s. We can change the shutter speed to adjust the amount of light, but it will also affect the motion blur in the image.
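
On iOS, the shutter speed can be set directly through AVFoundation’s custom exposure mode. Here is a minimal sketch, assuming the Swift names of that API; the helper name, the timescale, and the error handling are my own illustrative choices:

```swift
import AVFoundation

// Minimal sketch: set a custom shutter speed (exposure duration) on the
// default camera while keeping the current ISO.
func setShutterSpeed(seconds: Double) {
    guard let device = AVCaptureDevice.default(for: .video),
          device.isExposureModeSupported(.custom) else { return }
    do {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }
        // Clamp the requested duration to the range the active format supports.
        let requested = CMTime(seconds: seconds, preferredTimescale: 1_000_000)
        let clamped = CMTimeMinimum(
            CMTimeMaximum(requested, device.activeFormat.minExposureDuration),
            device.activeFormat.maxExposureDuration)
        // Passing AVCaptureDevice.currentISO leaves the ISO unchanged while
        // the exposure duration is set explicitly.
        device.setExposureModeCustom(duration: clamped,
                                     iso: AVCaptureDevice.currentISO,
                                     completionHandler: nil)
    } catch {
        print("Could not lock the camera for configuration: \(error)")
    }
}

// Example: a 1/25 s exposure, one stop more light than 1/50 s.
setShutterSpeed(seconds: 1.0 / 25.0)
```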