Stanford EE259 | Lenses, image sensors, image signal processing | 2023 | Lecture 20
05 Feb 2024
- Image formation in cameras involves light rays passing through imaging optics and focusing on an image sensor.
- Digital image sensors consist of pixels that measure light intensity at different wavelengths, allowing for color cameras and specialized cameras like thermal cameras.
- Exposure (H) is the light energy per unit area reaching the image sensor: the product H = E × T of irradiance (E) and exposure duration (T). A numeric sketch follows this list.
- Aperture size controls irradiance, with larger apertures increasing light and reducing depth of field (range of distances in focus).
- Shutter speed controls exposure duration, with slower speeds increasing motion blur and faster speeds freezing motion.
- The correct exposure ensures highlights and shadows are properly represented in the final image, avoiding underexposure (too dark) or overexposure (washed out).
- Lenses are essential for image formation: they focus the light leaving each scene point onto a single point in the image plane.
- The focal length of a lens determines the field of view of the camera.
- The F number of a lens determines the effective aperture size and affects the depth of field.
- Geometric optics is a framework used to analyze lenses and image formation, but it ignores wave effects such as diffraction and interference.
- Ideal or thin lenses have a simple geometry and can be analyzed using geometric optics principles.
- The focal length of an ideal lens is determined by its geometry and material: the radii of curvature of its two surfaces and the refractive index of the glass, as captured by the lensmaker's equation (written out after this list).
- The Gaussian lens formula relates the object distance, image distance, and focal length of a thin lens.
- The lateral magnification of a thin lens is the ratio of the image height to the object height and is negative, indicating an inverted image.
- Thick lenses require additional parameters beyond the focal length and radii of curvature, such as the locations of their principal planes.
- Ray tracing principles apply to both thin and thick lenses, and the same image formation principles hold.
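A minimal numeric sketch of the exposure relationships above (all values hypothetical). Exposure is H = E × T, and sensor-plane irradiance scales as 1/N² with the F-number N, so trading one stop of aperture against one stop of shutter speed leaves H unchanged:

```python
def exposure(E_base, N, T):
    """Exposure H [J/m^2] given irradiance E_base [W/m^2] at N = 1,
    F-number N, and exposure duration T [s]."""
    E = E_base / N**2   # irradiance falls off as 1/N^2
    return E * T        # H = E * T

H1 = exposure(E_base=100.0, N=4.0, T=1/125)   # baseline
H2 = exposure(E_base=100.0, N=2.8, T=1/250)   # one stop wider, one stop faster
print(H1, H2)  # ~0.050 vs ~0.051 J/m^2 -> effectively the same exposure
```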
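Written out, the lens relations these bullets refer to, in standard thin-lens notation (the symbol names here are the conventional ones, not necessarily the lecture's exact labels):

```latex
% Thin-lens relations: s_o = object distance, s_i = image distance,
% n = refractive index, R_1, R_2 = surface radii, d = sensor width.
\begin{align*}
  \frac{1}{f} &= (n - 1)\left(\frac{1}{R_1} - \frac{1}{R_2}\right)
      && \text{(lensmaker's equation)} \\
  \frac{1}{s_o} + \frac{1}{s_i} &= \frac{1}{f}
      && \text{(Gaussian lens formula)} \\
  m &= \frac{h_i}{h_o} = -\frac{s_i}{s_o}
      && \text{(lateral magnification; $m<0$ means inverted)} \\
  \mathrm{FOV} &= 2\arctan\frac{d}{2f}
      && \text{(field of view for sensor width $d$)}
\end{align*}
```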
Focusing and Depth of Field
- For a fixed focal length lens and a fixed image sensor position, only objects at a specific distance will be in perfect focus.
- To focus on objects at different distances, the sensor must be moved relative to the lens.
- Lenses transform a 3D object or scene into a 3D image, but sensors can only capture a 2D slice of that image.
- Changing the focus distance moves the sensor relative to the lens to bring different planes into focus.
- Changing the focal length affects the field of view: longer focal lengths reduce the field of view, giving a zooming effect.
- The F-number (N = f/A, the focal length divided by the aperture diameter) sets the effective aperture size and thus the total light energy reaching the sensor.
- Aperture size also affects the depth of field, which is quantified using the permissible circle of confusion.
- The permissible circle of confusion (C) is the largest blur spot on the image plane that still appears acceptably sharp, i.e., effectively in focus.
- C is typically taken to be the size of one pixel for digital sensors.
- The depth of field (D) is the range of object distances over which subjects appear acceptably sharp.
- The depth of field depends on the F-number (N), the distance from the lens to the in-focus object plane (U), and the focal length (f); the sketch after this list works a numeric example.
- The depth of field is proportional to the F-number: a larger F-number (smaller aperture) yields a greater depth of field.
- The depth of field is proportional to the square of the distance from the lens to the in-focus object plane, meaning that objects that are closer to the lens will have a shallower depth of field.
- The depth of field is inversely proportional to the square of the focal length, meaning that lenses with shorter focal lengths will have a greater depth of field.
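A small Python sketch tying the focusing and depth-of-field relations together. Values are hypothetical, and the closed form D ≈ 2NCU²/f² is the standard far-field approximation consistent with the proportionalities above, not a formula quoted from the lecture:

```python
import math

f = 0.050         # 50 mm focal length [m]
N = 2.8           # F-number
C = 4e-6          # circle of confusion ~ one 4 um pixel [m]
U = 3.0           # distance to the in-focus object plane [m]

# Gaussian lens formula solved for the sensor (image) distance:
# 1/U + 1/v = 1/f  =>  v = 1 / (1/f - 1/U)
v = 1.0 / (1.0 / f - 1.0 / U)

# Field of view for a sensor of width w: FOV = 2 * atan(w / (2 f))
w = 0.036         # full-frame sensor width [m]
fov_deg = math.degrees(2 * math.atan(w / (2 * f)))

# Depth of field, far-field approximation (U >> f):
# D ~ 2 N C U^2 / f^2 -- proportional to N and U^2, inversely
# proportional to f^2, matching the bullets above.
D = 2 * N * C * U**2 / f**2

print(f"sensor distance v = {v * 1000:.2f} mm")   # ~50.85 mm
print(f"field of view     = {fov_deg:.1f} deg")   # ~39.6 deg
print(f"depth of field    = {D * 100:.1f} cm")    # ~8.1 cm
```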
CMOS Image Sensors
- CMOS sensors are more commonly used in digital imaging due to their flexibility, cost-effectiveness, and good performance.
- CMOS APS (Active Pixel Sensor) is a type of CMOS sensor that consists of an array of active photosensitive pixels.
- Each pixel in a CMOS APS consists of a photosensitive component (a reverse-biased PIN photodiode), readout circuitry, and reset circuitry.
- The 3T CMOS APS (3-Transistor CMOS APS) is the most widely used architecture for small, inexpensive cameras due to its simplicity, practicality, and good performance.
- MOSFET transistors are used as switches in the 3T CMOS APS design, operating either in the on or off state.
- Each MOSFET switch is controlled by its terminal voltages: the gate-to-source voltage (Vgs) turns it on or off (it conducts once Vgs exceeds the threshold voltage), while the drain-to-source voltage (Vds) determines the current it carries when on. A toy pixel-cycle simulation follows below.
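A behavioral (not circuit-accurate) sketch of one 3T pixel cycle under the assumptions above, with hypothetical component values. Reset pulls the photodiode node high, photocurrent then discharges the diode's capacitance during integration, and the row-select switch routes the buffered voltage to the column line:

```python
V_RESET = 3.3      # reset voltage [V]
C_PD = 10e-15      # photodiode capacitance [F]

def pixel_cycle(photocurrent, t_int):
    """Return the voltage read out after integrating `photocurrent` [A]
    for `t_int` [s]. Clamps at 0 V to model saturation (full well)."""
    v = V_RESET                          # reset switch on, then off
    dv = photocurrent * t_int / C_PD     # Q = I*t discharges C: dV = I*t/C
    v = max(v - dv, 0.0)                 # integration; clamp models saturation
    return v                             # row-select on -> source follower output

dark = pixel_cycle(photocurrent=10e-15, t_int=10e-3)    # dim pixel  -> ~3.29 V
bright = pixel_cycle(photocurrent=2e-12, t_int=10e-3)   # bright pixel -> ~1.3 V
print(dark, bright)  # brighter light -> larger discharge -> lower voltage
```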