LandEye DOCS

Help Center, tutorials, and technical articles to help you explore, order, and manage satellite and aerial imagery with LandEye.

Optical Imagery Fundamentals

Overview

Optical satellite imagery relies on passive sensors that measure sunlight reflected from Earth's surface. The sensors convert this reflected energy into images similar to what the human eye sees, while also capturing spectral information beyond the visible range.
Optical sensors include:
Panchromatic (single grayscale band, high spatial detail)
Multispectral (multiple broad bands for land, vegetation, water analysis)
Hyperspectral (hundreds of narrow bands for material-level identification)

When to Use Optical Imagery?

This type of Earth imagery is helpful when users require:
• Natural color visualization
• Land cover classification
• Vegetation, water, soil, and environmental analysis
• Urban and infrastructure mapping
• Change detection in cloud-free conditions

How Optical Sensors Work

Two components define how optical imagery works:

  1. Passive sensing: Optical sensors are passive instruments that don’t emit energy. These sensors measure how surface materials, including vegetation, water, soil, and urban surfaces, reflect sunlight across a range of wavelengths.
  2. Spectral bands: Optical satellites use visible bands (RGB) for capturing natural color, near-infrared (NIR) for vegetation health and vigor, and shortwave infrared (SWIR) for moisture, minerals, and burn severity. This means that each band offers a different layer of information about surface properties.

Reflectance and Raw Pixel Values (DN)

Raw optical imagery stores Digital Numbers (DNs): integer pixel values that represent the amount of light the sensor detected. Calibration converts DNs into Top-of-Atmosphere (TOA) reflectance and, after atmospheric correction, into surface reflectance.
This allows users to compare images across dates, sensors, and various environmental conditions.
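The DN-to-reflectance step above can be sketched in a few lines. This is a minimal illustration, not the calibration routine of any specific sensor: the gain and offset values are hypothetical placeholders for the rescaling coefficients that real products ship in their metadata, and the solar correction divides by the cosine of the sun zenith angle (90° minus sun elevation).

```python
import math

def dn_to_toa_reflectance(dn, gain, offset, sun_elevation_deg):
    """Convert a raw Digital Number (DN) to Top-of-Atmosphere reflectance.

    gain and offset are sensor calibration coefficients taken from the
    image metadata (hypothetical values here). The result is corrected
    for illumination by dividing by cos(sun zenith angle).
    """
    toa = gain * dn + offset                          # linear rescaling of the DN
    sun_zenith = math.radians(90.0 - sun_elevation_deg)
    return toa / math.cos(sun_zenith)                 # illumination correction

# Illustrative coefficients only -- real values come from product metadata:
rho = dn_to_toa_reflectance(dn=9500, gain=2.0e-5, offset=-0.1,
                            sun_elevation_deg=45.0)
```

Atmospheric correction to surface reflectance is a separate, more involved step performed by dedicated processing chains.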

Resolution Fundamentals

In optical satellite imagery, resolution refers to the amount of detail a sensor can capture across different dimensions. Optical systems are characterized by four independent resolution types:

Spatial Resolution

Spatial resolution defines the area of ground represented by each pixel and is typically expressed as GSD (Ground Sampling Distance), for example, 30cm, 1m, or 10m.
This type of resolution controls the detectability of objects in the captured image and is influenced by sensor optics, satellite altitude, and off-nadir angle.
Resolution for optical imagery is classified as follows:
Very High Resolution (VHR): < 0.5 m (e.g., 30–50 cm)
High Resolution (HR): 0.5–2 m
Medium Resolution (MR): 5–20 m
Low Resolution (LR): > 20 m
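The classes above can be expressed as a simple lookup. Note that the published bands leave a gap between 2 m and 5 m; this sketch assigns that range to Medium Resolution, which is one reasonable convention rather than a fixed standard.

```python
def resolution_class(gsd_m):
    """Map a Ground Sampling Distance (metres) to a resolution class.

    Class boundaries follow the ranges listed above; the 2-5 m gap is
    assigned to MR here as an illustrative convention.
    """
    if gsd_m < 0.5:
        return "VHR"   # Very High Resolution, e.g. 30-50 cm
    if gsd_m <= 2:
        return "HR"    # High Resolution
    if gsd_m <= 20:
        return "MR"    # Medium Resolution
    return "LR"        # Low Resolution

# Example: a 30 cm product classifies as VHR, a 10 m product as MR.
```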

Spectral Resolution

Spectral resolution defines the number and width of spectral bands the sensor can detect. More bands mean more detailed material information.
Higher spectral detail improves the identification of vegetation types, soil conditions, water quality, minerals, and surface materials that cannot be distinguished in standard RGB imagery.
Common spectral groups are:
Visible (Blue, Green, Red): Natural color
Near-Infrared (NIR): Vegetation condition
Shortwave Infrared (SWIR): Moisture, minerals, burn severity
Thermal Infrared (TIR): Surface temperature

Panchromatic (PAN)

This is a single broad band that delivers very high spatial detail, for example, 30–50 cm. The PAN output is grayscale but sharp.

Multispectral (MS)

MS mode uses several broad spectral bands to provide both spatial detail and spectral diversity. This mode is suitable for land cover, NDVI, and environmental analysis.
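NDVI, mentioned above, is a standard multispectral product: it contrasts NIR and Red reflectance, since healthy vegetation reflects strongly in NIR and absorbs Red. A minimal sketch, assuming band values are already expressed as reflectance:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and Red reflectance.

    Returns a value in [-1, 1]: high positive for dense vegetation,
    near zero for bare soil, negative for water.
    """
    return (nir - red) / (nir + red)

# Healthy vegetation: strong NIR reflection, strong Red absorption.
veg = ndvi(nir=0.45, red=0.05)    # high positive value
# Water absorbs NIR, so the index goes negative.
water = ndvi(nir=0.02, red=0.04)  # negative value
```

In practice the same formula is applied per pixel across whole band arrays.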

Hyperspectral

Uses hundreds of narrow, contiguous bands, useful for material identification. Hyperspectral mode delivers lower spatial resolution but extremely rich spectral data, suitable for detecting minerals, mapping vegetation species, and analyzing soil chemistry.

Radiometric Resolution

This type of resolution defines how precisely the sensor records brightness levels and is determined by bit depth, or the number of possible intensity values per pixel. Radiometric resolution is useful for quantitative analysis or for comparing images across time.
Higher radiometric resolution enhances:
• Shadow detail
• Subtle tone differences in vegetation
• Water and soil boundary detection
• Low-light or high-contrast scenes
Common bit depths are:
8-bit: 256 values
12-bit: 4096 values
16-bit: 65,536 values
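The bit-depth figures above follow directly from the definition: an n-bit pixel can take 2ⁿ distinct values. A quick check:

```python
def intensity_levels(bit_depth):
    """Number of distinct brightness values per pixel for a given bit depth."""
    return 2 ** bit_depth

# Reproduces the table above:
for bits in (8, 12, 16):
    print(f"{bits}-bit: {intensity_levels(bits)} values")
```

Doubling the bit depth from 8 to 16 multiplies the number of representable brightness levels by 256, which is why 12- and 16-bit products preserve far more shadow and highlight detail.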

Temporal Resolution

Temporal resolution refers to how often a satellite revisits and images the same location. The revisit frequency is determined by:
• Constellation size (more satellites = faster revisits)
• Orbit characteristics
• Off-nadir tasking capabilities
Environmental factors also influence this type of resolution. Cloud cover can significantly reduce the number of usable observations. Variations in the sun's angle can affect brightness and shadow length.
High temporal resolution is necessary for:
• Monitoring change over days and weeks
• Detecting short-term events
• Agricultural growth cycles
• Disaster and environmental management

Acquisition Geometry

This type of imagery is affected by the relative angles between sun, sensor, and ground:
Nadir: Sensor points straight down; minimal distortion and best spatial resolution.
Off-nadir: Sensor is tilted away from vertical, allowing faster access to a location but increasing shadows and geometric distortion.
Sun geometry is defined by sun azimuth (horizontal direction) and sun elevation (height above horizon). It controls shadow length, brightness, and overall scene illumination. Low sun angles increase shadows and contrast; high sun angles reduce shadows but may cause glare.
Sun geometry directly affects reflectance, influencing vegetation indices and land cover analysis.
Viewing angle also matters. Off-center (off-nadir) viewing introduces parallax, in which elevated objects shift depending on the viewing angle.
Parallax helps 3D reconstruction (stereo imagery) but reduces positional accuracy in single images. Steep viewing angles increase terrain displacement and reduce measurement reliability. Large angles reduce clarity by amplifying atmospheric path length.
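The effect of sun elevation on shadow length described above follows simple trigonometry: on flat terrain, a vertical object of height h casts a shadow of length h / tan(elevation). A small sketch (idealized, ignoring terrain slope and atmospheric effects):

```python
import math

def shadow_length(object_height_m, sun_elevation_deg):
    """Ground shadow length cast by a vertical object on flat terrain.

    Idealized model: shadow = height / tan(sun elevation). Assumes level
    ground and ignores terrain slope and viewing-angle effects.
    """
    return object_height_m / math.tan(math.radians(sun_elevation_deg))

# A 20 m building under a low 30-degree sun casts a much longer shadow
# than under a high 60-degree sun:
low_sun = shadow_length(20, 30)
high_sun = shadow_length(20, 60)
```

This is also why winter acquisitions at high latitudes, where the sun stays low, show markedly longer shadows than summer scenes of the same area.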

Atmospheric and Environmental Factors

These atmospheric and environmental factors influence optical imagery:

Cloud, haze, fog
Block or scatter sunlight, reducing visibility and contrast; often make optical imagery partially or fully unusable.
Aerosols and particulates
Dust, smoke, and pollution scatter incoming light, causing brightness changes and color shifts.
Shadowing
Sun angle and terrain create shadows that hide features and reduce interpretability, especially in urban or mountainous areas.
Seasonal Variations
Vegetation cycles, snow, and moisture differences alter reflectance patterns across seasons, affecting time-series comparisons.
Surface vs TOA Reflectance
TOA: raw reflectance including atmospheric effects. Surface: corrected reflectance suitable for analysis and multi-date consistency.

Conclusion

Optical imagery is only one part of the broader Earth observation landscape. To build a complete understanding of EO data and choose the most suitable source for each scenario, you can explore the related KB articles on SAR imaging, hyperspectral sensing, and stereo acquisition. These topics expand on the capabilities and limitations of each data type, helping you make informed decisions across a wide range of use cases.