Processing Workflows for Airborne Remote Sensing

08/07/2011
Koen Meuleman, Jan Biesemans, Tim Deroose, Kristin Vreys, Walter Horsten, Sindy Sterckx,
Dries Raymaeckers
Content
Introduction: motivation
Processing workflows: functional flows
Processing workflows: middleware
Processing workflows: hardware
Archiving workflow (Level0 – Level1)
Processing workflow (Level1 – Level2/3/4)
Algorithms Level0 – Level1:
• Interest point selection
• Co-registration
• Block bundle adjustment
Algorithms Level1 – Level2:
• Ortho-rectification
• DEM/DSM generation
• MODTRAN4
• Water vapor, visibility
• Haze removal, shadow removal
• Atmospheric BRDF, topographic BRDF, target BRDF
Algorithms Level2 – Level3: mosaic generation
Algorithms Level2/3 – Level4: classification algorithms, change detection, …
Algorithmic components: Software development requirements
08/07/2011
© 2009, VITO NV – All rights reserved
2
Introduction: motivation
[Figure: sensor types: video, pushbroom, whiskbroom, frame camera]
The software development process of the processing workflows for airborne remote sensing started in July 2004 and was triggered by:
• PEGASUS, MERCATOR-1 (Flemish Government): High-Altitude Long-Endurance solar-powered UAV (operations between 14 and 20 km).
• MEDUSA Phase C/D (ESA PRODEX): design & assembly of an ultra-light RGB camera (< 2 kg, data throughput around 20 Mbit/s) for UAV platforms (i.e. the Mercator-1 platform) in support of photogrammetric and disaster-management applications.
• AGIV (Flemish Government): prototyping photogrammetric workflows for GRB anomaly detection. The GRB is the large-scale geographic reference database of Flanders, containing a set of GIS and CAD layers for roads, road elements, buildings, railways, waterways and parcels.
• APEX (ESA PRODEX): 312 spectral rows in the VNIR and 195 spectral rows in the SWIR; data throughput is around 500 Mbit/s.
• BELSPO-funded hyperspectral campaigns: CASI-SASI in 2002, CASI-ATM in 2003, HYMAP in 2004, the AHS160 and CASI550 campaign in 2005, the AHS160 & CASI550 & AISA & UltraCAM campaigns in 2007.
• FP6 OSIRIS (Sensor Web): enabling the user to deploy, task and query sensors in support of disaster-management applications that require near-real-time information (demonstrated in 2009 in a forest-fire application).
• RIMS (BELSPO): generating image mosaics from video in support of disaster-management or situation-awareness services.
• ISYSS (IBBT): sensor positioning, event detection, hardware implementations.
Introduction: development strategy
Project-driven: user requirements and system requirements are not always constant (especially in prototyping research projects), which leads to an evolutionary life-cycle model (i.e. building a larger prototype system by pooling single projects to serve a common objective).
Introduction: development strategy
[Diagram: a research, assembly and integration test platform feeding the APEX Operational platform V1 and the AGIV Operational platform V1, evolving through V2 and V3 under successive Service Level Agreements (SLA 1, SLA 2, SLA 3) during 2011-2012; an archive holds raw data + metadata & data products. Innovation happens by means of classical "project work" (FP7, BELSPO, IWT, …) with AGIV, VITO, universities, companies, …]
Processing Workflows: functional flow “classical” airborne missions
"Classical" airborne applications:
• Very high resolution and high-precision mapping using photogrammetric camera systems (DMC, Ultracam, …) for civil structure mapping and civil structure change mapping.
• Mapping of the quality of the physical environment (air quality, water quality, land cover quality) using multispectral and hyperspectral camera systems (APEX, HYMAP, AHS160, HYSPEX).
The main user requirement is "high precision", both in georeferencing and in radiometric/spectral calibration.

Archiving workflow:
• Calibration Support Tools: support tools for checking the interior and exterior orientation parameters and the spectral and radiometric calibration.
• Archiving: the generation of standardized Level1 HDF5 files (i.e. "archive objects") from the raw incoming image data (Level0) and image metadata in the framework of long-term archiving strategies. Not all data arrives at Level0; sometimes only Level2, Level3 or intermediate image products are available. After the calibration checks, the incoming data is transformed to self-descriptive Level2 or Level3 HDF5 archive objects.

3D workflow:
The automated image-based generation of area-wide DEM and DSM models at a user-specified spatial resolution. Data-fusion aspects with LIDAR are not yet included.

Processing workflow:
This workflow is subdivided into three sub-systems:
• Level1 to Level2 processing workflow: the automated generation of orthorectified and atmospherically corrected single images.
• Level2 to Level3 workflow: the automated generation of image mosaic products.
• Level2/3 to Level4 change detection workflow:
  a. the generation of a soft classification based on Level2/3 image data, auxiliary data (DSM, texture parameters, …) and a "field-truth" data set;
  b. modules for interpreting the soft classification and creating a change-vector layer.

HDF5 (www.hdfgroup.org) is used for all product packaging and archiving.
Processing Workflows: functional flow near-real-time situation awareness services
Sensor Web Enablement (SWE):
[Figure: Mercator-Low (1/4-scale model of the Mercator-1) equipped with a PAL video camera (military test range, Limburg, Belgium, 6 June 2006)]
Processing Workflows: middleware for distributed computing
Middleware is computer software that connects software components or applications. The software
consists of a set of enabling services that allow multiple processes running on one or more
machines to interact across a network.
• Airborne missions generate thousands of images, hence the need for distributed computing and the need to choose patterns for parallelism:
  Master/Worker: the master application constructs a job list and maintains the job dependencies; worker applications ask the master for a job and execute it.
  Task/Data Decomposition: an algorithmic module is executed on smaller subsets of the data; the master application implements the task and data decomposition.
• Parallelism is implemented in the middleware, NOT in the applications (this keeps the C++/C/Fortran/Java/IDL code of the applications as simple as possible).
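The job-pulling Master/Worker pattern described above can be sketched as follows. This is an illustrative Python toy (the real middleware is Java over TCP/IP sockets); the class and function names are ours.

```python
import queue
import threading

class Master:
    """Constructs the job list and hands out jobs on request (job pulling)."""
    def __init__(self, jobs):
        self._jobs = queue.Queue()
        for job in jobs:
            self._jobs.put(job)

    def request_job(self):
        """Called by workers; returns None when no work is left."""
        try:
            return self._jobs.get_nowait()
        except queue.Empty:
            return None

def worker(master, results, lock):
    """Worker loop: pull a job, execute it, repeat until the master is empty."""
    while True:
        job = master.request_job()
        if job is None:
            break
        result = job()                 # execute the job (e.g. process one image)
        with lock:
            results.append(result)

# Example: 100 dummy "image processing" jobs spread over 4 worker threads.
master = Master([(lambda i=i: i * i) for i in range(100)])
results, lock = [], threading.Lock()
threads = [threading.Thread(target=worker, args=(master, results, lock))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Because workers pull jobs instead of the master pushing them, load balancing is automatic: a fast worker simply asks for more jobs.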
Processing Workflows: middleware for distributed computing
Middleware for cluster/distributed computing is developed by VITO:
• Message passing (over reliable TCP/IP sockets) and a job-pulling Master/Worker pattern developed in Java (VITO, Dept. Remote Sensing).
• Multiple masters can run next to each other; multiple masters can run on one machine.
• Masters can be configured to take only processing jobs from specific registered users/operators.
• One worker per machine. Workers keep multiple threads alive which invoke the running of applications; the number of threads can be altered on-the-fly.
• Workflow Monitoring Software: a Java GUI application for monitoring and configuring the processing cluster.
Processing Workflows: Hardware Components
» Database Server
» Master Nodes (assigned to specific user and workflow types)
» Worker Nodes (assigned to a specific Master)
» Web Server
» Storage:
  » Long-Term Data Archive:
    » optimized for growth and expansion
    » contains L1/L2/L3/L4 archived products
  » Short-Term Storage:
    » optimized for throughput
    » contains temporary data between processing steps
Processing Workflows: Hardware Components
Prototype Research and Development Platform
Operational Platform
Total 172 cores on 19 machines
Archiving workflow: geometric calibration
Quality control and transformation to internal standards:
• Interior orientation (FOV, focal length, CCD properties, …)
• Exterior orientation:
  1. dGPS correction (POSPac)
  2. GPS X, Y, Z & IMU roll, pitch, yaw
  3. Block bundle adjustment
  4. Boresight angles
• Coordinate transformations (internal standard for line sensors: native GPS coordinate system; for frame cameras: the coordinate system of the block bundle adjustment, but with the full specification for transformation towards the GPS coordinate system).

Geometric calibration using GCPs via a Monte Carlo simulation (10,000 runs). APEX test-flight example (October 2008).
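As an illustration of the Monte Carlo idea, the toy sketch below recovers a single boresight angle from GCP observations by sampling candidate angles and keeping the best fit. The actual VITO procedure runs 10,000 simulations over the full interior/exterior orientation; all numbers and names here are ours.

```python
import math
import random

random.seed(42)

TRUE_BORESIGHT = math.radians(0.35)   # unknown angle we try to recover
ALTITUDE = 2000.0                     # flying height above ground (m)

def ground_offset(boresight, scan_angle):
    """Across-track ground position of a pixel for a given boresight error."""
    return ALTITUDE * math.tan(scan_angle + boresight)

# Simulated GCP observations: measured ground offsets include the true
# boresight effect plus ~0.2 m of measurement noise.
scan_angles = [math.radians(a) for a in range(-30, 31, 5)]
observed = [ground_offset(TRUE_BORESIGHT, s) + random.gauss(0.0, 0.2)
            for s in scan_angles]

def rmse(boresight):
    errs = [ground_offset(boresight, s) - o
            for s, o in zip(scan_angles, observed)]
    return math.sqrt(sum(e * e for e in errs) / len(errs))

# Monte Carlo: sample candidate boresight angles, keep the best-fitting one.
best = min((random.uniform(math.radians(-1), math.radians(1))
            for _ in range(10000)), key=rmse)
error_deg = abs(math.degrees(best - TRUE_BORESIGHT))
```

With 10,000 samples the recovered angle lands within a few thousandths of a degree of the truth, which is why the GCP residual statistics from such runs are a useful calibration quality metric.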
APEX PAF - level 0 -1 Processing
[Diagram: RSL development and VITO development. Archiving workflow inputs: Level 1 X data, Applanix data, configuration XMLs; PosPac processing. Output: APEX Level 1 standard product (HDF5), which can be delivered to the users (ftp, hard disk)]
Archiving workflow: Spectral calibration
Spectral calibration, i.e. the determination of shifts in central wavelength and bandwidth, is only possible if the spectral resolution is fine enough (i.e. fine enough to see the effect in absorption regions).

[Figure: at-sensor radiance spectrum, 400-1000 nm, with O2 and H2O absorption features and a Fraunhofer line marked]

CASI reflectance spectra of a sand pixel. Left: original, with a wrong wavelength calibration file. Right: after recalibration of the sensor by the subcontractor: the reflectance spectra in the blue region are correct and fewer spikes are observed near the absorption bands (although some spikes are still visible near the O2 band).
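The absorption-feature matching idea can be sketched as a toy example: the O2-A feature near 760 nm is compared against a modeled transmittance to estimate a wavelength shift. The Gaussian line shape and all numbers below are ours, for illustration only.

```python
import math

def transmittance(wl_nm):
    """Idealized atmospheric transmittance with an O2-A dip at 760.5 nm."""
    return 1.0 - 0.6 * math.exp(-((wl_nm - 760.5) ** 2) / (2 * 1.5 ** 2))

TRUE_SHIFT = 1.2   # nm: the sensor's wavelength file is off by this much

# Nominal band centers and the signal a shifted sensor would record.
bands = [750.0 + 0.5 * i for i in range(41)]          # 750-770 nm
measured = [transmittance(wl + TRUE_SHIFT) for wl in bands]

def misfit(shift):
    """Squared mismatch between model and measurement for a trial shift."""
    return sum((transmittance(wl + shift) - m) ** 2
               for wl, m in zip(bands, measured))

# Scan candidate shifts and keep the best match against the modeled feature.
candidates = [s / 100.0 for s in range(-300, 301)]    # -3.0 … +3.0 nm
estimated_shift = min(candidates, key=misfit)
```

In real processing the "model" side would come from a radiative transfer code rather than a single Gaussian, but the shift estimation principle is the same.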
Archiving workflow: file format and unit standardization
Function: after quality control, geometric calibration, spectral calibration and radiometric calibration: (a) the packaging of image data and image metadata in one single "archive object", i.e. a Level1 HDF5 file; (b) the production of an orthorectified quicklook.

All products are distributed in the self-descriptive HDF5 format to ensure that data and metadata are kept together. HDF5 is the standard image product distribution format of USGS and NASA.
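A minimal sketch of such an "archive object", using h5py: image data and its metadata packaged together in one self-descriptive HDF5 file. The group and attribute names below are ours, not VITO's actual product layout.

```python
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "level1_archive_object.h5")

image = np.random.randint(0, 4096, size=(4, 64, 64), dtype=np.uint16)

with h5py.File(path, "w") as f:
    ds = f.create_dataset("image/radiance", data=image, compression="gzip")
    ds.attrs["units"] = "scaled DN"
    meta = f.create_group("metadata")
    meta.attrs["sensor"] = "AHS160"
    meta.attrs["processing_level"] = "L1"
    meta.attrs["acquisition_date"] = "2008-10-01"

# Data and metadata travel together: reopening the file recovers both.
with h5py.File(path, "r") as f:
    level = f["metadata"].attrs["processing_level"]
    shape = f["image/radiance"].shape
```

Because every dataset carries its own attributes, the file remains interpretable years later without external documentation, which is the point of the long-term archiving strategy.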
Processing workflow (Level1 to Level2/3/4)
Subdivided into 3 workflows grouped in one single WWW user interface:
• Level1 (raw) to Level2 (geometrically and atmospherically corrected block of images)
• Level2 to Level3 (mosaic of Level2)
• Level2/3 to Level4 (e.g. change detection products, soft classifications, …)

This allows for:
(a) different processing "entry points";
(b) the option to forward Level2, 3 or 4 products into the archive system.
Algorithms Level1-Level2: Orthorectification
Because of:
• the need for better interfacing with MODTRAN4,
• the requirement that all Level2 algorithms have to work on the raw sensor geometry, with resampling done at the very end of the workflow, and
• the need to support frame, whiskbroom and pushbroom sensors,
it was decided to develop an in-house C++ module. This C++ module was fully validated against the Inpho OrthoVista package using UltracamD imagery and against PARGE using AHS data.
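The core geometric step can be sketched for a frame sensor: cast the ray through one pixel, rotate it by the exterior orientation, and intersect it with the terrain. This is a hedged toy (flat DEM, no distortion model, our own conventions), not the in-house C++ module.

```python
import math

def rotation_yaw_pitch_roll(yaw, pitch, roll):
    """3x3 rotation matrix (radians), applied as R_z(yaw) R_y(pitch) R_x(roll)."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def orthorectify_pixel(x_img, y_img, focal, cam_xyz, R, ground_z):
    """Ground (X, Y) hit by the ray through image point (x_img, y_img)."""
    # Ray in the camera frame (image plane at -focal, looking down).
    ray_cam = (x_img, y_img, -focal)
    # Rotate the ray into the mapping frame.
    ray = [sum(R[i][j] * ray_cam[j] for j in range(3)) for i in range(3)]
    # Scale the ray until it reaches the terrain elevation.
    s = (ground_z - cam_xyz[2]) / ray[2]
    return (cam_xyz[0] + s * ray[0], cam_xyz[1] + s * ray[1])

# Nadir-looking camera at 1000 m: the principal point maps straight down.
R = rotation_yaw_pitch_roll(0.0, 0.0, 0.0)
gx, gy = orthorectify_pixel(0.0, 0.0, 0.1, (5000.0, 2000.0, 1000.0), R, 0.0)
```

Keeping this ray-casting in the raw sensor geometry and resampling only once, at the end, is exactly the requirement the text states for all Level2 algorithms.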
Algorithms Level1-Level2: DEM/DSM generation
VITO currently uses Correlator3D (www.simactive.com) for DEM/DSM extraction (very easy to operate and ultra-fast thanks to GPU-aided processing): a C++ implementation of the paper C. Strecha and L. Van Gool, 2008, "A generative model for true orthorectification", ISPRS 2008 (Beijing).
Algorithms Level1-Level2: On-the-fly configuration of MODTRAN4
(AFRL: US Air Force Research Laboratory)
Atmospheric correction = determining the at-target radiance (LT) by correcting the measured at-sensor spectral radiance for the "path radiance" LP (haze) and the "background radiance" (LB). LP and LT are influenced by: the earth-sun distance, the solar incident angle (i), the solar azimuth, the sensor zenith angle (e), the sensor viewing azimuth, the atmospheric composition (water vapor, O3, CO2, CO, …) and cloud cover.
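A simplified flat-terrain form of this inversion (Lambertian surface, no adjacency effect) can be written down explicitly; the symbols follow the text, the numbers are ours, and in practice MODTRAN4 supplies the atmospheric terms for the actual geometry.

```python
import math

def surface_reflectance(L_sensor, L_path, E_ground, tau_view):
    """
    Invert  L_sensor = L_path + tau_view * rho * E_ground / pi
    for the at-surface reflectance rho. E_ground is the total solar
    irradiance reaching the ground (direct + diffuse), tau_view the
    ground-to-sensor transmittance.
    """
    return math.pi * (L_sensor - L_path) / (tau_view * E_ground)

# Example with plausible magnitudes (W m-2 sr-1 um-1 and W m-2 um-1):
rho = surface_reflectance(L_sensor=80.0, L_path=20.0,
                          E_ground=1200.0, tau_view=0.85)
```

The image-based estimates discussed below (water vapor, visibility) exist precisely to pin down L_path, tau_view and E_ground when no in-situ measurements are available.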
MODTRAN4 has 176 configurable parameters; as such, pre-calculated look-up tables (e.g. the ATCOR 6D LUT) are non-generic. Therefore: (a) an XML configuration file can be uploaded to fully customize the MODTRAN4 processing; (b) image-based parameter estimation is used (water vapor, visibility, illumination geometry).
MODTRAN 5 will be implemented in the course of 2011.
Algorithms Level1-Level2: Overview Atmospheric Correction
MODTRAN4: from at-sensor radiance to at-surface reflectance.

Estimation of the atmospheric composition:
• Image-based visibility (aerosol optical depth) estimation (only if no spectral GCPs are available)
• Image-based water vapor estimation

Elimination of "contaminations":
• Image-based haze correction (supervised method, not suitable for chain processing)
• Image-based shadow correction (supervised method, not suitable for chain processing)

Correction for BRDF effects:
• Atmospheric BRDF (via MODTRAN4 viewing-geometry-dependent simulations)
• Topographic BRDF (supervised method, not suitable for chain processing)
• Target BRDF (supervised method, not suitable for chain processing)

Color balancing of multiple images; multi-resolution spline blending; radiometric triangulation (20XX: under development).

Legend (method types): empirical based on between-band relations; empirical single-band based; mixture empirical/physical; stochastic.

Atmospheric correction is not an "exact" science.
Algorithms Level1-Level2: Water Vapor, Visibility
Water vapor estimation:
Rodger, A. and M.J. Lynch, 2001. Determining atmospheric column water vapour in the 0.4-2.5
µm spectral region. Proceedings of the AVIRIS Workshop 2001, Pasadena, California
Visibility (aerosol optical depth) estimation:
Richter, R., D. Schläpfer and A. Müller. 2006. An automatic atmospheric correction algorithm
for visible/NIR imagery. International Journal of Remote Sensing, 27(10), pp. 2077-2085.
Algorithms Level1-Level2: Haze removal
Haze removal:
Richter, R., 2005. Atmospheric/Topographic Correction for Airborne Imagery. ATCOR-4 User Guide Version
4.0. DLR, Wessling, Germany, 104p.
[Figure: haze removal example, UltracamD imagery]
Algorithms Level1-Level2: Shadow removal
Shadow removal:
Richter, R., 2005. Atmospheric/Topographic Correction for Airborne Imagery. ATCOR-4 User Guide Version
4.0. DLR, Wessling, Germany, 104p.
[Figure: shadow removal example, HyMap imagery]
Algorithms Level1-Level2: Atmospheric BRDF
Atmospheric BRDF: viewing the same object from various angles may lead to significant variations of the path-scattered radiance component, an effect known as atmospheric BRDF. This can be taken into account by detailed MODTRAN4 simulations during the atmospheric correction.
Difference images: MODTRAN nadir-only simulation minus MODTRAN view- and illumination-specific simulations for a blue, green and red band. The lower image is the geometrically corrected digital number. In that image, the flight direction is from left to right (east). The solar zenith was 29.4° and the solar azimuth was 161.5°. Note that in some areas the AHS160 sensor completely saturated.
MODTRAN4 is rather effective at removing "blue haze".
Algorithms Level1-Level2: Topographic BRDF
• The VITO orthorectification module determines the complete viewing and illumination geometry.
• Richter, R., 2005. Atmospheric/Topographic Correction for Airborne Imagery. ATCOR-4 User Guide Version
4.0. DLR, Wessling, Germany, 104p.
Algorithms Level1-Level2: Target BRDF (Kernel BRDF models)
Target BRDF correction according to: Jupp, D. L. B., 2000. A compendium of kernel & other (semi-)empirical BRDF models. Office of Space Science Applications - Earth Observation Centre; available only as an online document (May 2002): www.cossa.csiro.au/tasks/brdf/k_summ.pdf
[Figure: ideal surface, no BRDF effects]
AMBRALS (the Algorithm for Modeling [MODIS] Bidirectional Reflectance Anisotropies of the Land Surface) allows a variety of kernel-driven semi-empirical BRDF models to be explored. Kernel BRDF methods are empirical models based on linear combinations of "kernels":

ρ = f_iso + f_geo·k_geo + f_vol·k_vol

which represent surface reflectance (ρ) as a function of component reflectances (f_x) and kernels (k_x), mathematical functions that depend on the sun (or incident) and view (or observer) angles. The subscripts "geo" and "vol" refer to the physical bases of some kernels, in which a "geometric" or hotspot factor and a "volume" or path-length-and-scattering factor are identified.
[Figure: grass vegetation, BRDF effects @ 600 nm; UltracamD raw vs. UltracamD kernel-BRDF corrected]
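The linear kernel combination above can be sketched directly, here with the Ross-thick volume-scattering kernel as an example. The kernel coefficients f_x below are made up for illustration; in practice they are fitted per band from multi-angular observations.

```python
import math

def k_vol_ross_thick(sun_zen, view_zen, rel_az):
    """Ross-thick volume-scattering kernel (angles in radians)."""
    cos_xi = (math.cos(sun_zen) * math.cos(view_zen)
              + math.sin(sun_zen) * math.sin(view_zen) * math.cos(rel_az))
    xi = math.acos(max(-1.0, min(1.0, cos_xi)))   # scattering phase angle
    return (((math.pi / 2 - xi) * math.cos(xi) + math.sin(xi))
            / (math.cos(sun_zen) + math.cos(view_zen))) - math.pi / 4

def brdf(f_iso, f_geo, f_vol, k_geo, k_vol):
    """Linear kernel combination: rho = f_iso + f_geo*k_geo + f_vol*k_vol."""
    return f_iso + f_geo * k_geo + f_vol * k_vol

# Example: nadir view, sun at 30 deg; k_geo set to 0 here for brevity.
kv = k_vol_ross_thick(math.radians(30.0), 0.0, 0.0)
rho = brdf(f_iso=0.20, f_geo=0.05, f_vol=0.10, k_geo=0.0, k_vol=kv)
```

Because the model is linear in the coefficients, fitting f_iso, f_geo and f_vol to observed reflectances is an ordinary least-squares problem, which is what makes kernel methods practical per pixel and per band.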
Configurable tiling option
Algorithms Level2-Level3: Mosaic generation & automated optimal seamline estimation
C++ application:
1. Cost grid based on a combination of similarity within an image and between images.
2. Iterative cost-grid masking.
3. Ford-Fulkerson graph-cut method on the masked cost grid.
Iterative cost-grid masking searches for the best possible solution, i.e. masking just up to the point before connectivity gets lost.
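Step 1 above can be illustrated with a toy cost grid over the overlap of two images, where low cost means the images agree and a seam can pass unnoticed. For brevity the seam is traced here with simple dynamic programming instead of the Ford-Fulkerson graph cut used in the real C++ application.

```python
def cost_grid(img_a, img_b):
    """Per-pixel between-image dissimilarity over the overlap region."""
    return [[abs(a - b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]

def best_seam(cost):
    """Minimum-cost vertical seam (one column index per row)."""
    rows, cols = len(cost), len(cost[0])
    acc = [list(cost[0])]
    for r in range(1, rows):
        acc.append([cost[r][c] + min(acc[r - 1][max(0, c - 1):c + 2])
                    for c in range(cols)])
    # Backtrack from the cheapest end point.
    seam = [min(range(cols), key=lambda c: acc[-1][c])]
    for r in range(rows - 2, -1, -1):
        c = seam[-1]
        lo = max(0, c - 1)
        seam.append(lo + min(range(len(acc[r][lo:c + 2])),
                             key=lambda i: acc[r][lo + i]))
    return list(reversed(seam))

# Two 4x5 "images" that agree everywhere except a bright blob on the right:
img_a = [[10, 10, 10, 10, 10]] * 4
img_b = [[10, 10, 10, 90, 90],
         [10, 10, 10, 90, 90],
         [10, 10, 10, 90, 90],
         [10, 10, 10, 90, 90]]
seam = best_seam(cost_grid(img_a, img_b))
```

The seam stays in the left columns where the two images agree, avoiding the region where a visible discontinuity would appear in the mosaic.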
Algorithms Level2/3-Level4
Level1/2/3 – Level4: C++ implementation of the following algorithms:
• Unsupervised K-means classification
• Spectral angle mapper
• Maximum likelihood classification
• Linear discriminant analysis (LDA)
• Quadratic discriminant analysis (QDA)
• …
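As a sketch of one of these, the spectral angle mapper assigns each pixel to the reference spectrum with the smallest spectral angle, which makes it insensitive to overall brightness (illumination scaling). The reference spectra below are made-up four-band examples.

```python
import math

def spectral_angle(a, b):
    """Angle (radians) between two spectra treated as vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def sam_classify(pixel, references):
    """Label of the reference spectrum with the smallest angle to the pixel."""
    return min(references,
               key=lambda name: spectral_angle(pixel, references[name]))

# Illustrative two-class example (reflectance in 4 hypothetical bands):
refs = {"vegetation": [0.05, 0.08, 0.06, 0.50],
        "bare_soil":  [0.10, 0.15, 0.20, 0.25]}
# A shaded vegetation pixel: same spectral shape as "vegetation", half as bright.
label = sam_classify([0.025, 0.04, 0.03, 0.25], refs)
```

Because the angle between a spectrum and any scaled copy of itself is zero, the shaded pixel is still labeled correctly.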
Processing Workflows: functional flow “classical” airborne missions
Example: civil structure change detection as an operational service for AGIV (www.agiv.be).
Processing steps:
1. For all pixels visible in multiple images: determine the pixel elevation from parallax information.
2. Geometric correction of every image.
3. Radiometric correction of every image.
4. Automatic classification with "field truth" extracted from the AGIV vector database of civil infrastructure (i.e. the GRB).
5. Post-classification logic to construct a vector file with new buildings, renovations and building demolitions.
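Step 5 can be sketched as comparing the building classification against the existing GRB building layer to flag new and demolished structures. This toy works on rasters (the real workflow emits a vector change layer); all data here is made up.

```python
def change_map(grb_mask, detected_mask):
    """Per-pixel change labels from the old (GRB) and new (detected) masks."""
    labels = []
    for old_row, new_row in zip(grb_mask, detected_mask):
        row = []
        for old, new in zip(old_row, new_row):
            if new and not old:
                row.append("new")          # built since the GRB was mapped
            elif old and not new:
                row.append("demolished")   # in the GRB, no longer detected
            else:
                row.append("unchanged")
        labels.append(row)
    return labels

grb      = [[1, 1, 0],
            [0, 0, 0]]
detected = [[1, 0, 0],
            [0, 1, 1]]
changes = change_map(grb, detected)
```

In production this per-pixel logic would be followed by grouping the flagged pixels into polygons and filtering out spurious detections before the change-vector layer is written.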
Algorithms Level2/3-Level4
Example: GRB mutation and anomaly detection.
GRB (Grootschalig Referentiebestand) change detection: blue polygons are already-mapped buildings; green polygons are the result of an automatic building detection process (combined K-means, quadratic discriminant analysis and post-classification logic).