Transcription
Pictures by NASA, ESA, and The Hubble Heritage Team (STScI/AURA)
Picture by NASA

Sensors for On-Orbit Docking
Stephen Granade, [email protected]

What Do You Need to Measure?

Make a Relative Measurement of Six DoF
- Orientation (roll, pitch, yaw)
- Position (x, y, z)
(Diagram: x, y, z axes with roll, pitch, and yaw.)

How Far Away is Hubble?
Picture by NASA

Ways of Measuring Position and Orientation: A Tale of Two Approaches

Direct vs. Indirect Measurement
Picture by Arild Storaas

Types of Direct and Indirect Measurement
- Direct: measure distance directly using light or radar pulses
- Indirect:
  - Measure x and y position by where the spacecraft is in an image
  - Measure distance (z) by looking at how big the spacecraft appears in the image
  - Measure angles (roll, pitch, yaw) by looking at the spacecraft's shape or the relative location of points on the spacecraft

Direct: Radar or Laser Rangefinding
(Diagram: a pulse travels roughly 0.3 m per nanosecond.)

Indirect: Use a Camera to Estimate 6DoF
- Position: x, y
- Size: z (distance)
Original pictures by NASA

Shapes or Points: Orientation

Look at How Shapes Are Distorted to Get Angles

Apparent Distortion is Related to Angles

Measure Relative Location of Known Features
(Diagram: feature points numbered 1-5 matched between the spacecraft and the image.)

A Camera Makes a 2D Projection of a 3D Scene
(Diagram labels: field of view, lens, focal point, imager (focal plane array). Warning: not to scale.)

Pixels Correspond to Angular Location
(Diagram labels: lens, imager (focal plane array).)

Perspective Lets You Calculate Angles
(Diagram labels: spacecraft, camera.)
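The projection geometry on the last few slides reduces to a simple pinhole-camera model. The sketch below is purely illustrative: the focal length, pixel coordinates, and target size are made-up numbers rather than parameters of any sensor discussed here, but it shows both indirect measurements, bearing angles from where the target sits in the image and range from how big it appears.

```python
import math

def pixel_to_bearing(u, v, cx, cy, f_px):
    """Convert a pixel location to bearing angles (radians), assuming an
    ideal pinhole camera with focal length f_px in pixels and principal
    point (cx, cy)."""
    az = math.atan2(u - cx, f_px)  # left/right angle off boresight
    el = math.atan2(v - cy, f_px)  # up/down angle off boresight (sign depends on image convention)
    return az, el

def range_from_apparent_size(true_size_m, apparent_size_px, f_px):
    """Estimate distance from how large a feature of known physical size
    appears in the image: size_px ~= f_px * size_m / range."""
    return f_px * true_size_m / apparent_size_px

# Made-up example: a 1024x1024 imager, 1200-pixel focal length,
# and a docking target 0.6 m across that spans 48 pixels in the image.
az, el = pixel_to_bearing(u=612.0, v=488.0, cx=512.0, cy=512.0, f_px=1200.0)
rng = range_from_apparent_size(true_size_m=0.6, apparent_size_px=48.0, f_px=1200.0)
print(f"bearing: {math.degrees(az):.2f} deg, {math.degrees(el):.2f} deg; range: {rng:.1f} m")
```

Real sensors also have to deal with lens distortion and with finding the target in the image in the first place, but these are the relationships the indirect approach relies on.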
Shuttle Dockings: Direct & Indirect Measurements
Pictures by NASA

A Closer Look at the ISS Target
Picture by Advanced Optical Systems

Locating Features on Spacecraft: A Tale of Two Types of Spacecraft

Cooperative vs. Uncooperative Spacecraft
Pictures by NASA

Cooperative Targets Are (Relatively) Easy
Pictures by DARPA (left) and AOS (right)

Hubble's Soft Capture Mechanism Included Targets
Pictures by NASA and AOS (inset picture)

Uncooperative Targets Require Image Recognition
- AOS Space Vision Tracking System
- Boeing Vis-STAR

What Makes For Good Features?
- Individual features' sizes
- Distribution of features: widely spaced side to side, spaced front to back
- Contrast
- Spatially distinctive: corners, edges, color changes

Choose Your Sensors to Match Your Targets
- Sensors can make direct or indirect measurements
  - Get distance directly using rangefinding with light or radar
  - Get distance indirectly by looking at the size of objects in an image or the distance between points on a spacecraft
  - Get orientation indirectly by looking at spacecraft shape or the relative configuration of features (points) on the spacecraft
- Targets can be cooperative or uncooperative
  - Cooperative spacecraft have distinctive features that are easy to find
  - Uncooperative ones require image processing techniques to recognize features

Docking Sensors That Have Flown

IGLA and KURS
- Russian system
- Multiple radar antennas
- Multi-stage process
  - Acquisition: target spacecraft broadcasts a beacon signal
  - Tracking: target spacecraft re-broadcasts what the Progress sends
- Works over tens of km
- Most-flown relative navigation system
Pictures by NASA

Advanced Video Guidance Sensor (AVGS)
- NASA/AOS/Orbital Sciences system
- Laser-based
- Camera images spots of light reflected from targets
- Processes spot locations to determine the target's position and orientation
- Works to 1 km / 300 m
- Flown on the DART and Orbital Express missions
Pictures by Orbital Sciences/AOS (top), AOS (inset), and NASA (bottom)

ARCSS
- Boeing system
- Visible-light and IR cameras
- Images (including silhouette and edge-enhanced) correlated against library images
- Uses features at short ranges
- Works to 200 km / 60 m
- Flown on Orbital Express
Pictures by Boeing

Relative Navigation Sensor System
- NASA and AOS system
- Three visible-light cameras
- Matched features in an edge-enhanced image against a model (NASA GNFIR) or correlated features against a library (AOS ULTOR)
- Worked to 150 m
- Flown on Hubble Servicing Mission 4
Pictures by NASA (top and bottom left) and AOS (bottom right)

TriDAR
- Neptec system
- Laser rangefinding, laser triangulation, and an IR camera
- 3D point cloud compared to models
- Uses the IR camera for long ranges
- Works to ~40 km / 400 m
- Flown on three Shuttle missions and Cygnus
Pictures by NASA (top) and Neptec (bottom)

Vision Navigation Sensor (VNS)
- NASA/Ball/Lockheed Martin system
- Flash LIDAR and visible-light camera
- Range measurements to reflective targets
- Visible-light camera for situational awareness
- Works to 200 km / 60 m
- Flown on a Shuttle mission
Pictures by Ball Aerospace
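Several of the indirect systems above, such as AVGS turning reflected-spot locations into pose and the RNS feature matchers, come down to the same core computation: recover position and orientation from the image locations of known points on the target, the classic Perspective-n-Point (PnP) problem. The sketch below is a generic illustration of that computation using OpenCV's solvePnP, not a reconstruction of any flight system's algorithm; the target geometry, camera parameters, and pixel measurements are all invented.

```python
import numpy as np
import cv2

# Hypothetical flat docking-target plate: four retroreflector positions in the
# target's body frame, in meters (invented for illustration).
object_points = np.array([
    [ 0.30,  0.30, 0.0],
    [-0.30,  0.30, 0.0],
    [-0.30, -0.30, 0.0],
    [ 0.30, -0.30, 0.0],
], dtype=np.float64)

# Where those points were detected in the image, in pixels (invented values).
image_points = np.array([
    [660.0, 340.0],
    [380.0, 345.0],
    [385.0, 640.0],
    [655.0, 630.0],
], dtype=np.float64)

# Idealized pinhole camera: 1200-pixel focal length, principal point at
# (512, 512), no lens distortion.
camera_matrix = np.array([
    [1200.0,    0.0, 512.0],
    [   0.0, 1200.0, 512.0],
    [   0.0,    0.0,   1.0],
], dtype=np.float64)
dist_coeffs = np.zeros(5)

# Solve for the target's pose relative to the camera.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
if ok:
    rotation_matrix, _ = cv2.Rodrigues(rvec)  # 3x3 orientation of the target
    print("range to target: %.2f m" % np.linalg.norm(tvec))
    print("rotation matrix:\n", rotation_matrix)
```

With more points, or a full 3D model in place of a flat target plate, the same idea underlies the natural-feature tracking needed for uncooperative spacecraft.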
Why So Many? Because There's No "Best" Solution
- Cooperative gives you the most control, but requires modifying the spacecraft you want to dock with
- Uncooperative is the most flexible, but much harder to get right
- There is no one image-processing solution that works in all situations
- Because space is a low-volume business, we're not seeing big jumps in technology powered by commercial demand

Note: Autonomous On-Orbit Docking is Hard
- IGLA: three failures on orbit, one due to a trash bag
- KURS: multiple failures, including one that led to a Progress hitting Mir and another that left a Progress adrift near the ISS
- DART: failed to turn on AVGS, and hit MUBLCOM
- Orbital Express: had to adjust ARCSS object-recognition settings on orbit
- Orbital Express and Japan's ETS-VII: target satellite got lost and had to be recovered
- Hubble RNS: GNFIR tracked only at first, ULTOR not at all while on orbit

Where Next?
- ISS resupply craft are using docking sensors
  - Progress-M: Kurs
  - ESA ATV and JAXA HTV: scanning LIDAR looks for reflections from targets
  - Orbital Sciences Cygnus: TriDAR
  - SpaceX Dragon: developing the DragonEye flash LIDAR
- Orion's Vision Navigation Sensor is still waiting in the wings
- GSFC RAVEN
- DARPA Phoenix program for satellite servicing
- A few commercial efforts

Sensors for On-Orbit Docking
Stephen Granade, [email protected]
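A closing footnote on the direct-measurement side: radar and laser rangefinding rest on the time-of-flight arithmetic from the rangefinding slide. A pulse covers roughly 0.3 m per nanosecond and has to travel out and back, so the one-way range is half the round-trip distance. A minimal sketch (the pulse timing is an invented example):

```python
SPEED_OF_LIGHT_M_PER_NS = 0.299792458  # roughly 0.3 m per nanosecond

def range_from_round_trip(round_trip_ns):
    """One-way range from a pulse's round-trip travel time."""
    return SPEED_OF_LIGHT_M_PER_NS * round_trip_ns / 2.0

# Invented example: a return 1,000 ns after the pulse left the sensor
# puts the target about 150 m away.
print(f"{range_from_round_trip(1000.0):.1f} m")
```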