
Spacecraft Optical Navigation (eBook)

William M. Owen (Author)

eBook Download: EPUB
2024
177 pages
Wiley (Publisher)
978-1-119-90445-8 (ISBN)

€96.99 incl. VAT
(CHF 94.75)
eBooks are sold by Lehmanns Media GmbH (Berlin) at the price in euros including VAT.
  • Download available immediately

A unique resource exploring how spacecraft imagery provides professionals with accurate estimates of spacecraft trajectory, with real-world examples and detailed illustrations

Spacecraft Optical Navigation provides detailed information on the planning and analysis of spacecraft imagery to help determine the trajectory of a spacecraft. The author, an experienced engineer in the field, covers the full range of celestial targets and explains how a spacecraft captures their imagery.

Aimed at professionals in spacecraft navigation, this book provides an extensive introduction, explains the history of optical navigation, reviews a range of optical methods, and presents real-world examples throughout. Using mathematics, it discusses everything from the orbits, sizes, and shapes of the bodies being imaged to the location and properties of salient features on their surfaces.

Specific sample topics covered in Spacecraft Optical Navigation include:

  • History of past missions, including Mariner, Viking, Voyager, Galileo, NEAR Shoemaker, and Cassini, as well as flight hardware, star catalogs, and stereophotoclinometry
  • Cameras, covering the gnomonic projection (and deviations from it), creation of a digital picture, picture flattening, and readout smears
  • Modeling optical navigation observables, covering apparent directions to an object, star, and limbs or terminators, and orientation of cameras
  • Obtaining optical navigation observables, covering centerfinding for stars and resolved and unresolved bodies, and using opnav data in orbit determination

Spacecraft Optical Navigation is an ideal resource for engineers working in spacecraft navigation and optical navigation who want to update their knowledge of the technology and use it in their day-to-day work. The text will also benefit researchers working with spacecraft, particularly in navigation, as well as professors and lecturers teaching graduate aerospace courses.



2 History


Much of the material in this chapter was originally presented in Owen et al. (2008). It has been updated for this book.

Optical navigation got its start at Jet Propulsion Laboratory (JPL) as an experiment on the Mariner 6 and 7 missions to Mars in 1969 (Duxbury and Breckenridge 1970, Duxbury 1970) and again on Mariner 9 in 1971 (Breckenridge and Acton 1972). The justification for opnav was to ensure quality navigation results at the outer planets: as radio tracking data are geocentric, a radio-only orbit determination solution will tend to become less accurate with increasing distance from the earth.

Opnav was used operationally for both Viking orbiters at Mars, but it came into its own with the Voyager missions to the outer planets. Pictures of the Galilean satellites of Jupiter, of Titan and the smaller icy satellites of Saturn, of the five classical Uranian satellites, and of Triton, Nereid, and Proteus (S/1989 N 1) at Neptune helped immensely to shrink the size of the B-plane error ellipse.¹ Optical navigation engineers were also responsible for several of Voyager’s discoveries. The serendipitous discovery of volcanic plumes on Io is best known, but optical navigators also found new satellites at all four outer planets and determined their orbits.

Both hardware and software advances have occurred since the 1970s. Vidicon television cameras have given way to charge-coupled devices (CCDs) and complementary metal-oxide-semiconductor (CMOS) detectors. Ground data processing has moved from mainframes to minicomputers and now to workstations. We once used special frame buffers and monitors for display; now, our workstations bring up pictures inside an X window. The software has progressed from a mixture of Fortran 66 and assembly language to Fortran 77 and C, and much of it has been rewritten in C++ and Python.

Perhaps the most promising advance in optical navigation technology is the migration from ground processing to onboard processing. JPL’s onboard autonomous navigation system was demonstrated on Deep Space 1 and used on Stardust. Autonav’s greatest success to date was on Deep Impact, where it ran on both the Impactor and Flyby spacecraft and guided the Impactor onto a collision course with the nucleus of comet 9P/Tempel 1 while the Flyby spacecraft was taking pictures of the event.

Opnav figures to play a prominent role in many future JPL missions: to small bodies, to Uranus and Neptune, and even to support precision landings on the moon or Mars. Whether the processing is done on the ground or onboard, whether the imager is like the dedicated Optical Navigation Camera flown on Mars Reconnaissance Orbiter or a science instrument, optical navigation data will continue to enable the kind of precision navigation which is required for mission success.

Optical navigation as practiced at JPL usually requires knowledge of several disciplines and interactions with both the flight team and the science team at various phases of a mission. Opnav analysts must know enough about the optical and physical characteristics of the onboard cameras to be able to simulate images and command pictures correctly. We must participate in orbit determination (OD) studies to find out how much optical data is necessary to meet navigation accuracy requirements; then we must negotiate for observing time, plan the pictures to be taken, develop the sequence products, and verify their correctness through the uplink process. After the pictures have been obtained, we must process them, extracting the observed locations of the images, and pass the results along to the rest of the navigation team. The optical navigation group must therefore have people who know about optics, some facets of astronomy, surface modeling, spacecraft commanding, image processing, and spacecraft navigation.

2.1 The Early Years: Mariner and Viking


The first so-called “optical navigation experiment” was carried out in 1969 on Mariners 6 and 7 to Mars. Tom Duxbury and the late Bill Breckenridge used the “far-encounter planet sensor” to take pictures (50 for Mariner 6, 93 for Mariner 7) of Mars in the last two or three days before each spacecraft’s flyby (Duxbury and Breckenridge 1970). Figure 2.1 shows one such image. They measured the location of Mars in each picture by putting a clear plastic overlay on top of a hard copy of the picture, matching a circle on the overlay to Mars, and reading off the coordinates. (The paper makes a point of saying that each picture was measured by two “observers” and the results averaged.) The Mars image centers thus obtained were transferred to punched cards.

Figure 2.1 Mariner 1969 picture of Mars.

Source: NASA.

There were no stars in the pictures; the spacecraft attitude came from real-time telemetry. Careful calibrations before launch and during cruise had given the orientation of the camera relative to the spacecraft. The calibrations also revealed distortions in the camera, which were used to correct the observed image locations.

The image location and camera attitude were then transformed into the observed inertial direction to Mars. The right ascension and declination were the optical observables (not the image coordinates as is now the case). These angles were fed into a navigation filter which read in the current radio OD solution (from magnetic tape) and updated it.
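
As a rough sketch of that transformation (illustrative only, not the Mariner flight software), the code below converts an observed image location and a camera attitude into right ascension and declination. The pinhole (gnomonic) camera model, the pixel-valued focal length, and the rotation matrix C_inertial_from_camera built from the telemetered attitude are assumptions introduced for this example.

import numpy as np

def radec_from_image(sample, line, boresight, focal_length_px, C_inertial_from_camera):
    # Unit vector toward the target in the camera frame, using a simple
    # pinhole (gnomonic) model; camera distortion is assumed already removed.
    v_cam = np.array([sample - boresight[0], line - boresight[1], focal_length_px], float)
    v_cam /= np.linalg.norm(v_cam)

    # Rotate into the inertial frame using the telemetered camera attitude.
    v_inertial = C_inertial_from_camera @ v_cam

    # Right ascension and declination of the apparent direction to the body.
    ra = np.arctan2(v_inertial[1], v_inertial[0]) % (2.0 * np.pi)
    dec = np.arcsin(np.clip(v_inertial[2], -1.0, 1.0))
    return np.degrees(ra), np.degrees(dec)

In practice the image coordinates would first be corrected for the calibrated camera distortions mentioned above.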

Real-time opnav operations were successful. The pictures were received and processed within the allotted time. The resulting solutions were generally within one sigma of the radio-only solutions, and the major axis of the B-plane error ellipse shrank by half.

After encounter, Duxbury devised a new technique (Duxbury 1970) for determining the center of Mars. Each line of the picture was scanned to find the lit limb, defined as the first of three successive pixels brighter than a threshold. As the limb of an ellipsoidal body projects into an ellipse in the focal plane, they fit an ellipse to the limb points, subject to a priori constraints on its shape from the known shape of Mars and the viewing geometry. The center of the ellipse was identified with the center of the planet. There were trends in the limb residuals indicating systematic effects, particularly near the poles, but the resulting data were good to about 1 pixel east–west and 0.3 pixel north–south. The worse performance in the horizontal direction was attributed to the fact that only the lit limb was used.
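
A minimal sketch of that limb-scan-and-ellipse-fit idea, under stated assumptions rather than Duxbury's actual procedure: the three-successive-pixels criterion follows the description above, but the threshold value, the scan direction, and the use of an unconstrained conic fit are simplifications; the original fit constrained the ellipse shape using the known shape of Mars and the viewing geometry.

import numpy as np

def find_lit_limb(image, threshold):
    # Scan each picture line for the lit limb: the first pixel that starts a
    # run of three successive pixels brighter than the threshold.
    limb_points = []
    for line_idx, row in enumerate(image):
        bright = row > threshold
        for s in range(len(row) - 2):
            if bright[s] and bright[s + 1] and bright[s + 2]:
                limb_points.append((s, line_idx))   # (sample, line)
                break
    return np.array(limb_points, float)

def ellipse_center(points):
    # Unconstrained least-squares conic fit A x^2 + B xy + C y^2 + D x + E y = 1,
    # then solve for the point where the conic's gradient vanishes, which is
    # identified with the center of the planet.
    x, y = points[:, 0], points[:, 1]
    M = np.column_stack([x * x, x * y, y * y, x, y])
    A, B, C, D, E = np.linalg.lstsq(M, np.ones_like(x), rcond=None)[0]
    x0, y0 = np.linalg.solve([[2 * A, B], [B, 2 * C]], [-D, -E])
    return x0, y0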

2.1.1 Mariner 9


Opnav was demonstrated again in the next mission, Mariner 9 in 1971. Tom Duxbury and Chuck Acton oversaw the creation of two sets of programs: the Optical Navigation Image Processing System (ONIPS), for picture display and image centerfinding, and the Optical Navigation Program set (ONP), for scene prediction, calculation of residuals and partial derivatives with respect to parameters of interest, and filtering. They planned 18 pictures of Deimos and 3 of Phobos, to be taken between 66 and 9 h before Mars orbit insertion. Deimos, farther from Mars, was the preferred target, and two of the Phobos pictures were timed to capture the satellite in transit across Mars. Several sets of calibration pictures, obtained during cruise, served to characterize the performance of the spacecraft attitude system, various camera misalignment angles, electromagnetic distortion in the camera’s vidicon scan pattern, and the overall sensitivity of the camera.
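
As a loose illustration of the residual-and-partials step handled by such a program set (a placeholder sketch, not the ONP algorithms), a single weighted least-squares correction could look like the following; the parameter vector, the measurement model predict, and the weight matrix W are hypothetical stand-ins.

import numpy as np

def opnav_correction(params, observed, predict, W):
    # predict(params) returns the predicted observables and the matrix of
    # partial derivatives H with respect to the parameters of interest.
    predicted, H = predict(params)
    residuals = observed - predicted            # observed minus computed
    # Weighted normal equations: (H^T W H) dx = H^T W residuals
    dx = np.linalg.solve(H.T @ W @ H, H.T @ W @ residuals)
    return params + dx, residuals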

Each of the 21 opnav pictures was analyzed within an hour of its receipt. Stars as faint as magnitude 8.9 were detected in 6-s exposures. The rms star residual, in pictures containing more than two stars, was 0.4 pixel. Results from the first 14 pictures were passed to the navigation team, which thereby obtained a more accurate OD at Mars than any previous mission. Improvements to the satellite ephemerides also enabled close-up imagery of Phobos and Deimos during the orbital phase of the mission.

The real benefit of Mariner 9, though, was the development of the opnav camera models, processing techniques, and team procedures. Much of what we do today traces its roots to 1971. ONIPS and ONP are still used, and even some of the program names are unchanged though the code has been rewritten several times.

Duxbury and Acton received NASA’s Exceptional Scientific Achievement Medal for this work. They were also accorded the Institute of Navigation’s Samuel M. Burka award for their paper (Duxbury and Acton 1972), along with the princely sum of $175 each – but they had to pay their own way to the ceremony (Figure 2.2).

2.1.2 Viking


The two Viking missions (1976) used optical navigation in operations not only on approach but also in orbit (Jerath 1978). The approach OD and the insertion maneuver were so accurate that the Viking 1 orbiter overflew the proposed landing site on its first orbit, rendering unnecessary two weeks of contingency operations. Dozens of opnav pictures taken by the orbiters enabled close encounters with both Martian satellites; a 20 km flyby of Deimos was in error by less than 2 km. Opnav was also useful in the orbiters’ extended mission: as the amount of radio tracking data decreased, radio-only solutions became less sensitive to the node of the spacecraft’s orbit on the plane of the...

Publication date (per publisher) 11 October 2024
Series JPL Deep-Space Communications and Navigation Series
Language English
Subject area Technology / Mechanical engineering
Keywords cameras • Image Processing • interplanetary optical navigation • Optics • Planetary orbits • spacecraft attitude • spacecraft math • spacecraft navigation • spacecraft technology • star catalogs • Surface modeling • Telescopes • terrain-relative navigation
ISBN-10 1-119-90445-5 / 1119904455
ISBN-13 978-1-119-90445-8 / 9781119904458
EPUB (Adobe DRM)
Size: 9.9 MB

Copy protection: Adobe DRM
Adobe DRM is a copy-protection scheme intended to protect the eBook from misuse. When the eBook is downloaded, it is authorized to your personal Adobe ID; you can then read it only on devices that are also registered to that Adobe ID.

File format: EPUB (Electronic Publication)
EPUB is an open standard for eBooks and is particularly well suited to fiction and non-fiction. The text reflows dynamically to the display and font size, which also makes EPUB a good fit for mobile reading devices.

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You need an Adobe ID and the free Adobe Digital Editions software. We advise against using the OverDrive Media Console, as it frequently causes problems with Adobe DRM.
eReader: This eBook can be read on (almost) all eBook readers, but it is not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You need an Adobe ID and a free app.

Buying eBooks from abroad
For tax law reasons we can sell eBooks only within Germany and Switzerland. Regrettably, we cannot fulfill eBook orders from other countries.
