Vision-Based Self-Motion Estimation in a Fixed-Wing
Aerial Vehicle
Matthew R. Parks
Thesis submitted to the faculty of
Virginia Polytechnic Institute and State University
in partial fulfillment of the requirements for the degree of
Master of Science
in
Computer Engineering
Dr. Lynn Abbott, Chair
Dr. Chris Wyatt
Dr. Pushkin Kachroo
January 20, 2006
Blacksburg, VA
Keywords: Computer Vision, Optic Flow, Feature Tracking, Matching, Motion
Estimation
Vision-Based Self-Motion Estimation in a Fixed-Wing
Aerial Vehicle
Matthew R. Parks
ABSTRACT
This thesis describes a complete algorithm for estimating the motion of a fixed-wing
aircraft from a sequence of digitized flight images. The algorithm was designed for
fixed-wing aircraft because carefully procured flight images, with corresponding
navigation data, were available to us for testing. After image preprocessing, optic flow
data are obtained by automatically detecting and tracking good features between pairs of
images. The image coordinates of matched features are then processed by a rigid-object
linear optic flow motion estimation algorithm. Factors that influence the accuracy of the
estimates are identified, and error analysis is performed with simulated data, keeping
these factors in mind, to determine the effectiveness of the optic flow algorithm. The
output of this program is an estimate of the rotation and translation of the imaged
environment relative to the camera, and thereby relative to the airplane. Real flight
images from NASA test flights are used to evaluate the algorithm; where possible, the
estimated motion parameters are compared with recorded flight instrument data to
confirm their correctness. Results show that the algorithm is accurate to within one
degree provided that enough optic flow feature points are tracked.
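The feature-tracking stage summarized above is based on the Kanade-Lucas-Tomasi (KLT)
tracker (see Section 3.3 and Appendix A). As an informal sketch only, the following C
fragment shows how such a stage can be driven using Birchfield's publicly available KLT
library; the frame file names are placeholder assumptions, the request for 25 features
mirrors the example of Figure 7, and this is a minimal illustration rather than the exact
code used in this work.

    /* Minimal sketch: select and track features across one image pair,
     * assuming Birchfield's public KLT library (klt.h, pnmio.h).
     * The file names below are illustrative placeholders. */
    #include <stdio.h>
    #include "pnmio.h"
    #include "klt.h"

    int main(void)
    {
        int ncols, nrows, i;
        float x1[25], y1[25];
        unsigned char *img1, *img2;
        KLT_TrackingContext tc = KLTCreateTrackingContext();
        KLT_FeatureList fl = KLTCreateFeatureList(25); /* request 25 features */

        /* Two preprocessed grayscale frames, 8 frames apart as in Figure 6. */
        img1 = pgmReadFile("frame000.pgm", NULL, &ncols, &nrows);
        img2 = pgmReadFile("frame008.pgm", NULL, &ncols, &nrows);

        /* Select features with strong local gradients in the first frame. */
        KLTSelectGoodFeatures(tc, img1, ncols, nrows, fl);
        for (i = 0; i < fl->nFeatures; i++) {  /* save first-frame positions */
            x1[i] = fl->feature[i]->x;
            y1[i] = fl->feature[i]->y;
        }

        /* Track into the second frame; positions are updated in place. */
        KLTTrackFeatures(tc, img1, img2, ncols, nrows, fl);

        /* Print matched coordinate pairs for the motion estimation stage;
         * a negative val marks a feature lost during tracking. */
        for (i = 0; i < fl->nFeatures; i++)
            if (fl->feature[i]->val >= 0)
                printf("%f %f -> %f %f\n", x1[i], y1[i],
                       fl->feature[i]->x, fl->feature[i]->y);
        return 0;
    }

The matched coordinate pairs produced this way are exactly the input required by the
linear optic flow motion estimation stage described in Chapter 2.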
ACKNOWLEDGEMENTS
I would like to thank my advisor, Dr. A. Lynn Abbott, for helping me throughout
my research; Gary Fleming and the rest of the people at NASA Langley, who provided all
of the flight information and image sequences; and my parents, who supported me in my
decision to enter graduate study. Thanks also to Phichet Trisirisipal and Xiaojin Gong
for helping when I had computer vision questions, and to Nathan Herald for his help
creating an illustration.
TABLE OF CONTENTS
ABSTRACT .................................................................................................................................................. ii
ACKNOWLEDGEMENTS ........................................................................................................................ iii
TABLE OF CONTENTS ............................................................................................................................ iv
LIST OF FIGURES...................................................................................................................................... v
LIST OF TABLES....................................................................................................................................... vi
1 INTRODUCTION ..................................................................................................................................... 1
1.1 COMPUTER VISION ......................................................................................................................... 1
1.2 VISION-BASED CONTROL OF AIRCRAFT ................................................................................... 1
1.3 CONTRIBUTIONS OF THIS RESEARCH........................................................................................ 2
1.4 ORGANIZATION OF THESIS .......................................................................................................... 2
2 BACKGROUND AND LITERATURE REVIEW ................................................................................. 4
2.1 VISION-BASED NAVIGATION ....................................................................................................... 4
2.2 FEATURE TRACKING...................................................................................................................... 5
2.3 OPTIC FLOW ..................................................................................................................................... 5
2.3.1 Introduction to Optic Flow .......................................................................................................... 5
2.3.2 Derivation of Linear Optic Flow Motion Algorithm .................................................................... 5
2.3.3 Solving for Motion Parameters.................................................................................................... 8
2.3.4 Synopsis ..................................................................................................................................... 11
3 METHODOLOGY .................................................................................................................................. 12
3.1 IMAGE ACQUISITION.................................................................................................................... 12
3.2 PREPROCESSING OF IMAGE SEQUENCES................................................................................ 14
3.3 FEATURE TRACKING.................................................................................................................... 15
3.4 IMPLEMENTATION OF LINEAR OPTIC FLOW MOTION ALGORITHM................................ 19
3.5 TESTING OPTIC FLOW ALGORITHM WITH SIMULATED DATA.......................................... 20
3.6 GEOMETRY OF AIRCRAFT AND CAMERA MOTION .............................................................. 30
4 RESULTS................................................................................................................................................. 33
4.1 COMPARISON WITH CAIS DATA, USING 8 FEATURE POINTS............................................. 33
4.2 STATISTICAL ERROR REDUCTION............................................................................................ 34
4.3 COMPARISON WITH GPS DATA ................................................................................................. 35
4.4 IMPROVING RESULTS WITH BETTER IMAGES....................................................................... 36
5 DISCUSSION........................................................................................................................................... 40
5.1 ANALYSIS OF IMAGE QUALITY................................................................................................. 40
5.2 ANALYSIS OF EXTERNAL FACTORS ON IMAGE DATA........................................................ 42
6 SUMMARY AND CONCLUSIONS ...................................................................................................... 44
REFERENCES ........................................................................................................................................... 45
APPENDICES ............................................................................................................................................ 47
APPENDIX A: LISTING OF FEATURE TRACKING CODE.............................................................. 47
APPENDIX B: LISTING OF MATLAB OPTIC FLOW CALCULATION SCRIPT ............................ 54
APPENDIX C: WINGTIP CAMERA SPECIFICATIONS .................................................................... 55
LIST OF FIGURES
Figure 1: OV-10A in flight (Photo courtesy of NASA Langley Research Center) .......... 12
Figure 2: Wingtip camera placement (NASA Langley Research Center)........................ 13
Figure 3: Field of view of wingtip cameras (NASA Langley Research Center) .............. 13
Figure 4: An original flight sequence image with image imperfections illustrated.......... 14
Figure 5: Original image after cropping to remove artifacts introduced in image
acquisition. World coordinate axes x, y, and z are shown along with image coordinate
axes u and v. Note that the scene extends forward into the –z direction.......................... 15
Figure 6: Example image pair from original image sequence. Williamsburg-Jamestown
Airport, left camera – image spacing 8 frames. ................................................................ 16
Figure 7: KLT program image output – image spacing 8 frames, 21 of the 25 requested
features successfully tracked (red highlighting added manually to feature markers)....... 17
Figure 8: From left to right – Williamsburg-Jamestown landing images 000, 050, and 100
from left wingtip camera................................................................................................... 18
Figure 9: Number of matches versus increasing gap between images ............................. 18
Figure 10: Error in calculated direction of translation, with separate rotations around x, y,
and z axes. First synthetic data set. .................................................................................. 21
Figure 11: Simulated motion with second synthetic data set. Blue asterisks indicate
image coordinates in the initial position; red dots indicate coordinates after simulated
motion. (a) Translation only, (b) with 5 degree change of pitch, (c) with 5 degree change
of heading, (d) with 5 degree change of roll..................................................................... 22
Figure 12: Calculated error in direction of translation, varying rotation around x, y, and z
axes. Second synthetic data set – small translation.......................................................... 23
Figure 13: Calculated error in direction of translation, varying rotation around x, y, and z
axes. First synthetic data set – larger translation. ............................................................ 24
Figure 14: Average error as a function of feature position uncertainty (second synthetic
data set, 8 data points, 100 runs for each 0.1 degree step)................................................ 26
Figure 15: Highest observed error as a function of feature position uncertainty.............. 27
Figure 16: Error in direction of travel calculation as image center is moved (second
synthetic data set).............................................................................................................. 28
Figure 17: Average error as a function of feature position uncertainty - 16 data points .. 29
Figure 18: Maximum error as a function of feature position uncertainty - 16 data points ... 29
Figure 19: Camera and plane coordinate systems............................................................. 31
Figure 20: (a) Camera motion constraint mechanism; (b) Close-up of degree wheel ...... 37
Figure 21: A sample image (pict4788) from the half-degree stepped image sequence.... 38
Figure 22: Image interlacing. (a) Original image; (b) After deinterlacing ...................... 40
Figure 23: Radial lens distortion (exaggerated): (a) undistorted image, (b) barrel
distortion, (c) pincushion distortion.................................................................................. 42
Figure 24: Image defects caused by glare (Newport News/Williamsburg International
Airport) ............................................................................................................................. 43