Unmanned Helicopter Flight Stabilization
Vision algorithms were implemented on an FPGA to supplement the limited data of a standard IMU, creating a previously unrealized, completely on-board vision system for micro-UAVs. The on-board vision system consists of an FPGA board and a custom interface daughter-board, which allow it to provide data on drifting movements of the micro-UAV that IMUs cannot detect. The algorithms implemented for the vision system include a Harris feature detector, a template-matching feature correlator, similarity-constrained homography by random sample consensus (RANSAC), color segmentation, radial distortion correction, and an extended Kalman filter with a standard-deviation outlier rejection technique (SORT). This vision system was designed specifically as an on-board vision solution for determining the movement of micro-UAVs that have severe size, weight, and power limitations. Results show that the vision system is capable of real-time on-board image processing with sufficient accuracy to allow a micro-UAV to control itself without power or data tethers to a ground station.
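The Harris feature detector at the front of this pipeline can be summarized in a few lines. The following is a minimal NumPy sketch for illustration only; the window size and sensitivity constant k are common textbook choices, not the parameters of the FPGA implementation described above:

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response for a grayscale float image.

    Illustrative sketch: 3x3 averaging window and k=0.04 are
    assumed parameters, not those of the FPGA implementation."""
    # Image gradients via finite differences (axis 0 = y, axis 1 = x).
    Iy, Ix = np.gradient(img.astype(float))
    # Products of gradients for the structure tensor.
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    def box(a):
        # 3x3 box filter with edge padding.
        p = np.pad(a, 1, mode="edge")
        n, m = a.shape
        return sum(p[i:i + n, j:j + m]
                   for i in range(3) for j in range(3)) / 9.0

    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)
    # Corner response: det(M) - k * trace(M)^2.
    # Large positive values indicate corners; negative values, edges.
    return Sxx * Syy - Sxy * Sxy - k * (Sxx + Syy) ** 2
```

In use, local maxima of the response above a threshold are kept as features, which the template-matching correlator then tracks between frames.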
Our latest work on this project proposes to use a smartphone as the sole computational device to stabilize and control a quad-rotor. The goal is to use the sensors readily available in a smartphone, such as the GPS receiver, accelerometer, rate gyros, and camera, to support vision-related tasks such as flight stabilization, estimation of height above ground, target tracking, obstacle detection, and surveillance. We use a quad-rotor platform built in our Robotic Vision Lab for development and experiments. An Android smartphone is connected through its USB port to external hardware containing a microprocessor and circuitry that generate pulse-width modulation signals to control the brushless motors on the quad-rotor. The high-resolution camera on the smartphone is used to detect and track features to maintain a desired altitude. The vision algorithms implemented include template matching, the Harris feature detector, RANSAC similarity-constrained homography, and color segmentation. The other sensors are used to control the yaw, pitch, and roll of the quad-rotor. This smartphone-based system is able to stabilize and control micro-UAVs and is ideal for micro-UAVs with size, weight, and power limitations.
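The RANSAC similarity-constrained homography step can be illustrated with a simplified sketch: rather than a full 8-DOF homography, it fits a 4-DOF similarity transform (scale, rotation, translation) to feature correspondences while rejecting outlier matches. The sketch below is illustrative only, not the on-board implementation; the 2-point minimal sample, iteration count, and pixel tolerance are assumed parameters:

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares 2D similarity transform between (N,2) point sets.

    Uses the complex-number form: a similarity maps z -> a*z + b,
    where |a| is scale, arg(a) is rotation, and b is translation."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    zs = (src[:, 0] - mu_s[0]) + 1j * (src[:, 1] - mu_s[1])
    zd = (dst[:, 0] - mu_d[0]) + 1j * (dst[:, 1] - mu_d[1])
    a = (np.conj(zs) @ zd) / (np.conj(zs) @ zs)
    b = (mu_d[0] + 1j * mu_d[1]) - a * (mu_s[0] + 1j * mu_s[1])
    return a, b

def ransac_similarity(src, dst, iters=200, tol=2.0, seed=0):
    """RANSAC over 2-point samples; returns (a, b, inlier mask).

    iters, tol (pixels), and seed are illustrative parameters."""
    rng = np.random.default_rng(seed)
    zsrc = src[:, 0] + 1j * src[:, 1]
    zdst = dst[:, 0] + 1j * dst[:, 1]
    best_inliers = np.zeros(len(src), bool)
    for _ in range(iters):
        idx = rng.choice(len(src), 2, replace=False)
        a, b = fit_similarity(src[idx], dst[idx])
        # Correspondences within tol pixels of the model are inliers.
        inliers = np.abs(a * zsrc + b - zdst) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on all inliers of the best minimal model.
    a, b = fit_similarity(src[best_inliers], dst[best_inliers])
    return a, b, best_inliers
```

The recovered inter-frame rotation and translation serve as the drift measurement that the IMU alone cannot provide.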
Graduate Students:
Alok Desai, Aaron Dennis, Spencer Fowers, Kirt Lillywhite, and Beau Tippetts
Publications:
- A. Desai, D.J. Lee, J.A. Moore, and Y.P. Chang, “Stabilization and Control of a Quad-Rotor Helicopter Using Smartphone Device,” SPIE Electronic Imaging, Intelligent Robots and Computer Vision XXX: Algorithms and Techniques, San Francisco, CA, USA, February 3-7, 2013.
- B.J. Tippetts, D.J. Lee, S.G. Fowers, and J.K. Archibald, “Real-Time Vision Sensor for an Autonomous Hovering Micro Unmanned Aerial Vehicle,” AIAA Journal of Aerospace Computing, Information, and Communication, vol. 6, pp. 570-584, October 2009.
- S.G. Fowers, B.J. Tippetts, D.J. Lee, and J.K. Archibald, “Vision-guided Autonomous Quad-rotor Helicopter Flight Stabilization and Control,” AUVSI's Unmanned Systems North America 2008, San Diego, CA, USA, June 10-12, 2008.
- S.G. Fowers, D.J. Lee, B.J. Tippetts, K.D. Lillywhite, A.W. Dennis, and J.K. Archibald, “Vision Aided Stabilization and the Development of a Quad-Rotor Micro UAV,” The 7th IEEE International Symposium on Computational Intelligence in Robotics and Automation (CIRA), pp. 143-148, Jacksonville, FL, USA, June 20-23, 2007.