Autonomous Driving in Urban
Environments: Boss and the
Urban Challenge
Chris Urmson, Joshua Anhalt, Drew Bagnell,
Christopher Baker, Robert Bittner,
M. N. Clark, John Dolan, Dave Duggins,
Tugrul Galatali, Chris Geyer,
Michele Gittleman, Sam Harbaugh,
Martial Hebert, Thomas M. Howard,
Sascha Kolski, Alonzo Kelly,
Maxim Likhachev, Matt McNaughton,
Nick Miller, Kevin Peterson, Brian Pilnick,
Raj Rajkumar, Paul Rybski, Bryan Salesky,
Young-Woo Seo, Sanjiv Singh, Jarrod Snider,
Anthony Stentz, William “Red” Whittaker,
Ziv Wolkowicki, and Jason Ziglar
Carnegie Mellon University
Pittsburgh, Pennsylvania 15213
e-mail: curmson@ri.cmu.edu
Hong Bae, Thomas Brown, Daniel Demitrish,
Bakhtiar Litkouhi, Jim Nickolaou,
Varsha Sadekar, and Wende Zhang
General Motors Research and Development
Warren, Michigan 48090
Joshua Struble and Michael Taylor
Caterpillar, Inc.
Peoria, Illinois 61656
Michael Darms
Continental AG
Auburn Hills, Michigan 48326
Dave Ferguson
Intel Research
Pittsburgh, Pennsylvania 15213
Received 22 February 2008; accepted 19 June 2008
Journal of Field Robotics 25(8), 425–466 (2008) © 2008 Wiley Periodicals, Inc.
Published online in Wiley InterScience (www.interscience.wiley.com). DOI: 10.1002/rob.20255
Boss is an autonomous vehicle that uses on-board sensors (global positioning system,
lasers, radars, and cameras) to track other vehicles, detect static obstacles, and localize
itself relative to a road model. A three-layer planning system combines mission, behav-
ioral, and motion planning to drive in urban environments. The mission planning layer
considers which street to take to achieve a mission goal. The behavioral layer determines
when to change lanes and precedence at intersections and performs error recovery maneu-
vers. The motion planning layer selects actions to avoid obstacles while making progress
toward local goals. The system was developed from the ground up to address the require-
ments of the DARPA Urban Challenge using a spiral system development process with
a heavy emphasis on regular, regressive system testing. During the National Qualifica-
tion Event and the 85-km Urban Challenge Final Event, Boss demonstrated some of its
capabilities, qualifying first and winning the challenge.
1. INTRODUCTION
In 2003 the Defense Advanced Research Projects
Agency (DARPA) announced the first Grand Chal-
lenge. The goal was to develop autonomous vehi-
cles capable of navigating desert trails and roads at
high speeds. The competition was generated as a re-
sponse to a congressional mandate that a third of
U.S. military ground vehicles be unmanned by 2015.
Although there had been a series of ground vehi-
cle research programs, the consensus was that exist-
ing research programs would be unable to deliver
the technology necessary to meet this goal (Commit-
tee on Army Unmanned Ground Vehicle Technology,
2002). DARPA decided to rally the field to meet this
need.
The first Grand Challenge was held in March
2004. Though no vehicle was able to complete the
challenge, a vehicle named Sandstorm went the farthest,
setting a new benchmark for autonomous capability
and providing a template on how to win the
challenge (Urmson et al., 2004). The next year, five ve-
hicles were able to complete a similar challenge, with
Stanley (Thrun et al., 2006) finishing minutes ahead
of Sandstorm and H1ghlander (Urmson et al., 2006)
to complete the 244-km race in a little under 7 h.
After the success of the Grand Challenges,
DARPA organized a third event: the Urban Chal-
lenge. The challenge, announced in April 2006, called
for autonomous vehicles to drive 97 km through an
urban environment, interacting with other moving
vehicles and obeying the California Driver Hand-
book. Interest in the event was immense, with 89
teams from around the world registering interest in
competing. The teams were a mix of industry and
academics, all with enthusiasm for advancing au-
tonomous vehicle capabilities.
To compete in the challenge, teams had to pass
a series of tests. The first was to provide a credible
technical paper describing how they would imple-
ment a safe and capable autonomous vehicle. Based
on these papers, 53 teams were given the opportu-
nity to demonstrate firsthand for DARPA their ability
to navigate simple urban driving scenarios including
passing stopped cars and interacting appropriately
at intersections. After these events, the field was fur-
ther narrowed to 36 teams who were invited to par-
ticipate in the National Qualification Event (NQE) in
Victorville, California. Of these teams, only 11 would
qualify for the Urban Challenge Final Event (UCFE).
This article describes the algorithms and mechanisms
that make up Boss (see Figure 1), an autonomous
vehicle capable of driving safely in traffic
at speeds up to 48 km/h. Boss is named after Charles
“Boss” Kettering, a luminary figure in the automotive
industry, with inventions as wide ranging as the all-
electric starter for the automobile, the coolant Freon,
and the premature-infant incubator. Boss was devel-
oped by the Tartan Racing Team, which was com-
posed of students, staff, and researchers from sev-
eral entities, including Carnegie Mellon University,
General Motors, Caterpillar, Continental, and Intel.
This article begins by describing the autonomous
vehicle and sensors and then moves on to a discus-
sion of the algorithms and approaches that enabled it
to drive autonomously.
The motion planning subsystem (described in
Section 3) consists of two planners, each capable of
avoiding static and dynamic obstacles while achiev-
ing a desired goal. Two broad scenarios are consid-
ered: structured driving (road following) and un-
structured driving (maneuvering in parking lots).
For structured driving, a local planner generates
trajectories to avoid obstacles while remaining in its
Figure 1. Boss, the autonomous Chevy Tahoe that won the 2007 DARPA Urban Challenge.
lane. For unstructured driving, such as entering/
exiting a parking lot, a planner with a four-
dimensional search space (position, orientation, di-
rection of travel) is used. Regardless of which plan-
ner is currently active, the result is a trajectory that,
when executed by the vehicle controller, will safely
drive toward a goal.
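The four-dimensional search space for unstructured driving can be pictured as a discretized lattice state. The sketch below is illustrative only, not the team's implementation: the grid resolution, number of discrete headings, and motion primitives are all assumptions.

```python
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class LatticeState:
    """Discretized 4D state for unstructured (parking-lot) planning:
    grid position, discrete heading, and direction of travel."""
    x: int           # grid cell index (resolution is an assumption)
    y: int
    heading: int     # one of n_headings discrete headings
    reverse: bool    # direction of travel: False = forward, True = reverse

def successors(s: LatticeState, n_headings: int = 16):
    """Assumed motion primitives: hold heading or turn one heading step
    left/right while moving in the current direction of travel, plus an
    in-place direction switch. Steps are crudely snapped to the grid."""
    out = []
    for dh in (-1, 0, 1):
        h = (s.heading + dh) % n_headings
        ang = 2.0 * math.pi * h / n_headings
        step = -1 if s.reverse else 1
        out.append(LatticeState(s.x + step * round(math.cos(ang)),
                                s.y + step * round(math.sin(ang)),
                                h, s.reverse))
    out.append(LatticeState(s.x, s.y, s.heading, not s.reverse))
    return out
```

An A*-style search over such states, with edge costs that penalize direction switches and proximity to obstacles, is one common way to realize a planner of this kind.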
The perception subsystem (described in
Section 4) processes and fuses data from Boss’s
multiple sensors to provide a composite model of the
world to the rest of the system. The model consists
of three main parts: a static obstacle map, a list of
the moving vehicles in the world, and the location of
Boss relative to the road.
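The shape of this composite model can be sketched as a simple data structure. The field names and types below are assumptions for illustration, not the team's actual interfaces.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TrackedVehicle:
    position: Tuple[float, float]   # (x, y) in a world frame, meters
    velocity: Tuple[float, float]   # (vx, vy), m/s

@dataclass
class WorldModel:
    """Composite world model per the text: a static obstacle map, the set
    of tracked moving vehicles, and Boss's pose relative to the road."""
    static_obstacle_map: List[List[bool]]   # occupancy grid, True = occupied
    moving_vehicles: List[TrackedVehicle]
    lane_offset_m: float                    # lateral offset from lane center
    heading_error_rad: float                # heading relative to lane direction
```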
The mission planner (described in Section 5) com-
putes the cost of all possible routes to the next mission
checkpoint given knowledge of the road network.
The mission planner reasons about the optimal path
to a particular checkpoint much as a human would
plan a route from his or her current position to a desti-
nation, such as a grocery store or gas station. The mis-
sion planner compares routes based on knowledge
of road blockages, the maximum legal speed limit,
and the nominal time required to make one maneu-
ver versus another. For example, a route that allows
a higher overall speed but incorporates a U-turn may
actually be slower than a route with a lower overall
speed but that does not require a U-turn.
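The U-turn trade-off can be made concrete with a small shortest-time search. This is an illustrative sketch, not the team's planner; the road network, speeds, and the 60-s U-turn penalty are invented numbers.

```python
import heapq

def route_time(graph, start, goal):
    """Dijkstra over travel time. Each edge is (neighbor, length_m,
    speed_mps, maneuver_penalty_s); edge cost = length/speed + penalty."""
    pq = [(0.0, start, [start])]
    done = set()
    while pq:
        t, node, path = heapq.heappop(pq)
        if node == goal:
            return t, path
        if node in done:
            continue
        done.add(node)
        for nbr, length, speed, penalty in graph.get(node, []):
            if nbr not in done:
                heapq.heappush(pq, (t + length / speed + penalty, nbr, path + [nbr]))
    return float("inf"), []

# Two routes to the checkpoint: faster roads that require a U-turn
# (60-s penalty) vs. slower roads with no maneuver.
graph = {
    "start": [("fast_leg", 1000, 15.0, 0.0), ("slow_leg", 1000, 12.0, 0.0)],
    "fast_leg": [("checkpoint", 1000, 15.0, 60.0)],
    "slow_leg": [("checkpoint", 1000, 12.0, 0.0)],
}
t, path = route_time(graph, "start", "checkpoint")
# The slower but U-turn-free route wins here (~166.7 s vs. ~193.3 s).
```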
The behavioral system (described in Section 6)
formulates a problem definition for the motion plan-
ner to solve based on the strategic information pro-
vided by the mission planner. The behavioral subsys-
tem makes tactical decisions to execute the mission
plan and handles error recovery when there are prob-
lems. The behavioral system is roughly divided into
three subcomponents: lane driving, intersection handling,
and goal selection. The roles of the first two
subcomponents are self-explanatory. Goal selection is
responsible for distributing execution tasks to the other
behavioral components or the motion layer and for
selecting actions to handle error recovery.
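One way to picture goal selection's role is as a dispatcher over the current driving context. The contexts and handler interface below are assumptions for illustration only; they are not the team's design.

```python
from enum import Enum, auto

class DrivingContext(Enum):
    LANE = auto()           # nominal on-road driving
    INTERSECTION = auto()   # precedence handling at a stop line
    RECOVERY = auto()       # error recovery after a failure

def goal_selection(context, handlers):
    """Distribute execution to the lane-driving or intersection-handling
    subcomponent, or pick a recovery action, per the current context."""
    return handlers[context]()

handlers = {
    DrivingContext.LANE: lambda: "motion goal: point ahead in current lane",
    DrivingContext.INTERSECTION: lambda: "wait for precedence, then proceed",
    DrivingContext.RECOVERY: lambda: "motion goal: unconstrained pose nearby",
}
```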
The software infrastructure and tools that enable
the other subsystems are described in Section 7. The
software infrastructure provides the foundation upon
which the algorithms are implemented. Additionally,
the infrastructure provides a toolbox of components
for online data logging, offline data log playback, and
visualization utilities that aid developers in building
and troubleshooting the system. A run-time execu-
tion framework is provided that wraps around algo-
rithms and provides interprocess communication, ac-
cess to configurable parameters, a common clock, and
a host of other utilities.
Testing and performance in the NQE and UCFE
are described in Sections 8 and 9. During the develop-
ment of Boss, the team put a significant emphasis on
evaluating performance and finding weaknesses to
ensure that the vehicle would be ready for the Urban
Challenge. During the qualifiers and final challenge,
Boss performed well, but made a few mistakes.
Despite these mistakes and a very capable field of
competitors, Boss qualified for the final event and
won the Urban Challenge.
2. BOSS
Boss is a 2007 Chevrolet Tahoe modified for au-
tonomous driving. Modifications were driven by the
need to provide computer control and also to support
safe and efficient testing of algorithms. Thus, modi-
fications can be classified into two categories: those
for automating the vehicle and those that made test-
ing either safer or easier. A commercial off-the-shelf
drive-by-wire system was integrated into Boss with
electric motors to turn the steering column, depress
the brake pedal, and shift the transmission. The third-
row seats and cargo area were replaced with electron-
ics racks, the steering was modified to remove excess
compliance, and the brakes were replaced to allow
faster braking and reduce heating.
Boss maintains normal human driving controls
(steering wheel and brake and gas pedals) so that
a safety driver can quickly and easily take control
during testing. Boss has its original seats in addi-
tion to a custom center console with power and net-
work outlets, which enable developers to power lap-
tops and other accessories, supporting longer and
more productive testing. A welded tube roll cage
was also installed to protect human occupants in the
event of a collision or rollover during testing. For un-
manned operation a safety radio is used to engage
autonomous driving, pause, or disable the vehicle.
Boss has two independent power buses. The
stock Tahoe power bus remains intact with its 12-
V dc battery and harnesses but with an upgraded
high-output alternator. An auxiliary 24-V dc power
system provides power for the autonomy hardware.
The auxiliary system consists of a belt-driven alter-
nator that charges a 24-V dc battery pack that is in-
verted to supply a 120-V ac bus. Shore power, in the
form of battery chargers, enables Boss to remain fully
powered when in the shop with the engine off. Ther-
mal control is maintained using the stock vehicle air-
conditioning system.
For computation, Boss uses a CompactPCI chassis
with ten 2.16-GHz Core2Duo processors, each
with 2 GB of memory and a pair of gigabit Ethernet
ports. Each computer boots off of a 4-GB flash
drive, reducing the likelihood of a disk failure. Two
of the machines also mount 500-GB hard drives for
data logging. Each computer is also time synchro-
nized through a custom pulse-per-second adaptor
board.
Boss uses a combination of sensors to provide
the redundancy and coverage necessary to navigate
safely in an urban environment. Active sensing is
used predominantly, as can be seen in Table I. The de-
cision to emphasize active sensing was primarily due
to the team’s skills and the belief that in the Urban
Challenge direct measurement of range and target
velocity was more important than getting richer, but
more difficult to interpret, data from a vision system.
The configuration of sensors on Boss is illustrated in
Figure 2. One of the novel aspects of this sensor con-
figuration is the pair of pointable sensor pods located
above the driver and front passenger doors. Each pod
contains an ARS 300 radar and ISF 172 LIDAR. By
pointing these pods, Boss can adjust its field of re-
gard to cover crossroads that may not otherwise be
observed by a fixed-sensor configuration.
3. MOTION PLANNING
The motion planning layer is responsible for execut-
ing the current motion goal issued from the behav-
iors layer. This goal may be a location within a road
lane when performing nominal on-road driving, a lo-
cation within a zone when traversing through a zone,
or any location in the environment when performing
error recovery. The motion planner constrains itself
based on the context of the goal to abide by the rules
of the road.
In all cases, the motion planner creates a path
toward the desired goal and then tracks this path
by generating a set of candidate trajectories that fol-
low the path to various degrees and selecting from
this set the best trajectory according to an evaluation
function. This evaluation function differs depending
on the context but includes consideration of static
and dynamic obstacles, curbs, speed, curvature, and
deviation from the path. The selected trajectory can
then be directly executed by the vehicle. For more
details on all aspects of the motion planner, see
Ferguson, Howard, and Likhachev (2008, submitted).
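The generate-candidates-and-score loop can be sketched as follows. The cost terms, weights, and 1-m collision check are placeholders, not the actual evaluation function described above.

```python
def evaluate(traj, obstacles, weights):
    """Score one candidate trajectory; lower is better. A trajectory is a
    list of (x, y, curvature, path_offset) samples. Any sample that comes
    within ~1 m of an obstacle makes the trajectory inadmissible."""
    cost = 0.0
    for x, y, curvature, path_offset in traj:
        if any(abs(x - ox) < 1.0 and abs(y - oy) < 1.0 for ox, oy in obstacles):
            return float("inf")
        cost += weights["offset"] * abs(path_offset)    # deviation from path
        cost += weights["curvature"] * abs(curvature)   # smoothness
    return cost

def select_trajectory(candidates, obstacles, weights):
    """Pick the cheapest admissible candidate, or None if all collide."""
    best = min(candidates, key=lambda t: evaluate(t, obstacles, weights))
    return best if evaluate(best, obstacles, weights) < float("inf") else None

# A straight candidate that runs through an obstacle vs. a swerving one.
straight = [(float(i), 0.0, 0.0, 0.0) for i in range(5)]
swerve = [(float(i), 1.5, 0.1, 1.5) for i in range(5)]
best = select_trajectory([straight, swerve], obstacles=[(2.0, 0.0)],
                         weights={"offset": 1.0, "curvature": 10.0})
# The swerving trajectory is selected; the straight one collides.
```

Returning None when every candidate collides mirrors the text's division of labor: the motion planner reports failure, and the behavioral layer's error recovery must supply a new goal.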
3.1. Trajectory Generation
A model-predictive trajectory generator originally
presented in Howard and Kelly (2007) is responsible
for generating dynamically feasible actions from an
initial state x to a desired terminal state. In general,
this algorithm can be applied to solve the problem
of generating a set of parameterized controls u(p,x)
Table I. Description of the sensors incorporated into Boss.

Applanix POS-LV 220/420 GPS/IMU (APLX)
  • Submeter accuracy with Omnistar VBS corrections
  • Tightly coupled inertial/GPS bridges GPS outages

SICK LMS 291-S05/S14 LIDAR (LMS)
  • 180/90 deg × 0.9 deg FOV with 1/0.5-deg angular resolution
  • 80-m maximum range

Velodyne HDL-64 LIDAR (HDL)
  • 360 × 26-deg FOV with 0.1-deg angular resolution
  • 70-m maximum range

Continental ISF 172 LIDAR (ISF)
  • 12 × 3.2 deg FOV
  • 150-m maximum range

IBEO Alasca XT LIDAR (XT)
  • 240 × 3.2 deg FOV
  • 300-m maximum range

Continental ARS 300 Radar (ARS)
  • 60/17 deg × 3.2 deg FOV
  • 60-m/200-m maximum range

Point Grey Firefly (PGF)
  • High-dynamic-range camera
  • 45-deg FOV
Figure 2. The mounting location of sensors on the vehicle; refer to Table I for abbreviations used in this figure.
that satisfy state constraints C(x) whose dynamics
can be expressed in the form of a set of differential
equations f:

    ẋ = f[x, u(p, x)].    (1)

To navigate urban environments, position and
heading terminal state constraints are typically
required to properly orient a vehicle along the road.
The constraint equation x_C is the difference between
the target terminal state constraints and the integral
of the model dynamics:

    x_C = [x_C  y_C  θ_C]^T,    (2)

    C(x) − x_C − ∫₀^{t_f} ẋ(x, p) dt = 0.    (3)
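A minimal numerical sketch of this model-predictive idea, assuming a unicycle model and a linear curvature profile κ(t) = k0 + k1·t as the parameterized control (the actual parameterization in Howard and Kelly (2007) is richer): forward-integrate the dynamics, measure the terminal-state residual of Eq. (3), and adjust the parameters by Newton steps until the y and θ targets are met.

```python
import math

def integrate(p, v=5.0, tf=4.0, dt=0.01):
    """Forward-integrate unicycle dynamics x' = v cos(theta),
    y' = v sin(theta), theta' = v * kappa(t), with kappa(t) = p[0] + p[1]*t.
    Returns the terminal state (x, y, theta)."""
    x = y = theta = 0.0
    for i in range(int(round(tf / dt))):
        kappa = p[0] + p[1] * (i * dt)
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += v * kappa * dt
    return x, y, theta

def solve_terminal(y_c, theta_c, p=(0.0, 0.0), iters=8, eps=1e-4):
    """Crude shooting: Newton steps on (k0, k1) with a finite-difference
    2x2 Jacobian until the terminal (y, theta) residual vanishes."""
    k0, k1 = p
    for _ in range(iters):
        _, y, th = integrate((k0, k1))
        r = (y_c - y, theta_c - th)
        _, ya, tha = integrate((k0 + eps, k1))   # perturb k0
        _, yb, thb = integrate((k0, k1 + eps))   # perturb k1
        j11, j12 = (ya - y) / eps, (yb - y) / eps
        j21, j22 = (tha - th) / eps, (thb - th) / eps
        det = j11 * j22 - j12 * j21
        k0 += (r[0] * j22 - r[1] * j12) / det
        k1 += (r[1] * j11 - r[0] * j21) / det
    return k0, k1

# A lane-change-like terminal constraint: 1-m lateral offset, zero heading.
k0, k1 = solve_terminal(1.0, 0.0)
```

Here x is left unconstrained for simplicity; constraining all three terminal states, as in Eq. (2), requires a richer control parameterization so the system is not overdetermined.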
The fidelity of the vehicle model directly cor-
relates to the effectiveness of a model-predictive
planning approach. The vehicle model describes