Auto-mation
A robotic car bedecked with Lincoln Laboratory sensors takes on DARPA's Urban Challenge.
From cruise control to antilock brakes to hybrid cars' fancy energy management programs, drivers have steadily ceded more and more control over their automobiles to computers. The logical endpoint of such developments—a car that drives itself—has long remained a futuristic mirage. But a team from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) and Lincoln Laboratory recently demonstrated that a self-driving vehicle is no mirage when they participated in the Defense Advanced Research Projects Agency (DARPA) Urban Challenge. The mission: design a fully automated, independent, self-thinking, self-correcting vehicle that could operate in the complex and cluttered environs of a modern city.
Team MIT's TALOS autonomous vehicle, readied for the DARPA Urban Challenge, is outfitted with a multitude of sensors. The sensors scan and sweep across an intersection to show approaching and standing traffic as well as curbs and lane markings, which TALOS's algorithm must assess before continuing.
Earlier DARPA challenges required autonomous cars to make their way around the much simpler courses of deserts or mountains. MIT took a pass on those tests because they put relatively little demand on the situational awareness technologies that Lincoln Laboratory and MIT specialize in. But DARPA realized that the true test of an automated vehicle was city traffic. "In the spring of 2006, there was chatter about an urban challenge," says Seth Teller, a professor in MIT's Department of Electrical Engineering and Computer Science and a member of CSAIL. Urged by several students, Teller, CSAIL colleague and mechanical engineering professor John Leonard, MIT aeronautics and astronautics professor Jonathan How, and Olin College professor David Barrett joined forces with Robert Galejs, Jonathan Williams, and Siddhartha Krishnamurthy of Lincoln Laboratory's Advanced Capabilities and Systems Group and several other partners to develop a robotic vehicle. They named it TALOS, after the bronze automaton of Greek mythology that guarded the island of Crete.
The Urban Challenge was closely tied to dynamic, real-world driving. The vehicles had to follow the roads, queue up at intersections, merge into traffic, park, and perform U-turns. Not only did TALOS have to sense its location on the map and travel a specific route, it also had to observe other vehicles (both human-driven and robotic), lane markings, curbs, and obstructions. Consider the problem of a parked car that fills the driving lane. Human drivers evaluate the situation, determine whether they have enough time to move into the opposite lane to pass the parked car, and proceed. The automated vehicles were expected to make the same kinds of decisions, on roughly the same timescale, as human drivers. (DARPA did simplify the task a bit by eliminating pedestrians and traffic-sign sensing.)
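The article stops short of describing TALOS's actual passing logic, but the judgment it sketches, deciding whether there is time to occupy the opposite lane, is easy to make concrete. The Python sketch below is purely illustrative; every name and threshold in it is invented, not taken from the TALOS codebase.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class OncomingTrack:
    distance_m: float  # distance to the oncoming vehicle, meters
    speed_mps: float   # its closing speed, meters per second

def can_pass(lane_blocked: bool,
             oncoming: List[OncomingTrack],
             pass_duration_s: float = 6.0,
             safety_margin_s: float = 2.0) -> bool:
    """Return True if there is time to occupy the opposite lane and return.

    Mirrors the human judgment in the text: will the nearest oncoming car
    arrive before the passing maneuver, plus a margin, is finished?
    """
    if not lane_blocked:
        return False  # no reason to leave the lane
    for track in oncoming:
        if track.speed_mps <= 0:
            continue  # stationary or receding vehicles are not closing
        if track.distance_m / track.speed_mps < pass_duration_s + safety_margin_s:
            return False  # it would arrive mid-maneuver
    return True
```

A real planner would reevaluate a check like this many times a second as sensor tracks update, rather than deciding once and committing blindly.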
Each vehicle was required to follow the curve of the road from one GPS point to the next. But because DARPA provided only sparse GPS waypoints and minimal map information for the course, TALOS relied on a tiling of radars and other sensors—all tightly positioned near one another and each pointing in a specific direction—to observe and interpret its immediate surroundings. Teller gathered collaborators in industry, academia, and Lincoln Laboratory with one fundamental goal in mind: a single algorithm that would handle all situations. A good driver, human or algorithm, "translates the laws of the road into a recipe for good driving," he says. After building the vehicle and testing the sensors, the team needed several months to encode approximate rules of the road into the vehicle.
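Teller's goal of a single algorithm suggests one common design, which we assume here for illustration rather than as a description of TALOS: encode each rule of the road as a penalty function, then let one scoring loop apply every rule to every candidate path. A minimal sketch:

```python
from dataclasses import dataclass

@dataclass
class CandidatePath:
    crosses_center_line: bool
    passing_cleared: bool        # oncoming lane verified clear
    hits_obstacle: bool
    distance_toward_next_waypoint: float

def penalty_stay_in_lane(p: CandidatePath) -> float:
    # Crossing the center line is heavily penalized unless passing is cleared.
    return 1000.0 if p.crosses_center_line and not p.passing_cleared else 0.0

def penalty_obstacles(p: CandidatePath) -> float:
    # Collisions are effectively forbidden.
    return 1e6 if p.hits_obstacle else 0.0

def penalty_progress(p: CandidatePath) -> float:
    # Negative penalty, i.e., a reward, for progress toward the waypoint.
    return -p.distance_toward_next_waypoint

RULES = [penalty_stay_in_lane, penalty_obstacles, penalty_progress]

def best_path(candidates: list) -> CandidatePath:
    """Apply every rule to every candidate and return the cheapest path."""
    return min(candidates, key=lambda p: sum(rule(p) for rule in RULES))

# The blocked-lane scenario from the text: the legal passing maneuver wins.
paths = [
    CandidatePath(False, False, True,  20.0),  # stay in lane, hit parked car
    CandidatePath(True,  True,  False, 18.0),  # pass after clearing traffic
]
assert best_path(paths) is paths[1]
```

The appeal of this structure is that adding a new law of the road means adding one function, not rewriting the planner.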
TALOS is preparing to negotiate a curve in the road to reach its next destination, the green marker, which is a DARPA-supplied GPS waypoint. The optical image is overlaid with a proposed path of circled destination points (white indicates that the lane-marker detection and lane-center estimation are functioning, and black indicates subsequent steps without current estimations), directional lines of motion to those points, and yellow lane and curb markers.
The TALOS team started with a Land Rover and automated the gas, brake, steering, and shifter controls. Sensors included one short-range 360° lidar, a dozen longer-range planar lidars with 180° fields of view (seven were oriented downward as if they were push brooms and five were oriented horizontally), 15 radars each with an 18° field of view, five wide field-of-view cameras, and one narrow field-of-view camera. The massive computer cluster processing the stream of sensor data required an air conditioner, a gas-powered generator, and a battery supply (to keep the computer running whenever the generator had to be shut down—for example, for refueling).
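For reference, that inventory can be restated as configuration data. The sketch below only reorganizes the numbers reported above; the structure and field names are ours, not the team's.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SensorSpec:
    kind: str                        # "lidar", "radar", or "camera"
    count: int
    fov_deg: Optional[float] = None  # horizontal field of view, if reported
    note: str = ""

# TALOS's sensor suite as reported in the text.
TALOS_SENSORS = [
    SensorSpec("lidar",   1, 360.0, "short-range, full surround"),
    SensorSpec("lidar",   7, 180.0, "planar, tilted down like push brooms"),
    SensorSpec("lidar",   5, 180.0, "planar, horizontal"),
    SensorSpec("radar",  15,  18.0, "tiled for long-range coverage"),
    SensorSpec("camera",  5, None,  "wide field of view"),
    SensorSpec("camera",  1, None,  "narrow field of view"),
]
```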
Combining all the sensors' inputs, the vehicle's algorithm defined a series of possible routes, selected the best one that complied with the rules of the road, and reevaluated its situation about ten times per second. The algorithm's long-term goal was to complete the several-mile mission defined by DARPA. Its intermediate goal was to get the vehicle to the next GPS point on the route, tens of meters ahead. And its short-term goal? "Avoid the pothole in front of the vehicle," Teller says.
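That description, a roughly ten-hertz cycle serving three nested goals, maps naturally onto a replanning loop. The sketch below is a hypothetical rendering of that cycle; the callables it takes are invented stand-ins, not TALOS interfaces.

```python
import time
from typing import Callable, List

def control_loop(sense: Callable[[], dict],
                 propose_routes: Callable[[dict], List[list]],
                 select_legal_best: Callable[[List[list]], list],
                 execute_first_step: Callable[[list], None],
                 mission_complete: Callable[[], bool],
                 rate_hz: float = 10.0) -> None:
    """Replan at ~10 Hz: sense, propose, select a rule-compliant route,
    act on its first step, and repeat until the mission is done."""
    period = 1.0 / rate_hz
    while not mission_complete():
        t0 = time.monotonic()
        world = sense()                       # fuse all sensor inputs
        candidates = propose_routes(world)    # paths toward the next GPS point
        plan = select_legal_best(candidates)  # best route obeying road rules
        execute_first_step(plan)              # short-term: avoid the pothole ahead
        time.sleep(max(0.0, period - (time.monotonic() - t0)))
```

The key property is that the long-term mission never gets planned in detail; only the first step of the current best plan is ever executed before the world is sensed again.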
With new data arriving many times every second, the algorithms could become confused; the vehicle might stop, think for a while, and recover—or not. "I can't go in and help TALOS: it has to figure it out by itself," says CSAIL doctoral student Edwin Olson, a member of the TALOS team. During the Urban Challenge, the MIT entry distinguished itself as the highest finisher among teams that had not competed in the first two challenges. TALOS made further news by being involved in two bot-to-bot collisions, though it was absolved of fault in both cases. Still, "human drivers probably would have avoided those accidents, so there is still work to be done," says Olson.
Teller relied on Lincoln Laboratory to evaluate and calibrate the radars and lidars. Galejs and his associates had to develop a long-range radar sensing capability: identifying radar options; characterizing the radars for accuracy, multi-radar interference, and clutter rejection; and developing software to fuse the radar data with the other sensors and the vehicle's controls. "Our leverage," says Galejs, "was our test facilities and the knowledge of how to test radars." The collaboration with MIT was a success story: TALOS was a truly autonomous vehicle.
The challenge revealed some of TALOS's strengths and weaknesses. On the positive side, TALOS correctly overrode the "do not cross the center line" rule when the alternative—staying in the driving lane—would have meant colliding with a parked car. TALOS paused, waited until it determined that no vehicle was in the oncoming lane, and went around the obstruction, exactly as a human driver would be expected to behave. Two difficulties arose, both involving primarily the camera sensors. At one point in the course, several trees cast shadows across the roadway; the shadows confused TALOS's vision-based lane detection system and made the vehicle stop until it sorted things out. TALOS also struggled to navigate a section of the course consisting of a dirt road with no curb or centerline. "Getting TALOS to drive more quickly and smoothly in perceptually difficult environments is something that the team will continue to work on," says Olson.
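The article does not explain how the lane detector worked, but a toy example shows why shadows trouble any detector that looks for bright paint against darker pavement. Everything here is illustrative; the thresholding scheme is ours, not TALOS's.

```python
import numpy as np

def naive_lane_marking_mask(gray: np.ndarray, k: float = 2.0) -> np.ndarray:
    """Toy detector: flag pixels much brighter than typical pavement.

    On a uniformly lit road, painted markings stand out this way. A tree
    shadow darkens part of the image, so real markings inside the shadow
    drop below the global threshold, while the sunlit/shadow boundary adds
    spurious high-contrast structure; both effects confuse the detector.
    """
    road_level = float(np.median(gray))  # crude estimate of pavement brightness
    return gray > road_level + k * float(gray.std())
```

A detector like this fails on dirt roads for the complementary reason: with no paint at all, nothing exceeds the threshold, and the lane estimate simply disappears.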
People might appreciate some advantages that would come with robotic cars, says Leonard: "You won't need parking lots next to strip malls. After getting out of your car, you just tell it to go park itself."