WHEN UBER’S SELF-DRIVING Volvos hit the streets of Pittsburgh later this summer, each vehicle will be crewed by a couple of Uber employees, just in case the car’s robo-driver goes rogue. But pay no attention to the humans in the front seats.
“The goal is to wean us off of having drivers in the car, so we don’t want the public talking to our safety drivers,” says Uber engineering director Raffi Krikorian. To that end, Uber plans to install a tablet in the backseat of each of its autonomous vehicles. The company’s new human-machine interface will reportedly introduce riders to the autonomous driving experience and explain the technology behind it.
We’ve yet to see that interface for ourselves, but it’s safe to assume the challenges Uber faces in creating it are similar to those faced by Tesla, Mercedes, Ford, and other companies testing the autonomous waters. But one thing Uber won’t have to do is teach its customers to surrender the wheel; as passengers, they’ve done that already. And designers say this single factor could turn Uber’s self-driving fleet into a prolific proving ground for interface design.
Today, Uber finds itself in a position where it makes more sense to develop an interface that behaves like a trusty chauffeur, rather than a co-pilot. It’s a situation that semi-autonomous car manufacturers may not find themselves in for years to come—but Uber has a chance to start experimenting now. “I would have said that would happen over the next couple years,” says Patrick Mankins, a designer at Artefact who specializes in AI-powered systems. “Now, I’m going to say over the next couple months.”
Designing that interface will require a delicate balance. It should present passengers with enough information to put them at ease, but not so much that they feel responsible for the car’s behavior. A town car driver wouldn’t overwhelm you with data and graphics, and Uber’s backseat tablets shouldn’t either.