Charting A Path To Autonomous Driving Affordability
Part 1 — Robots, cars. Cars, robots.
Usually, I’m not quick-witted enough to ask many questions during a media event’s Q&A session, but this time I actually thought of something.
Speaking into the microphone handed to me, I asked, “So, is there any technological cross-pollination happening between you folks (we were visiting Toyota’s Robot Lab near Nagoya) and the car company?”
After a delay for interpretation (“cross-pollination”?), the Toyota rep’s first reaction was – and I’m paraphrasing slightly – “Yes, of course. But first realize that we are also the car company.” This was followed by a polite smile.
Toyota is also the car company. And what it does here is create robotic systems.
In one demonstration at the lab (part of Toyota’s Advanced Technologies tour), we watched a little robot called HSR fetch a container from a shelf full of other boxes (identifying it by reading the 2D bar code) then traverse the floor and set it on a nearby table. In another, a computer-controlled bionic leg brace allowed for more natural walking via a motorized flexing knee.
Robots, cars. Cars, robots.
During the “automotive” portion of the demonstrations, we climbed into a Prius, accelerated to 50 mph, and approached a parked car. Suddenly, a pedestrian dummy emerged from behind it. First, our car’s alarm sounded, then it automatically braked (I intentionally didn’t take action), and at the last second, it autonomously veered away from the frozen humanoid. Its video camera (located behind the rear view mirror) had identified the object (in part by matching it to likely human heights) and directed the car to steer as far away as possible within its lane.
In another demo, a Toyota self-parked into a perpendicular space (tail-in) by first noticing a likely spot as we passed it. Then, when I angled away from the space a bit and stopped (the characteristic behavior of a car about to reverse into a parking space), the rear camera displayed a candidate destination and framed it with a blue border. I tapped “OK” and put it into reverse, and the car self-steered into the spot as I modulated the brake. Worked like a charm. And yes, that demonstration of the HSR robot was fresh in my mind.
But the most dramatic exercise found Toyota taking us into Tokyo traffic in two vehicles driving in tandem. The pair wasn’t deploying just the “usual” radar and optical sensors, though. They were also communicating over a 740-MHz radio frequency to erase even the smallest delay in their coordination (a system Toyota calls Cooperative-Adaptive Cruise Control). As we navigated around a sequence of on- and off-ramps, the cars maintained a precise, synchronized time gap. (The usual radar sensors alerted us to cars cutting in between us.) And relaying these slow-down warnings to upstream traffic not only makes things safer but also radically damps out the accordion effect that triggers so many traffic jams. Toyota is working with other Japanese companies to make this communication frequency an industry standard.
Meanwhile, each car was also displaying its own autonomous capabilities. As the lead car approached a corner, it was comparing its real-time speed to what the GPS path deemed prudent, then smoothly braking to a comfortable cornering-g pace. That’s cool, but I’ve seen it before. What I haven’t seen is how it confirmed this judgment by identifying the curvature of the lane markings ahead (a system called Lane Trace Control). Behind, the following car was steering autonomously by combining its own LTC guidance with a visual lock on the lead car’s stern.
And while automated driving is being pursued elsewhere around the world for the usual reasons of safety and traffic mitigation, Toyota has a subtle, additional motivation: the implications of its home country’s aging population. For the Japanese, systems like these may be essential to extending automotive mobility to a sizable portion of its future society.
During these adventures, I heard a few of my colleagues mention that they’ve seen even more advanced versions of these same technologies elsewhere. I have, too: A while back I tried a pedestrian/lane-change demo put on by supplier Continental in which the car not only swerved into the next lane but even steered back again. When I asked a Toyota engineer about this he nodded knowingly and pointed out that it requires additional (costly) sensors to know the 360-degree state of nearby traffic. All these companies are carefully weighing the value that each sensor brings to the party.
But here’s the overarching, Big Idea we need to understand: Building an automobile with this near-robotic level of sensing, computing, and vehicle-to-vehicle communication that’s also affordable requires simultaneously skimming significant cost from today’s non-robotic cars. Let me repeat that: Cost has to come out of present-day cars to afford this stuff. So in 2015 we’ll begin to see the rollout of what Toyota calls TNGA — Toyota New Global Architecture. It’s akin to the architecture-simplification schemes we’ve seen elsewhere (for instance, from Volkswagen, and recently Nissan/Renault). In Toyota’s case, it means boiling its portfolio of cars down to three principal vehicle groups, each of which largely shares floor pans and underpinnings.
But here’s what’s particularly unusual about TNGA: Within each group, the driver’s H-point (the distance between the car’s floor and the driver’s comfortable hip height) will be fixed. As Mitsuhisa Kato explained in his presentation, designing every car in your line to have a slightly different H-point means developing its crash safety solutions over and over again for every single (expensive) instance. Simplifying like this — as well as adopting an increased percentage of standardized parts like electrical connectors — is the cost-cutting meat cleaver that’ll hack open a path for the autonomous automobile to emerge.
It’s a big deal. Toyota wasn’t blowing smoke with the vehicle-to-vehicle/autonomous driving demonstrations it showed us. It means it. And the automaker is backing that bet by fundamentally changing how it designs and builds cars. I say judge the seriousness of other car companies’ “autonomous driving” dog-and-pony shows by how they plan to cut the manufacturing costs of the cars they currently build.
Robots, cars. Cars, robots.