Inside The City Where Waymo Tests Self-Driving Vehicles

In a suburb of Phoenix, Arizona, there’s
a fleet of 600 minivans shuttling people from place to place. Ordering one feels almost exactly like
calling an Uber, except for one thing: the vans are
driving themselves. From the moment I first got in, it felt like just a normal car. Their vehicles are all
over our roads. I don’t think you could stand at a
street corner or drive a couple of miles without seeing a Waymo vehicle. I know that some of this technology is
scary for many of our citizens, but I think if you see other times in
our economic arc that this has really opened up new worlds for
people and new opportunities. This is my thirty-third year in policing, and when I started in policing we had vehicles, obviously, but there were no computers, no cell phones. Pagers weren’t even in existence yet. So to see this technology in a relatively short 30-year span is just absolutely fascinating. In 2004, the U.S. Department of Defense hosted a 142-mile driverless car obstacle course competition. The farthest any of the entrants got
was a little over seven miles. And off we go. Good try, guys. The next year, the DOD tried again,
this time five vehicles completed the course, but a team from
Stanford did it the fastest. That Stanford team was led by
a computer scientist named Sebastian Thrun. So I didn’t anticipate this to become
a race for speed. It was one of the most thrilling races ever. In 2007, Google hired Thrun
and he created Google X. Two years later, Google X
launched a self-driving car project. In 2016, that project spun off as
its own company, called Waymo, under Google’s parent company, Alphabet. Here’s an example of the
Waymo Chrysler Pacifica minivan. There’s nobody driving. There’s nobody behind the wheel. The company says it’s tested its vehicles
in over 25 cities in six states. But the most miles seem to
have been driven around Phoenix, Arizona. From the police department’s point of view,
our mission is to keep our city safe. So we recognize this technology
as something that could really impact our roadways because the
overwhelming majority of collisions are preventable. You get in the car and you have
a seat and it has a start button. And it’s pretty trippy when you can see
the fact that the car is driving itself. It’s amazing to see how well the
brain processes information, as a driver, to see the car do
the same exact thing. It’s great to be a part of
history, for my kids to experience. My daughter actually liked it
a lot, didn’t she? Right now Waymo is doing two main
things in the Phoenix area: Around a thousand members of the community have
access to its rideshare beta program, Waymo One. Users can summon one
of Waymo’s 600 vehicles 24/7 and ride anywhere within a
limited local region. And these users are actually paying for
the rides, it’s not just a free demo. It also has a partnership with
Lyft and makes ten of its vehicles available to the general
public via Lyft’s app. I probably use Waymo maybe
percent of the time. The biggest limiting factor is that it
only goes in a certain defined area, mostly in Chandler and Tempe and
maybe a little bit of Mesa. If it went all the way downtown, I
would probably take it a whole lot more. The reason Waymo is limited to a small region is that its cars are autonomous, but only in specific locations. Everywhere it can drive has been carefully mapped and analyzed so that even before sensing anything new, the vehicle already has a good sense of where it is.
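One concrete way to picture that “limited local region” is as a geofence: before a ride is accepted, the requested pickup and drop-off points can be checked against a polygon describing the pre-mapped service area. The sketch below is purely illustrative; the coordinates and function names are hypothetical, not Waymo’s actual boundary or code.

    # Illustrative only: a minimal geofence check for a ride-hailing service area.
    # The polygon is a made-up box, not a real service boundary.
    SERVICE_AREA = [            # (latitude, longitude) vertices, in order
        (33.20, -111.95),
        (33.20, -111.75),
        (33.38, -111.75),
        (33.38, -111.95),
    ]

    def in_service_area(lat, lon, polygon=SERVICE_AREA):
        """Ray-casting point-in-polygon test: toggle on every polygon edge the ray crosses."""
        inside = False
        n = len(polygon)
        for i in range(n):
            lat1, lon1 = polygon[i]
            lat2, lon2 = polygon[(i + 1) % n]
            if (lat1 > lat) != (lat2 > lat):   # edge straddles the point's latitude
                crossing = lon1 + (lat - lat1) * (lon2 - lon1) / (lat2 - lat1)
                if lon < crossing:
                    inside = not inside
        return inside

    print(in_service_area(33.30, -111.84))   # inside the made-up box -> True
    print(in_service_area(33.45, -111.84))   # outside it             -> False

A real deployment would rely on far more detailed map data, but the effect is the same: requests outside the mapped region simply are not served.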
The Society of Automotive Engineers came up with a set of standards defining the levels of autonomy a vehicle could be, ranging from zero to five. And right now, Waymo’s vehicles are at a four: capable of full autonomy, but only sometimes.
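For reference, the SAE levels mentioned here (the J3016 standard) are commonly summarized as in the short lookup table below; this is just an editorial paraphrase of the standard definitions, with level 4 tied to a limited operational design domain, which is what “full autonomy, but only sometimes” means in practice.

    # SAE J3016 driving-automation levels, paraphrased for quick reference.
    SAE_LEVELS = {
        0: "No automation: the human does all of the driving.",
        1: "Driver assistance: steering OR speed is assisted (e.g., adaptive cruise).",
        2: "Partial automation: steering AND speed are assisted; the human must supervise.",
        3: "Conditional automation: the system drives, but the human must take over on request.",
        4: "High automation: the system drives itself, but only inside a limited operational design domain.",
        5: "Full automation: the system drives itself anywhere a human driver could.",
    }

    print(f"Level 4: {SAE_LEVELS[4]}")   # where the video places Waymo's Phoenix-area fleet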
Tesla refers to its driver-assist systems as Autopilot, but nobody in the industry thinks that amounts to full self-driving. Waymo and General Motors Cruise Automation are very close to having what they refer to as level 5 cars, most of the time. So right now it’s standard for Waymo
vehicles to have safety drivers behind the wheel at all times, ready to take
over if something were to go wrong. And beyond that, there’s a team of
support staff on call to help riders. The vehicles are constantly maintained
by a team of people. They’re cleaned by a team of people. While the driving itself is done mostly
by a computer, the system is still dependent on human labor. A lot of the business promise and
also the hope for these machines, these autonomous vehicles, is that they eliminate
labor and they eliminate the need for human beings to drive and
to be stuck in jobs like delivering pizzas or picking up the elderly or
the blind from their homes and taking them to services, wherever
it needs to be. When we think through that a little
more carefully, though, some of the chinks in that idea show up. For example, think of something
like Meals on Wheels. The vehicle shows up. It opens the door. There’s the meal. Maybe they can get out to the
curb to get it, maybe not. Even if they could, though, when that human
driver shows up with a Meal on Wheels, they actually come
to the door. Maybe they sit for a little bit. So it’s this human interaction that’s still
very much a part of these transportation functions. I think there’s still a ways to go
before they’re ready for prime time on the roadways. But we want to be helpful
in the testing of it. And then we want to make certain that, whether it’s at the state level or the federal level, all of those regulations and rules are being properly followed. Developing vehicles that adhere to
strict safety protocols, including speed limits, has occasionally been a point
of contention for other human drivers on the road. There’s been some experience where, because our Waymo vehicles actually follow the rules and the law, some people who tend to be in a rush get bothered by that. So there’s a transition
that’s going to happen. But with rideshare companies like Lyft
and Uber struggling to be profitable, for them, leaning into
self-driving cars could make sense. As we saw from the Lyft and Uber IPOs,
there does not appear to be a path to profitability for ride hailing
services with human drivers. Even buses where, you know, operating
the vehicle is very expensive, the vehicle itself, the major portion of
that expense is the driver. The future of autonomous vehicles is more
likely to be in the form of ridesharing fleets that you can borrow when
you need, but no actual car ownership. So I think they see an opportunity
in cars that will be able to transport things, transport people, but not
so much around car ownership. And it’s still a little bit unclear as
to where they see the biggest money coming from, but at least
that’s where it’s evolved to. In March 2018, a woman named Elaine
Herzberg was killed by an Uber self-driving car just 13
miles from Waymo’s office. But it didn’t slow Waymo down. It was business as usual in Chandler. The next day, just as many
vehicles were on the road. It was an unfortunate
incident for another company. But again, Waymo has had an extremely conservative business model and safety protocols, so they have really weathered that storm well. We were very saddened, of course,
by what happened in Arizona. Our hearts go out to the family
and all those impacted by the crash. At Waymo, our focus
has always been safety. In our city, there have been no
collisions where the Waymo has been at fault. So you can take that any way
you want as an indicator, but it’s such a small sample size. Certainly we anticipate the more these
vehicles are out there functioning at the level that they’re expected to
function at, if it takes away that human element, it potentially could have
a very positive impact on the roadways. Some people have asked, you
know, is it actually safe? You know, when you are inside, do you
get nervous or, you know, do you think anything is going to go wrong? And I’d say, well, you know, no,
there’s always a driver, you know, at least while they’re still getting the
technology, you know, hammered out. Waymo is way ahead of everybody
else in terms of the technology. They have these disengagement reports in California, and they disengage a lot fewer times than anybody else.
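The California reports mentioned here are usually boiled down to one figure: autonomous miles driven per disengagement, that is, per takeover by the safety driver. A quick illustration of that arithmetic, using made-up numbers rather than real report data:

    # Illustrative only: how a miles-per-disengagement figure is derived.
    # Both fleets and all numbers below are invented for the example.
    def miles_per_disengagement(autonomous_miles, disengagements):
        """Higher is better: more autonomous miles between safety-driver takeovers."""
        return autonomous_miles / disengagements

    fleet_a = miles_per_disengagement(1_200_000, 110)   # ~10,909 miles per takeover
    fleet_b = miles_per_disengagement(150_000, 600)     # 250 miles per takeover
    print(f"Fleet A: {fleet_a:,.0f} mi/disengagement, Fleet B: {fleet_b:,.0f}")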
You know, I think the robot drivers are probably actually better than human drivers. Arriving shortly at your destination. Please keep your seatbelts fastened until we reach your destination and remember to take all your belongings with you. Proponents of autonomous vehicles make
compelling claims about the potential benefits of
self-driving cars. 94% of all crashes are
due to human error. 42 hours are wasted sitting in traffic
per person per year in the US. That’s an entire working
week every year. And millions more people aren’t able
to drive because they’re elderly or living with a disability. And self-driving cars have the potential to
change all of that for all of us. I think these cars and automobiles
and trucks provide a real opportunity for the state. There’s a lot that can be done
for disabled people, for blind people, for elderly people. So many of the deaths that happen on
our roads are a result of human error and I believe these autonomous
vehicles can provide higher public safety and that really
is the objective. But it’s just not clear these
things would actually happen with more self-driving cars on the road. One of the things I often hear from
people is when an autonomous vehicle is better than the fiftieth percentile driver
on the road, we have an absolute responsibility to let
them onto the road. Others, like Elon Musk, have said
it’s almost irresponsible not to have these vehicles out there because they are
safer and will be safer than human drivers. That’s not been proven. They present a problem, which is people dying on the road or crashing and so forth, and say, well, therefore, you need this solution. But of course, there are
a lot of solutions. And one of the solutions we see
right now are things like autonomous braking, lane keeping assist, all
of these driver-assist systems which take a good driver
and make them better. And so even if we could say that
an autonomous vehicle was better than a human driver, it doesn’t mean that an
autonomous vehicle is better than a human driver plus all the advanced
driver assist systems we have. And if the goal is safety above
all else, there are other less complicated things that could be done. For example, since speeding is known to be one of the top causes of car accidents, members of the European Parliament recently provisionally agreed to require all vehicles sold in Europe to include mandatory speed limiters.
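That EU requirement is generally known as intelligent speed assistance (ISA), which at its simplest caps how fast the vehicle will go relative to the posted limit. The sketch below is a rough illustration of that core idea; the function name and numbers are invented, and real systems read limits from cameras and map data and typically let the driver override.

    # Illustrative only: the core idea behind an intelligent speed assistance limiter.
    def limit_commanded_speed(driver_request_kph, posted_limit_kph, tolerance_kph=0.0):
        """Clamp the driver's requested speed to the posted limit plus any tolerance."""
        return min(driver_request_kph, posted_limit_kph + tolerance_kph)

    print(limit_commanded_speed(95, 80))   # asking for 95 km/h in an 80 zone -> capped at 80
    print(limit_commanded_speed(60, 80))   # already under the limit          -> left at 60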
A lot of the promises about autonomous vehicles are around congestion and particularly safety. They are kind of a silver bullet,
Silicon Valley, a tech-bro solution to the problem of road deaths. There’s a much less exciting solution
to road deaths, particularly in urban areas, and it’s
called Vision Zero. And the premise
is pretty straightforward. It says, let’s start with safety
and then let’s add mobility. The current idea around driving, around
cars, is let’s get as much mobility as we can and then
let’s start to make things safer. Whether or not autonomous vehicles are safer
than human drivers is in a lot of ways beside the point. They’re more lucrative than
selling cars to people. They’re more lucrative than selling
rides driven by human beings. So while there are other,
potentially better solutions, updating infrastructure and making policy changes is
never going to be as interesting to most people as
cars that can drive themselves. With the new technology, there’s going to
be a time period where you have to, you know, give it a
try and work out the bugs. Like if there’s a computer program, I
don’t think I’ve ever seen somebody code something and hit run and
it works perfectly the first time. You have to give us
some real world experience. And so, you know that not everything
is going to work perfectly right off the bat. I used to say a year ago that I
was sitting in a diner and looking out the door at 6 a.m. and I saw in the span
of an hour 12 Waymo vehicles. That was trumped about six weeks ago when
I saw about 30 Waymo vehicles at intersections. And I don’t know if it
was a parade or whatever, but it was, they’re just all
over our streets. And it’s a good relationship.

100 thoughts on “Inside The City Where Waymo Tests Self-Driving Vehicles”

  • Can't wait to see all the police lose their jobs when the automated cars won't be pulled over. Time for the road pirates to get a job that doesn't involve stealing funds from citizens.

  • The only way to secure your self-driven future is to become a hacker. The future of getting carjacked will be far worse than filing a police report.

  • Paweł Buczyński says:

    "Its cars have driven more miles than any other company’s." (Description)
    How did you count that? Tesla cars have driven more miles than all other companies together.

  • Paweł Buczyński says:

    Waymo is at level 4 in a very small limited space.
    Tesla is at level 3 anywhere in the world.
    (Just a quick comparison)

  • Andreas Shiakas says:

    "Level 5 most of the time" is the definition of level 4. So Waymo only works in pre-mapped sunny location? I once hoped they would be the first to give us true full self driving cars, but they are years behind Tesla and on the wrong path.

  • "The major portion of the expense is the driver." AI will take over human jobs. This legitimizes Andrew Yang's platform of Universal Basic Income.

  • Henry Postulart says:

    Got my driver’s license at 16. Drove until I died for an hour of a heart attack at 50. I’d give anything to get back on the road when and where I choose. Even if only as a passenger. Without an annoying, blabbermouth driver.

  • TheNewbiedoodle says:

    As a cybersecurity programmer, my big concern with driverless cars is cyberattacks. Any computer which takes input can be attacked, and cars are especially juicy targets, given how complex their code is and how incredibly damaging they can be. Cars today already struggle with that; some can be completely disabled in the time it takes to drive past someone on the highway. So what I'd really like to see is an emergency switch which, when flicked, does a few things:

    1. Completely disables all autonomous systems, through some kind of physical, hardware disconnect. I'm including modern smart systems, too — things like ABS and the TPMS.
    2. Permanently disables them, at least to the extent that it's immediately obvious if it's been triggered, and maybe requires repair to reenable them. That way, no one can flip them off, do something bad, flip them back on, and claim the car's programming is busted.
    3. Automatically calls emergency services to your location. This would be the only subsystem that remains active.

    That way, if your car suddenly goes haywire, you can flip off all the smart systems and retain control. This would, obviously, be a last resort, but as our cars depend on computers more and more, we need to spend more time securing them, or it's only a matter of time before some evil person kills hundreds of people by remotely taking over their cars and crashing them. This emergency killswitch wouldn't be a perfect solution, obviously, but it's at least not disableable (software can be hacked en masse, but hardware can't) and it gives people the chance to avert disaster.

  • Mitchell Kasdin says:

    This is dumb. How is it Level 4 autonomy when there’s a driver in the front seat? Tesla is basically level 1.5 autonomy; this Waymo vehicle is probably full level 2, or level 4 in a constrained grid area.

  • No self-driving car has yet been demonstrated to be even as good as a bad human driver. Most self-driving cars have indirectly caused accidents through stupid decisions, forcing other drivers to compensate for the self-driving car's stupidity, but you rarely hear this talked about; you only hear about it when it is a direct fault of the self-driving car. For self-driving cars to be considered, I believe they need to be way better than the average driver, because the average driver at least has predictable and normal road behaviors that can be compensated for, where self-driving cars don’t. I believe average is not good enough, because an average driver won’t be driving as far as these self-driving cars; if the self-driving car is average it will get in way more accidents simply because the more it drives, the more the likelihood of an accident goes up. Humans also learn quicker than these cars – so the more a human drives, the better they get, but this is different for self-driving cars, which, if they learn at all, will learn unbelievably slowly with current algorithms. People also have this odd preconceived notion that “because it’s a computer it’s better,” which is not true: a lot of AI algorithms have self-learned biases that are often incorrect, or an extremely shallow logical understanding of the problem; on top of that, there are the humans who may also introduce bias in the code. So these are things we need to be careful with.

  • I stay away from these shuttles because I don't trust self-driven cars. I've seen a self-driven truck pulling a Walmart trailer in Marana, AZ.

  • Stephen Nnodim Jr says:

    These self-driving systems are cool, but the thing to keep in mind is that these things can fail. Software/electronics can be corrupted, hacked or unexpectedly fail. What happens then? I believe it is necessary for an element of 'manual override' to be built into any and all autonomous driving systems in the future to mitigate against injury/loss of life that could result from system failure.

  • 3:54 Tesla's Autopilot was named after the autopilot systems in airplanes, which have existed for about a hundred years and nobody confuses them for something that makes airplanes autonomous. Why would you expect something called "autopilot" to handle speeding up, slowing down, changing lanes, turning, reacting to traffic that's not in the same lane, or choosing a route?

  • Self-driving cars are definitely the future and will make the road much safer than humans behind the wheel. Humans are aggressive animals and they drive recklessly.

  • Guys, I know this isn't related to self-driving cars, but what do you think about the future of aviation? Self-flying aircraft? If self-driving cars are already here, then it's only a matter of time before pilotless aircraft come.

  • Wow! A fleet of 600 self-driving cars. How many people who are driving for Uber or Lyft full time will lose their jobs when these cars are on the main streets of America? People, please search Andrew Yang on the Joe Rogan podcast on YouTube, and find out more!

  • Aditya Jaiswal says:

    Given the number of assholes I deal with who talk on their cellphones while driving, despite knowing very well that it's against the law, I am counting on these self-driving vehicles to end the threat to life for the rest of us from these half-brained idiots.

  • Giesbert Nijhuis says:

    Tesla does not claim to be at level 5 self-driving. Their "auto pilot" (just a name) is at a level I would say just above Waymo, which is at ±level 3. Neither Tesla nor Waymo is at level 4 yet.

  • Who's going to be detained if a driverless vehicle kills someone when its systems fail? Who's responsible? The company or the CEO? 😂

  • CNBC, you know why they are in Arizona, right? WAYMO is based upon LIDAR, which means it can't be used during fog, rain, or snow. Tesla, for example, uses SONAR, which includes radars, and cameras in their system. SONAR emits and receives reflected sound echoes which are aimed at detecting large objects. Google has a newer system still based on LIDAR that can detect weather patterns and weather issues but it's not ready. Arizona, Nevada and a few other states are perfect for LIDAR because they do not have any weather issues, but the majority of the USA has weather issues in certain months of the year. I drive a Tesla, and I have to say Autopilot works as advertised. It sees cars in front of me and to the sides, it understands blind spots, and finally, it shows when you are getting too close to a car or vice versa when a car is getting too close to you. LIDAR is not using sensors, instead, it uses one sensor which is on the roof of the car, instead of multiple sensors all around the car. There is another issue with LIDAR that SONAR does not have, and that is that LIDAR degrades at high sun angles and reflections. Finally, LIDAR has some high operating costs when collecting data.

    Personally, I don't see LIDAR taking off. On the day when the DOD sponsored the program, Elon Musk was not there. I don't know if Tesla would have won at the time; my guess is no. But if you look today, Tesla is always improving their Autopilot and FSD systems, whereas LIDAR isn't improving that much. Secondly, LIDAR is being built into cars like a Pacifica and taking away all the functionality of that car. If Google wants to succeed here, it would be best to develop its own car, and not just use someone else's and claim they won. I strongly suggest everybody look at this video which was posted by Tesla showing their FSD that is coming out soon. https://youtu.be/tlThdr3O5Qo

  • There are a few concerns…

    1. Companies like Lyft, Uber, Waymo… are only looking into self-driving cars for money potential, under the guise of safety.

    2. Have you ever been to court for a speeding ticket? Cities make soooo much money off tickets. If you decrease speeding tickets, cities are gonna find a way to keep the money flowing one way or another. Even if it means making up a law to keep up the profits.

    3. "Unskilled" labor is slowly getting phased out.. from driving to fast food and beyond. What you think gonna happen if all these thousands of people can't find jobs? Crime gonna go up…

    4. If safety were really a priority, there are so many things that could be done to cars today, such as the speed limiter. But much more money is made in chaos than peace.

  • Shouldn’t airlines be worried about flights <3hrs? If I can sleep in the back of a car for a 10hr drive to Chicago or 6 to Boston, I’ll take that over a $300 cramped seat on a plane that gets delayed 3 hours 1/5 of the time, not to mention TSA and baggage fees. Sounds like First Class to me.

  • "Because our waymo vehicles actually follow rules…" I tried to say that with the straightest face possible and just I-

  • Kolajo Adeyinka says:

    It seems great to me, and like the future of our roads, safety, and transportation. I'm looking forward to riding in one of them.

  • I'm pro-progress, but the technology is simply not there yet. And no one clearly knows when the number of mistakes by algorithms will be lower than by humans…

  • Jonathan Duenas says:

    8:05 If I got a dollar every time he said "yoU knOW"… I cringe so bad when people say that when explaining something, the reason you are asked is because we don't know….

  • Yeah, Andrew Yang said this is coming. The age of AI, Robotic, and Automation are upon us. Yang has a comprehensive plan to help everyone to retrain and get through the 4th industrial revolution. Yang2020.com

  • TheTruth Hurts says:

    While I appreciate the upload, it is almost like a promotional video for self-driving vehicles; bear in mind how you keep ignoring Andrew Yang, the only presidential candidate that addresses the impact of automation, and how NBC muted his mic during the first debate. I have to conclude that you don't really work for the people but for your corporate bosses & your advertising customers.

  • Boskoe The Mop God says:

    I really don't see the need for this. Are people so lazy now that they can't drive themselves around? Unbelievable.

  • The standards chart developed by the automotive engineers leaves out the requirement of area. Level 4 driving is only useful if the vehicle can be Level 4 anywhere a car can drive. If limited to a specific area, Level 4 is almost useless.

  • Tesla’s Autopilot is the same as Autopilot on an aircraft. You set parameters such as destination and speed, then monitor the system. An airline pilot does not set the plane on Autopilot and then leave the cockpit. The pilot monitors the system and makes adjustments. Tesla Autopilot is no more synonymous with self driving than Autopilot in an aircraft is self flying. It is the error of thought that equates Autopilot with self driving.

  • Waymo should be used in the city limits (Los Angeles, Chicago, San Francisco, New York, Boston) where speed limits are below 35 mph. It would keep the city cleaner and minimize accidents.

  • Take away our freedom of travel..be taken by a robo car..cameras inside no privacy..when I go somewhere..I dont want to be taken..I want to go there..and maybe stop at a beach or park to eat my drive through fast food..oh..cant do that..have a destination..it goes there..no freedom of choice..not American…

  • The police officer claimed there weren't pagers or computers around at the start of his career 30 years ago. Pagers have been widely used by doctors since the late '50s/'60s, and personal computers first started showing up in the '70s. Did he never walk by a RadioShack, Sears, or local radio shop? Seriously.

  • Corrupt Cop God says:

    Vote Andrew Yang 2020 people if you want to stay alive after the robots take over. Watch his Joe Rogan interview here on YouTube. This is our future now and we need a leader that understands this.

  • Saarthak Khanna says:

    94 percent of car accidents are due to human error, thanks Sherlock. I don't know if this is true, but 99.999999% of cars in the world are driven by humans.
