The dashcam video of the recent Tempe crash, which killed a woman walking her bicycle across the street, has now been released.
To me, it is quite clear that the human driver was dozing off or distracted, and that the vehicle’s sensors failed to register that the pedestrian — walking with a bicycle broadside to the road, a very robust infrared and radar target, and crossing empty lanes before reaching the one with the Uber vehicle — was on a collision course. The vehicle had its low-beam headlights on when high beams would have been appropriate, the headlights were aimed low (probably a fixed setting), and the pedestrian’s white shoes don’t show in the video until two seconds before impact — that is, at a distance of about 120 feet at the reported 40 mph.
Braking distance is about 80 feet at 40 mph, and reaction time for a human driver adds about another 60 feet. An automated system with radar and infrared should have noticed the pedestrian sooner, had a shorter response time, and stopped the vehicle. Human eyesight is much better than a dashcam’s at night, and the human driver might have seen the pedestrian earlier and avoided the crash if she had been paying attention. But also, the bicycle had no lights or side-facing retroreflectors, which might have shown up much earlier and alerted optical or infrared sensors or a human driver, and the pedestrian somehow chose to cross an otherwise empty street at precisely the time that put her on a collision course.
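The distances above are easy to check with the standard reaction-plus-braking formula. This is only a back-of-the-envelope sketch: the 1.0-second reaction time and 0.7 friction coefficient are typical textbook assumptions, not figures from the crash investigation.

```python
# Back-of-the-envelope check of the stopping distances discussed above.
# ASSUMPTIONS (not from the crash report): 1.0 s driver reaction time,
# 0.7 tire-road friction coefficient (dry pavement), level road.

MPH_TO_FPS = 5280 / 3600   # 1 mph = ~1.467 ft/s
G = 32.2                   # gravitational acceleration, ft/s^2

def stopping_distance_ft(speed_mph, reaction_s=1.0, friction=0.7):
    """Return (reaction distance, braking distance) in feet."""
    v = speed_mph * MPH_TO_FPS
    reaction = v * reaction_s              # distance covered before braking starts
    braking = v**2 / (2 * friction * G)    # v^2 / (2 * mu * g)
    return reaction, braking

reaction, braking = stopping_distance_ft(40)
sight = 40 * MPH_TO_FPS * 2.0  # distance covered in the ~2 s the pedestrian is visible

print(f"reaction distance: {reaction:.0f} ft")   # ~59 ft
print(f"braking distance:  {braking:.0f} ft")    # ~76 ft
print(f"total:             {reaction + braking:.0f} ft")
print(f"2-second sight distance: {sight:.0f} ft")
```

The total, roughly 135 feet, exceeds the distance covered in the two seconds the pedestrian is visible in the dashcam video — which is the point: a human relying on that camera’s view could not have stopped, while better sensors with earlier detection could have.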
So, both the human driver and the vehicle’s sensors failed miserably. We can’t allow automated vehicles (or human drivers) to perform at the level shown in this video. We do need to make greater allowances for pedestrians, bicyclists, animals, trash barrels blown out into the road, etc.
Several people have offered insights — see comments on this post, and also an additional post with a description and history of the crash location.
European testing of sensing systems: https://www.youtube.com/watch?v=FTKxCE5qmQM
The car: a Volvo XC90 with the stock pedestrian-detection system disabled and replaced with Uber’s autonomous driving package.
Steve Goodridge commented in a Facebook group:
[T]he pedestrian was exactly where I concluded based on the TV news video and photos of the crash investigation scene. The pedestrian crossed 3.5 lanes at walking speed while the autonomous car approached, with no clutter or occlusion by other vehicles. While the visible light camera recorded a poor quality view of the scene as illuminated by weak headlamps (and human drivers are usually not charged when they overdrive their headlamps), better cameras can see much better than this at night, and both lidar and radar can see fine in complete darkness. Some automatic emergency braking systems on ordinary cars on the road today (in fact, a stock 2017 Volvo S90!) would have come to a near complete stop for this scenario. https://www.euroncap.com/en/results/volvo/s90/26099
And also:
A caveat, though: the Euro NCAP tests of pedestrians crossing from the far side are done in daylight. The German automobile club ran a crossing test in darkness, but only up to about 30 mph. Some cars do much better than others at pedestrian AEB at night, but we don’t have public nighttime test data for the Volvo.
The crash is reported as having occurred where the dashed line for the bike lane starts: https://goo.gl/maps/BceEUYkJ3ak
If the human in the car is supposed to be paying attention so as to override the automation when needed, that obviously was not happening. But it is exactly what I would expect from an HPI perspective.
HPI?
The Uber camera did a very poor job of showing the road ahead. See this: https://arstechnica.com/cars/2018/03/police-chief-said-uber-victim-came-from-the-shadows-dont-believe-it/