{"id":15000,"date":"2022-02-09T12:05:25","date_gmt":"2022-02-09T17:05:25","guid":{"rendered":"https:\/\/autosector.com\/?p=15000"},"modified":"2022-02-09T12:05:25","modified_gmt":"2022-02-09T17:05:25","slug":"tesla-self-driving-crash-infrastructure","status":"publish","type":"post","link":"http:\/\/autosector.com\/?p=15000","title":{"rendered":"2019 Gardena crash was a perfect storm for Tesla&#8217;s imperfect Autopilot"},"content":{"rendered":"<p>Another year has passed, and <a class=\"injectedLinkmain\" href=\"https:\/\/www.autoblog.com\/tesla\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:0;\">Tesla<\/a>\u2019s Full Self-Driving system is still a really cool demonstration suite that creates more work than it replaces. Its predecessor, the highway-only Autopilot, is also back in the news for being associated with <a href=\"https:\/\/www.autoblog.com\/2022\/01\/18\/felony-charges-tesla-autopilot-fatal-crash\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:1;\">a fatal crash that occurred in Gardena, California, in 2019<\/a>.<\/p>\n<p>While no comprehensive review of the accident appears to be currently available to the public, it\u2019s pretty easy to dissect <a href=\"https:\/\/ktla.com\/news\/local-news\/tesla-driver-suspected-of-running-red-light-in-gardena-crash-that-killed-2-lapd\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:2;\">what happened<\/a> from <a href=\"https:\/\/losangeles.cbslocal.com\/2019\/12\/29\/tesla-slams-into-honda-civic-killing-2-people-in-gardena\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:3;\">news reports<\/a>. Kevin George Aziz Riad was westbound in a <a class=\"injectedLinkmain\" href=\"https:\/\/www.autoblog.com\/tesla\/model+s\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:4;\">Tesla Model S<\/a> on the far western end of CA-91 (the Gardena Freeway) with Autopilot engaged just prior to the incident. 
While reports that Riad \u201cleft\u201d the freeway could be misconstrued to imply he somehow lost control and drove off the side of the road, that\u2019s not what happened; the freeway merely ended.<\/p>\n<p>Like many of America\u2019s incomplete urban freeways, the 91 doesn\u2019t merely dead-end at the terminus of its route. Instead, it becomes a semi-divided surface street named Artesia Boulevard. Here\u2019s what that looks like on Google Maps:<\/p>\n<p><img class=\"grp-full lazy\" alt=\"\" data-original=\"https:\/\/o.aolcdn.com\/images\/dims3\/GLOB\/legacy_thumbnail\/1600x900\/format\/jpg\/quality\/85\/https:\/\/s.aolcdn.com\/os\/ab\/_cms\/2022\/02\/09090809\/gardena-crash.jpg\"\/><\/p>\n<p>What precisely took place inside Riad\u2019s Tesla as it traversed those final few hundred feet of freeway may forever remain a mystery, but what occurred next is well-documented. Riad allegedly ran the red light at the first north-south cross street (Vermont Avenue), striking a <a class=\"injectedLinkmain\" href=\"https:\/\/www.autoblog.com\/honda\/civic\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:5;\">Honda Civic<\/a> with Gilberto Alcazar Lopez and Maria Guadalupe Nieves-Lopez onboard. Both died at the scene. Riad and a passenger in the Tesla were hospitalized with non-life-threatening injuries.<\/p>\n<p>When the 91-Artesia crash happened, Autopilot wasn\u2019t a primary component of the story. That came more recently, when authorities announced that <a href=\"https:\/\/www.autoblog.com\/2022\/01\/18\/felony-charges-tesla-autopilot-fatal-crash\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:6;\">Riad would face two charges of vehicular manslaughter<\/a> \u2014 the first felony charges filed against a private owner who crashed while utilizing an automated driving system. But is Autopilot really the problem in this case?<\/p>\n<p>Autopilot is a freeway driving assist system. 
While it can follow lanes, monitor and adjust speed, merge, overtake slower traffic and even exit the freeway, it is not a full self-driving suite. It was not meant to detect, understand or stop for red lights (though that functionality <a href=\"https:\/\/www.zdnet.com\/article\/teslas-big-new-feature-autopilot-now-halts-cars-at-red-lights-and-stop-signs\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:7;\">did appear after the 91-Artesia crash<\/a>). Freeways don\u2019t have those. So, if Autopilot was enabled when the accident happened, that means it was operating outside of its prescribed use case. Negotiating the hazards of surface streets is a task for Tesla\u2019s Full Self-Driving software \u2014 or at least it will be when it <a href=\"https:\/\/www.youtube.com\/watch?v=Zl9rM8D3k34\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:8;\">stops running into things<\/a>, which Tesla CEO <a class=\"injectedLinkmain\" href=\"https:\/\/www.autoblog.com\/tag\/elon+musk\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:9;\">Elon Musk<\/a> has predicted will happen each year for the past several.<\/p>\n<p>In the meantime, situations like this highlight just how large the gap is between what we intuitively expect from self-driving cars and what the technology is currently capable of delivering. Until Riad &#8220;left&#8221; the freeway, letting Autopilot do the busy work was perfectly reasonable. West of the 110, it became a tragic error, both life-ending and life-altering. And preventable, with just a little attention and human intervention \u2014 the two very things self-driving software seeks to make redundant.<\/p>\n<p>I wasn\u2019t in that Tesla back in 2019. I\u2019m not sure why Riad failed to act to prevent the collision. 
But I do know that <a href=\"https:\/\/www.npr.org\/2021\/12\/22\/1064598337\/cars-are-getting-better-at-driving-themselves-but-you-still-cant-sit-back-and-na\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:10;\">semi-self-driving suites require a level of attention that is equivalent to what it takes to actually drive a car<\/a>. Human judgment \u2014 the flawed, supposedly unreliable process that safety software proposes to eliminate \u2014 is even more critical when using these suites than it was before, especially given the state of U.S. infrastructure.<\/p>\n<p>This incident makes a compelling argument that <a class=\"injectedLinkmain\" href=\"https:\/\/www.autoblog.com\/category\/autonomous\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:11;\">autonomous vehicles<\/a> won\u2019t just struggle with poorly painted lane markings and missing reflectors. The very designs of many of our road systems are inherently challenging both to machine <em>and<\/em> human intelligences. And I use the term \u201cdesign\u201d loosely and with all respect in the world for the civil engineers who make do with what they\u2019re handed. Like it or not, infrastructure tomfoolery the likes of the 91\/110\/Artesia Blvd interchange is not uncommon in America\u2019s highway system. Don&#8217;t believe me? 
Here are four more examples of this sort of silliness, just off the top of my head:<\/p>\n<p><img class=\"grp-half lazy\" alt=\"\" data-original=\"https:\/\/o.aolcdn.com\/images\/dims3\/GLOB\/legacy_thumbnail\/800x450\/format\/jpg\/quality\/85\/https:\/\/s.aolcdn.com\/os\/ab\/_cms\/2022\/02\/09094931\/southfield.jpg\"\/><img class=\"grp-half lazy\" alt=\"\" data-original=\"https:\/\/o.aolcdn.com\/images\/dims3\/GLOB\/legacy_thumbnail\/800x450\/format\/jpg\/quality\/85\/https:\/\/s.aolcdn.com\/os\/ab\/_cms\/2022\/02\/09094928\/cleve.jpg\"\/><img class=\"grp-half lazy\" alt=\"\" data-original=\"https:\/\/o.aolcdn.com\/images\/dims3\/GLOB\/legacy_thumbnail\/800x450\/format\/jpg\/quality\/85\/https:\/\/s.aolcdn.com\/os\/ab\/_cms\/2022\/02\/09094927\/baltimore.jpg\"\/><img class=\"grp-half lazy\" alt=\"\" data-original=\"https:\/\/o.aolcdn.com\/images\/dims3\/GLOB\/legacy_thumbnail\/800x450\/format\/jpg\/quality\/85\/https:\/\/s.aolcdn.com\/os\/ab\/_cms\/2022\/02\/09094929\/new-orleans.jpg\"\/><\/p>\n<p>If you\u2019ve ever driven on a major urban or suburban highway in the United States, you\u2019re probably familiar with at least one similarly abrupt freeway dead-end. More prominent signage warning of such a major traffic change would benefit drivers, human and artificial alike (provided the AI knows what they mean). And many of those freeway projects never had any business being authorized in the first place. But those are both (lengthy, divisive) arguments for another time. So too is any real notion of culpability from infrastructure or self-driving systems. Even with the human \u201cdriver\u201d removed completely from the equation in this instance, neither infrastructure nor autonomy would be to blame, in the strictest sense. They\u2019re merely two things at odds \u2014 and perhaps ultimately incompatible \u2014 with each other.\u00a0<\/p>\n<p>In the current climate, Riad will be treated like any other driver would (and should) under the circumstances. 
The degree of responsibility will be determined along the same lines it always is. Was the driver distracted? Intoxicated? Tired? Reckless? Whether we can trust technology to do the job for us is not on trial here. Not yet. There are many reasons why automated driving systems are controversial, but the element of autonomy that interests me the most, personally, is liability. If you\u2019re a driver in America, the buck generally stops with you. Yes, there can be extenuating circumstances, defects or other contributing factors in a crash, but liability for a crash almost always falls at the feet of the at-fault driver \u2014 a definition challenged by <a class=\"injectedLinkmain\" href=\"https:\/\/www.autoblog.com\/category\/autonomous\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:12;\">autonomous cars<\/a>.<\/p>\n<p>I previously joked that we might eventually <a href=\"https:\/\/www.autoblog.com\/2021\/12\/08\/driver-assist-tech-boredom-danger\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:13;\">bore ourselves to death<\/a> behind the wheel of self-driving cars, but in the meantime, we\u2019re going to be strung out on <a class=\"injectedLinkmain\" href=\"https:\/\/www.autoblog.com\/tag\/red+bull\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:14;\">Red Bull<\/a> and 5-Hour Energy just trying to keep an eye on our electronic nannies. It used to be that we only had ourselves to second-guess. Now we have to manage an artificial intelligence which, despite being superior to a human brain in certain respects, fails at some of the most basic tasks. Tesla\u2019s FSD can barely pass a basic driver\u2019s ed driving test, and <a href=\"https:\/\/insideevs.com\/news\/549118\/tesla-fsd-beta-driving-test\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:15;\">even that requires human intervention<\/a>. 
Tesla even <a href=\"https:\/\/www.autoblog.com\/2022\/02\/01\/tesla-full-self-driving-recall-runs-stop-signs\/\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:16;\">taught it to take shortcuts<\/a>\u00a0like a lazy human. And now there\u2019s statistical evidence that machines <a href=\"https:\/\/engrxiv.org\/preprint\/view\/1973\/3986\" data-ylk=\"elm:context_link;itc:0;pos:1;sec:donut-hole;cpos:17;\">aren\u2019t any safer after all<\/a>.<\/p>\n<p>So, is the machine better at driving, or are we? Even if the answer were &#8220;the machine&#8221; (and it&#8217;s not), no automaker is offering to cover your legal fees if your semi-autonomous vehicle kills somebody. Consider any self-driving tech to be experimental at best and treat it accordingly. The buck still stops with you.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Another year has passed, and Tesla\u2019s Full Self-Driving system is still a really cool demonstration suite that creates more work than it replaces. Its predecessor, the highway-only Autopilot, is also back in the news for being associated with a fatal crash that occurred in Gardena, California, in 2019. 
While no comprehensive review of the accident [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":15001,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"site-sidebar-layout":"default","site-content-layout":"","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"categories":[2],"tags":[],"class_list":["post-15000","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-industry"],"_links":{"self":[{"href":"http:\/\/autosector.com\/index.php?rest_route=\/wp\/v2\/posts\/15000","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/autosector.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/autosector.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/autosector.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/autosector.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=15000"}],"version-history":[{"count":0,"href":"http:\/\/autosector.com\/index.php?rest_route=\/wp\/v2\/posts\/15000\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"http:\/\/autosector.com\/index.php?rest_route=\/wp\/v2\/media\/15001"}],"wp:attachment":[{"href":"http:\/\/autosector.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=15000"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/autosector.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=15000"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/autosector.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=15000"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}