title: "Self-driving cars: software crashes" date: 2018-02-25 23:28 description: > “It just works automatically!”
tags: software, Free Software, technology, trust, law, politics, capitalism
links:
- url: https://www.youtube.com/watch?v=nFZGpES-St8
title: "Karen Sandler's talk about having a pacemaker-defibrillator that runs proprietary software (video)" description: "It's literally screwed into her heart, but she can't legally fix it, or even see how it works" - url: http://fortune.com/2016/04/12/self-driving-cars-safety-study/
title: "It's Impossible to Find Out If Self-Driving Cars Are Safe: Report" description: "“Even if autonomous vehicle fleets are driven 10 million miles, one still would not be able to draw statistical conclusions about safety and reliability.” — if only you could inspect the instructions…" - url: https://opensource.org/osd-annotated
title: "The Open Source Definition" rel: related type: text/html - url: https://reproducible-builds.org/
title: "Reproducible builds" rel: related type: text/html - url: https://www.washingtonpost.com/news/energy-environment/wp/2015/09/18/epa-volkswagen-used-defeat-device-to-circumvent-air-pollution-controls/
title: "Volkswagen used ‘defeat device’ to illegally skirt air-pollution controls" description: "“The software, which the EPA called a ‘defeat device,’ would turn on full emissions controls during testing and switch them off again under normal driving conditions.”" rel: related type: text/html
No car is self-driving.
A “self-driving” car is piloted by software, which is ultimately written by a person. You don't know who that person was; only that they were employed by a particular company.
They were probably sitting in an office somewhere in California when they wrote the code driving your car. Maybe it was 17:30 on a Friday and, despite caring sincerely about the work they were doing, they happened to be distracted by the prospect of going home. Maybe not. You don't know.
Do you trust that person with your life?
Well, the company hired them, so they can't be completely useless. You trust the company's recruitment procedures. …What are the company's recruitment procedures?
Anyway, presumably there are processes in place to review the code, and stop mistakes from making it into the final software. Presumably. You trust that there are, and that they work, and never fail.
Now imagine the company has made it illegal for you to see how the software works. Are you sure you trust this company with your life?
There should be a law saying that if a vehicle can be piloted by software, and it's capable of containing or hurting a human, then all installed software must be open source, and you must be able to prove that the source code corresponds to the software running in the car.
It has to be legally possible for the vehicle's owner (or prospective owner) to discover how their car might behave in a life-or-death situation, so they can decide whether they want to be responsible for the car's actions.
## Responsibility
Logically, the manufacturer who wrote the software would be responsible, but they have no incentive to take responsibility for their cars' imperfections. Doesn't make money. Why admit your own flaws while your competitors keep schtum, look better, and rake it in? Any goodwill from better transparency will evaporate as soon as someone dies in an accident.
It's much safer to claim that the human pilot should have taken control at the critical moment. Capitalist governments won't argue with rich, profitably-taxable businesses.
Car makers will only be transparent about how their cars behave if they're obliged to by law.
## Open source
Merely having access to the software's source code isn't enough. It must be legal to reuse the source code, for several reasons.
Morally, if Non-Specific Engines Ltd writes an algorithm that's better at saving lives than any other algorithm, shouldn't Acme Motors be obliged to use the safer algorithm in their cars, rather than forbidden?
Practically, you need software experts to audit the code. You want the code checked by an independent expert in the field of vehicle automation — not a business partner of the manufacturer — and that person will be a software developer.
If they use a similar concept in their own work later, Mom's Friendly Car Company could threaten to sue them, claiming they copied the code illegally. Software developers are rarely as rich as car companies; even the threat of a lawsuit would mean that in practice the code would go unchecked.
And again, morally, you can save lives here, by letting the developer reuse the good code.
## Reproducible builds
Lastly, it needs to be possible to prove that the audited code is actually the code running in the car. You want an independent auditor to build the software for themselves, in a development environment they trust, and get the exact same output as what's in the car. It must be possible to build the software reproducibly.
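In concrete terms, the final step of that check is just a byte-for-byte comparison of two artefacts. Here's a minimal Python sketch of it — the file names are hypothetical, and it assumes the auditor has already built one image from the published source and extracted the other from the car:

```python
import hashlib

def sha256_of(path):
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical file names: one image built by the auditor from the
# published source code, one read out of the car's own storage.
built = sha256_of("firmware-built-from-source.bin")
installed = sha256_of("firmware-extracted-from-car.bin")

print("audited code IS what's running in the car" if built == installed
      else "audited code is NOT what's running in the car")
```

If the hashes match, auditing the source really did tell you how the car behaves; if they don't, the audit proved nothing.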
Otherwise checking the code is pointless — you still have to trust the car manufacturer, and you can't be sure the software's behaviour doesn't [deviate in subtle ways in very specific situations]. Maybe you don't care about any subtle differences, but maybe you do. The driver should at least be honest with you, so you can decide for yourself.
[deviate in subtle ways in very specific situations]: https://www.washingtonpost.com/news/energy-environment/wp/2015/09/18/epa-volkswagen-used-defeat-device-to-circumvent-air-pollution-controls/
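To make "subtle deviation" concrete, here's a deliberately simplified toy sketch of a defeat device. The detection logic is invented for illustration, but the shape mirrors the behaviour described in the Volkswagen reporting: full controls under test conditions, reduced controls the rest of the time.

```python
def emissions_controls(steering_angle_deg, wheel_speed_kmh):
    """Toy 'defeat device'; the detection logic here is invented.

    On a dynamometer the wheels spin while the steering wheel stays
    centred, which is a crude signature of an emissions test.
    """
    under_test = wheel_speed_kmh > 0 and steering_angle_deg == 0
    return "full" if under_test else "reduced"

print(emissions_controls(steering_angle_deg=0, wheel_speed_kmh=50))  # "full" on the test rig
print(emissions_controls(steering_angle_deg=4, wheel_speed_kmh=50))  # "reduced" on the road
```

In open, reproducibly built source, a branch like this is there for anyone to find. In a closed binary, it can sit undetected for years.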
None of this will make sure a self-driving car is perfectly safe. All software has bugs. But at least you'll know the driver was acting in good faith.
Trade secrets and competitive advantage are not worth dying for.
…Or you could just trust the big friendly company… right?