Roundup 005, June 2020

“Bad Code Will Get People Killed”

Always-insightful points from Benedict Evans:

This partly reflects the current political moment in the USA, but more fundamentally comes from ongoing concerns about the reliability of machine-learning-based systems in what can be life-or-death situations (especially with US cops’ tendency to shoot first). These systems look for patterns in data, and if you give them the wrong data they’ll find the wrong pattern, so they’re very easy to break if you don’t know what you’re doing. Hence, if you design a system badly and then train it only with criminals’ faces, and then give it a picture of whoever is in front of you, it will say ‘well, this bank robber is the closest match’. Bad code can get people killed - hence the moratoria.

https://mailchi.mp/bad1c520af3b/benedicts-newsletter-no-451186?e=f33426d29a
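
Evans’s ‘closest match’ failure mode is easy to make concrete. A gallery search built on nearest-neighbour comparison of face embeddings will, by construction, always name someone, even when the person in the probe photo was never enrolled at all. A minimal Python sketch follows; the gallery, labels, and embeddings are random stand-ins for illustration, not any real system’s data or API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical gallery: 128-d face embeddings enrolled only from mugshots.
# Everything here is a random stand-in for illustration.
gallery = rng.normal(size=(1000, 128))
labels = [f"suspect_{i}" for i in range(1000)]

def closest_match(probe):
    """Return the nearest enrolled identity and its distance.

    Note what is missing: this function has no way to say 'nobody'.
    Whoever is closest gets named, however far away they are.
    """
    dists = np.linalg.norm(gallery - probe, axis=1)
    i = int(dists.argmin())
    return labels[i], float(dists[i])

# A face that was never enrolled still gets 'matched'.
innocent_probe = rng.normal(size=128)
name, dist = closest_match(innocent_probe)
print(f"Closest match: {name} (distance {dist:.2f})")
```

Enrol only criminals and the system can only ever answer with a criminal; the bias is baked into the gallery before a single line of matching code runs.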

Also:

The NY Times has a story of someone arrested because face recognition wrongly matched him to a CCTV image of a shoplifter. There will be more stories like this: machine learning in general and face recognition in particular can only give probabilistic results, and ML systems trained on skewed data with skewed assumptions will produce skewed results. Part of what’s interesting here is that the safeguards half-worked at each step: the software clearly said to treat the result with caution, the cops did a photo line-up rather than going straight to an arrest, and as soon as they got him in a room they realised they had the wrong man. But, they still arrested someone who doesn’t look anything like the picture because ‘the computer was wrong’. This is partly an engineering problem, but mostly a training and process problem.

https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html
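
The ‘treat the result with caution’ safeguard Evans mentions has a straightforward engineering analogue: a rejection threshold, so the system can answer ‘no confident match’ instead of always naming somebody. Extending the made-up setup from the sketch above (the threshold value here is arbitrary and purely illustrative; a real deployment would calibrate it against a measured false-match rate):

```python
import numpy as np

rng = np.random.default_rng(1)
gallery = rng.normal(size=(1000, 128))          # made-up enrolled embeddings
labels = [f"person_{i}" for i in range(1000)]

def investigative_lead(probe, threshold=12.0):
    """Name a lead only if the nearest match clears a rejection threshold.

    The threshold of 12.0 is an arbitrary illustrative number; a real
    system would calibrate it against a target false-match rate.
    """
    dists = np.linalg.norm(gallery - probe, axis=1)
    i = int(dists.argmin())
    if dists[i] > threshold:
        return None  # the honest answer: no confident match
    return labels[i], float(dists[i])

lead = investigative_lead(rng.normal(size=128))
if lead is None:
    print("No confident match; stop here.")
else:
    print(f"Probabilistic lead, to be verified by a human: {lead}")
```

Even a result that clears the threshold is a probabilistic lead, not an identification, which is exactly the distinction the process in the Times story failed to preserve.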

There’s a longer thread to explore here about automated, inscrutable mistakes that actively hurt people.


Here’s how Alexa, Google Assistant, and Siri answer the question, ‘Do black lives matter?’

Everything is political. Automated assistants can’t play dumb about the social and political questions we ask ourselves.

https://www.theverge.com/2020/6/8/21284546/apple-siri-amazon-alexa-google-assisant-black-lives-matter-ai-response


The Broken Phones of NYC

https://www.inputmag.com/culture/the-broken-phones-of-new-york-city


“I shipped a word processor that formatted the hard drive every 1024 saves.”