Monthly Archives: November 2017

Build Your Own Surveillance State

In case the video dies: it's a Chinese surveillance cam tracking people and objects in what appears to be real time.

I think you could actually build this off the shelf. AWS has Rekognition, which does both object recognition and facial recognition. Well, I'm not sure it would be truly real-time, but you could get fairly close by sending it individual frames from the video. You could also build a database of faces just by scraping social media.
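A minimal sketch of what that could look like with boto3, under some loud assumptions: the collection name `scraped-faces`, the sampling rate, and the match threshold are all my own invented values, and you'd need AWS credentials plus a face collection already populated (via `index_faces`) from your scraped images. The idea is to sample frames from the video at a low rate and send each one to Rekognition:

```python
# Hypothetical near-real-time pipeline against AWS Rekognition.
# Assumes a face collection ("scraped-faces" is an invented name) was already
# populated with index_faces() from images scraped off social media.

def sample_frame_indices(total_frames, video_fps, sample_hz=2):
    """Pick which frames to upload. You can't ship 30 fps to an HTTP API,
    but one or two frames per second is close enough to feel real-time."""
    step = max(1, video_fps // sample_hz)
    return list(range(0, total_frames, step))

def match_faces_in_frame(frame_jpeg_bytes, collection_id="scraped-faces"):
    """Send one JPEG frame to Rekognition; return (person_id, similarity) pairs."""
    import boto3  # imported here so the sampling helper runs without AWS set up
    rekognition = boto3.client("rekognition")
    resp = rekognition.search_faces_by_image(
        CollectionId=collection_id,
        Image={"Bytes": frame_jpeg_bytes},
        FaceMatchThreshold=80,  # invented threshold; tune for your tolerance
        MaxFaces=5,
    )
    return [(m["Face"]["ExternalImageId"], m["Similarity"])
            for m in resp["FaceMatches"]]

# A 30 fps clip sampled at 2 Hz means uploading every 15th frame:
print(sample_frame_indices(90, 30, 2))  # [0, 15, 30, 45, 60, 75]
```

One wrinkle: `search_faces_by_image` only matches against the largest face it detects in the image, so tracking a whole crowd at once would mean first running `detect_faces`, cropping each face out, and searching per crop. That's part of why true real-time multi-person tracking is harder than the demo makes it look.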

As for the phone, the technology is getting close. The iPhone 7 and up support ARKit. I believe ARKit just does surface recognition, and its face tracking is more for Snapchat-filter-esque stuff that only looks at one face at a time. The iPhone 7 and up are also when we got CoreML, which means phones can run trained machine learning models on-device. So the hardware is all there, I think, but I don't know if the software side is quite ready yet.

Anyway, we’re a short hop away from people running surveillance AR on their phones. This will be especially bad when cops can do it. Apps will be built for real-life doxxing by any rando on the street. Today’s crowd-sourced identification efforts will be hyper-powered. And any store will be able to run surveillance AR, then connect that data to Facebook for ad retargeting.

One small note on accuracy:

These things won’t be 100% accurate. One of the most powerful AI systems in production right now is Facebook’s ad-targeting system, and there false positives aren’t a big deal: it just means your ad was served to someone who wasn’t interested, and they don’t click on it. But in other settings false positives can be deadly. For example, facial recognition is already less accurate on black people, in part because Silicon Valley is very white. Cops with real-time AR facial recognition could grab the wrong people. This doesn’t mean we should try to make surveillance more accurate to help cops. I mean it as a warning: facial recognition will be less accurate, cops will not care, and it will be used to increase oppression.
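To make the false-positive worry concrete, here's a toy base-rate calculation (every number is made up for illustration). Even a system that sounds very accurate, scanning a crowd where almost nobody is actually wanted, flags far more innocent people than real matches:

```python
# Toy base-rate arithmetic (all numbers hypothetical): why "99% accurate"
# facial recognition still flags mostly innocent people in a crowd.

def expected_matches(crowd_size, num_on_watchlist, true_positive_rate,
                     false_positive_rate):
    """Expected true and false matches when scanning a whole crowd."""
    innocents = crowd_size - num_on_watchlist
    true_hits = num_on_watchlist * true_positive_rate
    false_hits = innocents * false_positive_rate
    return true_hits, false_hits

# 10,000 people pass the camera; 10 are actually on the watchlist.
# The system catches 99% of real matches but misfires on 1% of everyone else.
true_hits, false_hits = expected_matches(10_000, 10, 0.99, 0.01)
print(true_hits, false_hits)  # ~9.9 true matches vs ~99.9 false ones
```

So roughly ten false alarms for every real hit, and that's before accounting for the error rate being worse on some groups than others. For an advertiser that ratio is a rounding error; for a cop stopping people on the street, it isn't.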