Pretty much immediately after the 2016 election, I blocked Facebook. I didn't delete my account. I disabled the newsfeed with a browser plugin, making the platform an almost write-only medium. Something just felt wrong about it. Like it was too needy, as it slid into and then well beyond tabloid territory. Too engaging, but in a draining, physically sickening way, the kind that ends with falling asleep to the pale blue glow. It felt strained. Frenzied. Fried. I half-joked that there was a dangerous adversarial AI on the back end and it didn't seem like a good idea to connect to it.
I hadn't been able to bring myself to really connect to the election itself either. I felt bad about it, like I should have been engaged and working on it somehow, but it seemed so weird and extra meaningless. How could this nutty thing actually be important? I knew it was wrong, but I did it anyway (partly because we were fighting to keep our cooperative homes and that was tiring in its own way). I turned off the US election because it felt like a reality show. And I don't watch reality shows.
Now we know that there really was an adversarial AI on the other side. It was shepherded or cultivated or mined* by some humans, who were building a giant "vote crazy" knob and cranking it up to 11.
The AI on the other side is still evolving. What's it learning now? Can it study quietly at its desk without disturbing others? What is it watching and listening to? What does it see and hear? What's it reading these days? What is it preparing for? Is it a seasonal elections worker? Does it get tired? Do we get tired of it?
I've got a 50GB archive of photos that were posted on a small (in retrospect) Facebook-like site I helped to host between 2000 and 2009. I've been lugging them around for almost a decade, meaning to re-marry them to their metadata. They've been in long-term movage. Having just migrated to a new laptop, I suddenly had double the available space: 2TB. My first computer had 20MB of disk. That's a factor of 100,000. 50GB has gotten a lot smaller over the years. All those photos would fit on the memory card in my current camera. The files from back then are smaller, but the files and disks from right now are larger, so there's a kind of foreshortening. Old data apparently shrinks faster than disks grow. So I decided to move everything onto my laptop, and suddenly it seemed easy to work with.
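For the curious, the arithmetic above checks out. A quick sketch (assuming decimal units, i.e. 1TB = 10^12 bytes; the exact sizes are of course approximate):

```python
# Rough storage arithmetic: old disk vs. new laptop vs. the photo archive.
MB = 10**6
GB = 10**9
TB = 10**12

first_disk = 20 * MB   # my first computer's hard disk
new_laptop = 2 * TB    # current laptop's storage
archive = 50 * GB      # the photo archive

print(new_laptop // first_disk)  # → 100000 (the factor of 100,000)
print(archive / new_laptop)      # → 0.025 (the archive fills 2.5% of the laptop)
```

Two and a half percent of one disk: that's how much smaller 50GB has gotten.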
Seeing this flood of pictures and captions going back as far as the year 2000 was like wallowing in zombie memories. My software started identifying faces. Naming names. Transforming GPS coordinates into addresses and vice versa. Is it actually my software? I have the last version with a perpetual license.
This all felt a little weird, because the memories weren't all mine. They'd all been public at some point, but when a site goes down and never comes back up, what happens to the information? It does not necessarily disappear. It felt intimate. It felt awkward. That person is dead. Those two have a kid. I remember those pants. That beach. This night.
How is this process fundamentally different from a server farm full of GPUs running machine learning algorithms?
Browsing through 47,387 images from 36 friends is relatable — like finding a diary or a box of photos — we have a sense of how to feel about it. I think maybe I've hesitated to work with this little archive, because I wasn't quite sure if I had permission. But I also didn't delete it, because I wasn't quite sure if I had permission. I don't think we know intuitively how to feel about the algorithmic digestion of 2,000,000,000 people's data. That's a hundred million times more data than my digital shoebox.
Were we really okay with all this when it was just extracting money? Why are we more surprised and horrified when the same machine extracts power instead? How are they different? How different are they?
The publicly available tools for curating our own attention and memories are weak. We're mostly blind and dumb. Opting out is different from fighting back. The platform monopolies don't want us to use these tools in our own ways, for our own purposes. To shepherd or cultivate our own relationships. There are a few options, like the Media Lab's Gobo, but if it really took off, would it still be allowed to exist? What terms and conditions would it be subject to? If it were allowed to take off, would we get an arms race, as better fake news is developed, and countermeasures are deployed to detect it, and it evolves to get around them? There are federated open social networks like social.coop, but they're anemic without monopoly network effects.
Can we create media platforms that are more friendly to truth and reconciliation? Or do we have to make ever more virulent truths for the captive platforms that we're trapped on? How do we survive if some vital truths just aren't that catchy?
*Do you shepherd (animal) or cultivate (plant) an intelligence? Do you mine it? Is it animate, or inanimate? Is it more like sheep or wheat or ore? Maybe it'll get its own domestication verb. What's the word for when it goes feral?