Tensions and frictions in researching activists’ digital security and privacy practices

Presentations at HotPETS are not as technical as those at PETS proper; they are chosen for relevance, novelty, and potential to generate productive discussion. The associated papers are very short, and both talk and paper are meant only as a starting point for a conversation. Several of this year's presentations were from people and organizations concerned with making effective use of the technology we already have, rather than building new stuff, and with highlighting the places where actual problems are not being solved.

The Tactical Technology Collective (tacticaltech.org, exposingtheinvisible.org; if you don't have time to watch a bunch of videos, I strongly recommend at least reading the transcribed interview with Jesus Robles Maloof) is a practitioner NGO working on digital security and privacy, with deep connections to activist communities around the world. In practice, this means they spend a lot of time going places and talking to people—often people at significant risk because of their political activities—about their own methods, the risks they perceive, and how existing technology does or doesn't help them.

Their presentation was primarily about the tension between research and support goals, from the perspective of an organization that wants to do both. Concretely: if you help activists do their thing by training them in more effective use of technology, you are changing the very behavior you set out to study. (It was not mentioned, and they might not know it by this name, but I suspect they would recognize the Hawthorne effect, in which merely being the subject of research changes people's behavior.) They also discussed the potential for harm to activists from any sort of outside intervention, whether that is research, training, or just contact, and the need for cultural sensitivity in working out the best way to proceed. (I can't find the talk now—perhaps it was at the rump session—but at last year's PETS someone mentioned a case where democracy activists in a Southeast Asian country specifically did not want to be anonymous, because it would be harder for the government to make them disappear if they were known by name to the Western media.)

There were several other presentations on related topics, and the complaint that technology developers and researchers aren't paying enough attention to what activists actually need has been a refrain, both at PETS and elsewhere, for several years now. I'm not sure further reiteration of that point helps all that much. What would help? I'm not sure about that either. It's not so much that people aren't building the right tools as that the network effects of the wrong tools are so dominant that activists have to use them even though they're dangerous.