Techtonic with Mark Hurst is a weekly radio show from WFMU about technology, how it's affecting us, and what we can do about it.

Nov 3, 2025: Aram Sinnreich, co-author, “The Secret Life of Data”

The surveillance state is tracking you every day, everywhere you go. Data about you will be “recorded, archived, analyzed, combined, and cross-referenced . . . without your awareness or consent,” according to Aram Sinnreich, co-author of “The Secret Life of Data: Navigating Hype and Uncertainty in the Age of Algorithmic Surveillance.”

Show Notes

The Secret Life of Data: Navigating Hype and Uncertainty in the Age of Algorithmic Surveillance, by Aram Sinnreich and Jesse Gilbert, published by MIT Press

A $60 Mod to Meta’s Ray-Bans Disables Its Privacy-Protecting Recording Light (404 Media, Oct 23, 2025)

Digital Threat Modeling Under Authoritarianism (past Techtonic guest Bruce Schneier, Sep 26, 2025):
The risks here are twofold. First, mass surveillance could be used to single out people to harass or arrest: when they cross the border, show up at immigration hearings, attend a protest, are stopped by the police for speeding, or just as they’re living their normal lives. Second, mass surveillance could be used to threaten or blackmail. In the first case, the government is using that database to find a plausible excuse for its actions. In the second, it is looking for an actual infraction that it could selectively prosecute - or not.
... and ...
Threat modeling is all about trade-offs. Understanding yours depends not only on the technology and its capabilities but also on your personal goals. Are you trying to keep your head down and survive—or get out? Are you wanting to protest legally? Are you doing more, maybe throwing sand into the gears of an authoritarian government, or even engaging in active resistance? The more you are doing, the more technology you need—and the more technology will be used against you. There are no simple answers, only choices.
How A.I. Can Use Your Personal Data to Hurt Your Neighbor (gift link, NYT, Nov 2, 2025), by Maximilian Kasy:
In climate change, one person’s emissions don’t alter the atmosphere, but everyone’s emissions will destroy the planet. Your emissions matter for everyone else. Similarly, sharing one person’s data seems trivial, but sharing everyone’s data — and tasking A.I. to make decisions using it — transforms society. Everyone sharing his or her data to train A.I. is great if we agree with the goals that were given to the A.I. It’s not so great if we don’t agree with these goals; and if the algorithm’s decisions might cost us our jobs, happiness, liberty or even lives.

To safeguard ourselves from collective harm, we need to build institutions and pass laws that give people affected by A.I. algorithms a voice over how those algorithms are designed, and what they aim to achieve.
Playlist & Comments at WFMU
Aired
Nov 3, 2025