Soatok Dreamseeker

@soatok@furry.engineer

At #RealWorldCrypto this year, there was a session on "privacy-enhancing technologies".

The first talk in the session was about a new encryption method for Tor.

The next two were painful examples of "a person cannot be made to understand something when their salary depends on them not understanding it".

Advertisers want to collect signals about populations without individually identifying anyone. So let's talk about differential privacy techniques that let them do that.

One example was "Meta wants to know what percentage of its teenage users blocked a contact today".
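For readers unfamiliar with the technique being pitched: the standard way to answer a query like that under local differential privacy is randomized response, where each user's report is noisy on its own but the aggregate rate can still be recovered. This is a minimal, hypothetical sketch of that mechanism applied to the talk's example query; the parameter names and the 30% simulated rate are my own illustrative assumptions, not anything Meta actually does.

```python
# Sketch of randomized response (local differential privacy),
# applied to the example query "what percentage of teenage users
# blocked a contact today?". All numbers here are made up.
import random

def randomized_response(truth: bool, p_truth: float = 0.75) -> bool:
    """Report the true answer with probability p_truth; otherwise
    report a coin flip. No single report reveals the user's truth."""
    if random.random() < p_truth:
        return truth
    return random.random() < 0.5

def estimate_rate(reports: list[bool], p_truth: float = 0.75) -> float:
    """Invert the noise: E[reported] = p_truth * rate + (1 - p_truth) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth) * 0.5) / p_truth

# Simulate 100,000 users, 30% of whom truly blocked a contact today.
random.seed(42)
true_rate = 0.30
reports = [randomized_response(random.random() < true_rate)
           for _ in range(100_000)]
print(f"estimated rate: {estimate_rate(reports):.3f}")  # close to 0.300
```

The point of the mechanism is that any individual report is deniable, yet the population-level percentage comes out accurate — which is exactly the property the talks were selling, and exactly what the questions below are about.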

At no point did they address the elephants in the room.

  • Why do they want this data in the first place?
  • What are they even doing with this signal?
  • Have you considered telling them to fuck off and not collect it in the first place?

As tempting as it might be to hand-wave it away and say "well, yes, but their business model depends on it", I say to advertisers: "then perish".

March 11, 2026 at 10:15:19 PM

Here's a privacy-enhancing technology for you to consider:

"No."

You don't need to know. You don't need to measure. The efficacy of advertising campaigns, market segmentation, and relevance targeting should be minimized for the good of humanity.

"No" is a better privacy-enhancing technology than the state-of-the-art differential privacy techniques.

It's efficient! Not collecting data requires at most O(1) bandwidth, O(1) storage, and O(1) compute.

"No" is not "Maybe later".

"No" is not "Ask me again in 3 days".

"No" is not "Maybe after a few more beers", since many of the people who need to hear the first part of this message likely also need the second.

I'm not sharing this to shit on anyone at #RWC2026. My favorite people in tech are often found there, and the organizers put a lot of thought, effort, and care into making the vibe good.

I also don't ascribe any malice to the speakers. They probably didn't think to ask these questions, and didn't think to put them in their slide deck. Maybe they've self-selected into an environment that doesn't foster that kind of inquiry. Maybe they considered it but cut it out for time.

But if we're going to talk about this sort of thing, we need to actually address these questions, even if there isn't a comfortable answer.

At an earlier track, one of the invited speakers suggested using Fully Homomorphic Encryption to allow folks to have private conversations with an AI chatbot for therapy.

My mind was instantly filled with news stories of OpenAI and self-harm. Lawsuits from grieving families.

Are they deeply out of touch?

Or was it just "hmm, what do people want privacy for? I'll just throw a bunch of hypothetical examples of things FHE would be good for without interrogating them deeply"?
