Privacy Story Telling

We do a bad job of telling stories about privacy: “One time Cory Doctorow said something…"

Many are at the stage of "We know we should floss, but we don’t."

What stories can we tell?

Hanging out on an online forum discussing Etsy changing their product to make previously private wish lists public (e.g. sex toys, mom jeans, gifts to others). At every stage, people were willing to believe Etsy was doing things right (e.g. notifying people, only targeting certain users), and at every stage those assumptions turned out to be incorrect.

In many states, workers can be fired without cause for their sexual orientation or gender identity. That information can be inferred from people’s search engine traffic.

Activity: participants wrote down something that they browsed or emailed that they believed no one else needed to know.

"None of your fucking business"

People who run technology companies promote ideas like “Anyone who has more than one persona online is dishonest.” These views are held by people at the top of the privilege pyramid: their need for privacy is very low. When we talk about privacy, we are talking about power: who gains power if this becomes public? Power shifts matter most to people who are disempowered in today’s culture or economy. These companies need you to have a single identity so they can serve you ads.

At the Responsible Data Forum, there was a session on “Talking about Harm”: in order to gain legitimacy, we sacrifice dignity (e.g. the complexity of people).

Counter-argument: if someone takes a moral stance that doing X is wrong, they will not be persuaded that the people who do X should have privacy. Privacy doesn’t fall cleanly along the political spectrum, so it can be necessary to choose examples based on context.

One really useful analogy is “blinds” in the home: people can choose when to open their blinds and when not to.

Privacy and publicness can suffer from attribution error: we may perceive an intention to “do something publicly” that may not exist. To extend the “blinds” analogy: if it’s light inside and dark outside, we may change clothes without realizing who outside can see in.

People shift context all the time. It’s not just about “hiding the bad stuff”.

When Facebook made an onion (Tor) service, there were contradictory media narratives: “Facebook is helping terrorists” and “Who cares, it’s just Facebook.”

To draw out the values associated with privacy: “Does anyone know someone who has untagged themselves from a photo on Facebook? Why would they do that?” You have already taken actions to protect your privacy.

John Oliver’s interview of Edward Snowden reframed a general question as a very specific one. General: “Should the government be able to monitor your private communications?” Very specific: “Should the government be able to look at your dick pics?”

Activities:

1. Participants got in pairs and shared something private with the other person. Then they switched pairs and told a different person about the first person’s secret. This freaked people out.

2. Two groups were created, one was told they were “law enforcement” and the other was told they were “journalists”. They were given the same set of neutral data about a person’s movements during the day and asked to create an explanation of what happened.

3. Iceberg: safe topics (sports, weather) are above the waterline, possibly inflammatory topics (politics) are below the waterline, and the most personal things (fear of failure) are at the very bottom. We get to choose where the waterline is, and we may change it depending on context. Some Californians may think the waterline should go all the way to the bottom in order to be fully actualized.