Social media and filter bubbles

DevSummit session notes (latest revision 22 November 2016)

Aman: Two statements from FB and Google

Zuckerberg: Minimized FB's role, hesitant to change policy (should not be arbiter)

Google: Changed ad-serving policy -- it already blocks spam and porn, but will now also block ad serving to sites that share misinformation -- though it remains to be seen how they will operationalize this.

We've all experienced this in some way.

Jonah: On the heels of Zuckerberg's statement, plus leaks from FB on the proposed policy change. Broadly impacted right and died because of disparity. On FB for family, but watching: had been exposed to false narratives and accepted them blindly. Most of that's in the "left" bubble. Small yet meaningful false information in the feed. Even if you're not in the depths of that world, it's very present and real. We're all subject to it.

Aman: Difference between left and right on this issue.

Cathy: Fellow at BuzzFeed; her project is filter bubbles. People talk about filter bubbles in a political context, but those kinds of echo chambers are symptoms of deeper divides: class, age, race, geography (rural vs. urban). How to think beyond the political spectrum and about how to encounter someone not from your class, etc.

David: <NPR's reliance on FB; FB paying media companies to create content>

Michael: Has relatives all over and sees the effects of this. It isn't a plot by FB; every time someone unfriends, there's a drive to self-select. It's a problem outside social media: the collapse of shared, respected, trusted institutions. If FB and Google start vetting what's false, that becomes another arbiter people don't trust.

Laura: Does NPR have to produce a certain kind of content...

...

Jonah: Power analysis of filter bubbles. We feel powerless as individuals. How can we be re-empowered?

Laura: How does one do that? Do you friend people you don't agree with?

Aman: UX dictates so much of content. The primary form of engagement is the like button, and FB's algorithm uses that. You're more likely to thumbs-up stuff you agree with, and there's no down thumb; if there were, you might be able to see content you don't agree with. We would like content-sharing platforms to show us subjects of interest to us (e.g. international development... or cats). What FB is in fact doing is showing you opinions you agree with. What FB and we can do is diversify the types of reactions you can have, understanding that the way we interface needs to be more diverse.
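
A toy illustration of Aman's point (not Facebook's actual ranking; the posts, counts, and scoring rules below are invented): if the only strong signal is positive, a feed sorted by that signal never surfaces content you disagree with, while a hypothetical "disagree" reaction would let contested content count as engagement too.

    # Toy feed ranker in Python. Posts, reaction counts, and scoring
    # rules are invented for illustration; this is not FB's algorithm.
    posts = [
        {"text": "story you agree with",    "like": 60, "disagree": 4},
        {"text": "story you disagree with", "like": 8,  "disagree": 95},
    ]

    def score_likes_only(post):
        # Today's model: only positive engagement counts.
        return post["like"]

    def score_with_disagree(post):
        # Hypothetical model: "disagree" is also engagement, so
        # contested content can still surface.
        return post["like"] + post["disagree"]

    print(sorted(posts, key=score_likes_only, reverse=True)[0]["text"])
    print(sorted(posts, key=score_with_disagree, reverse=True)[0]["text"])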

Michael: What do the added reactions mean? Are they any different?

Aman: Adoption is still very low. It takes more clicks and time to change a reaction, and most of the other well-used reactions are positive ones: hearts and thumbs-up.

Sasha: Distance and timezone affect what you see. There hasn't been a lot of constructive discourse across the political spectrum, IRL or on social media. You're curating content for people you know and trust. It can happen at a broader level. What would happen if it was done more thoughtfully? How do we make it work like what works IRL? A lot of people are not engaged at all. What effect does this have on disaffection? Does it turn people off to being civically engaged? Either you check out totally, or you're overexposed.

Cathy: How does it affect discourse in general? There's more understanding now about how feedback loops feed into how these systems work (things that are liked more get shown more). There are more subtle distortions too: posting on social media favors certain kinds of content. Certainty. Arrogance. It disadvantages ambivalence. Few people are going to post their ambivalent thoughts, and there's a distortion because of that. It feels like people are very certain. You don't see...
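
A minimal simulation of the feedback loop Cathy describes ("things that are liked more get shown more"); all numbers are invented, and the point is only the compounding dynamic:

    # Two posts start almost equal; each round the ranked feed gives
    # most of the exposure to the current leader, which then earns it
    # most of the new likes. Purely illustrative.
    likes = {"post_a": 10, "post_b": 11}  # post_b starts barely ahead

    for _ in range(10):
        top = max(likes, key=likes.get)
        for post in likes:
            exposure = 90 if post == top else 10  # feed slots this round
            likes[post] += exposure               # likes track exposure

    print(likes)  # the one-like head start compounds into a landslide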

David: Post ambivalence and trolls come out.

Jonah: Opens up paths of resistance and ease. It's a game system: you don't win likes with ambivalence. Are there specific practices around disrupting that system? Can you get likes in a different way? Can you use the tools and not buy into that paradigm? Our job as organizers is to put out content that doesn't cleave to the system. Does that content disappear or stick around? Can we make the algorithm fail?

Cathy: A security researcher she interviewed talked about what is realistic for people to do. It's not realistic to abandon Facebook or encrypt everything. Just search for random stuff. That's very actionable for an individual: you can control what they're collecting about you. Add random noise.
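
A sketch of the "add random noise" tactic, in the spirit of query-obfuscation tools like TrackMeNot; the decoy list and timing here are placeholders, and how much such noise actually degrades profiling is debated:

    import random
    import time
    import urllib.parse
    import urllib.request

    # Placeholder decoy topics; a real tool would draw from a large,
    # refreshed pool so the noise isn't itself a fingerprint.
    DECOYS = ["tide tables", "sourdough starter", "carburetor repair",
              "byzantine mosaics", "llama shearing", "1987 tax law"]

    def emit_noise_query():
        query = random.choice(DECOYS)
        url = "https://www.google.com/search?q=" + urllib.parse.quote(query)
        req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
        # Fire and ignore the response; the point is only that the
        # query lands in the search provider's logs.
        urllib.request.urlopen(req).read()

    # e.g. one decoy query every few minutes:
    # while True:
    #     emit_noise_query()
    #     time.sleep(random.uniform(120, 600))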

Sasha: Not even that many people are going to be that intentional about it. Most people see Facebook as a place to connect with family and friends. It's tough to get people to regulate their own behavior.

Aman: One way to focus the conversation -- what could a group like us do? Or how could we put pressure on them to change? What can we do that influences the way FB operates? What do we expect out of FB? A certain amount of the problem is us. We are a little bit of the problem: we like bite-sized opinions. Not complexity. Not long pieces. The one button was popular for a reason. Who decides what is misinformation? If we ask Google to do that, it's giving them a lot of power. If it's crowdsourcing, the crowd could come up with crazy answers too (e.g. creationism). Which parts are worth exploring further? If you were to create an organization, what would it look like?

Sasha: There are models. If FB is a media aggregator, there are rules there. FB has done that for hate speech. How do we regulate? We have models to draw from.

Jonah: That's where power analysis is useful. There's an exploitable power dynamic: there's leverage with news organizations, with news orgs pushing for editorial standards. FB needs those partnerships, which are enabling its stance of "We're not a media organization, we just aggregate." There's less leverage for individuals.

Sasha: It could be individuals... Free Basics was resisted effectively in India; they won a net neutrality battle. Comedians being funny.

David: What was the medium for resistance?

Sasha: Media coverage of the resistance.

Aman: WhatsApp.

Laura: The kind of organization we need is one that gets celebrities and famous folks to speak out, because people pay attention. A friend voted for Trump; part of the reasons he cited -- he trusted him more as a businessman, and couldn't trust Hillary. The way the media covers things makes a big difference.

Aman: Sasha's example is a good one: hundreds of millions of people gave up something potentially really attractive.

Sasha: The snobby techie in the city is another thing.

Aman: Where's the analog there in terms of filter bubbles?

Laura: The way things are so data-driven... it creates convenience, and people value the convenience over privacy.

Jonah: The accuracy of the content isn't part of the value proposition for FB. Pictures of cats and news... giving up cats for something nuanced might not match the motivations for why people are there.

Laura: Do most people get their news from social media?

Aman: Twitter, maybe not FB. No TV. It's what people are sharing. No way to map how diverse my content feed is.

Sasha: Go onto FB and tons of people are sharing; that's helpful.

David: Mattered a lot to friends, that's powerful.

Jonah: But does it matter? Does it really mean it matters? Or is it maybe not happening? How do we make this meaningful, and not just "a lot of people liked this or saw it"?

Aman: There are lots of good things about content sharing. Now anyone can be a journalist, which has good and bad to it. Before, the question was: how do we get lots of contributors? Now: how do we curate? How do we avoid bubbles of that content? These are new problems. I don't want FB or Google to choose my content. There are lots of small choices they make that can make a huge difference in the content being shared. One that we...

David: Dissect how the algorithm works?

Sasha: It would have to be big; lots of players would benefit.

Jonah: Solutions have to be distributed. You must compare what everyone sees. Dig into the HTML source. Is there enough ID and timestamping to do comparisons? That's how you could uncover the algorithms. Other avenues: whistleblowers, getting source code, putting pressure in other ways. What everyone sees is all we have, and that's a big infrastructure project. There may be some vested interests with the resources to focus on that. How do we make network effects transparent? It doesn't have to be perfect, it just has to create pressure to be transparent or make changes.

Aman: We could put resources like that together. Does the API have the information? FB could shut you down. How long before FB just says no, stop this?

Jonah: Looking at the HTML source is critical.

Noah: Chrome extension.

Sasha: Get / download your data. Have people submit it. A question I'd throw out: what does a fair algorithm look like? What are you left with if you get rid of biases?
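
A sketch of the comparison being discussed here: if volunteers donate feed snapshots (post IDs plus timestamps, scraped from the HTML or pulled from a data download), you can start measuring how little any two feeds overlap. The snapshot format and user names are invented:

    from itertools import combinations

    # Hypothetical donated snapshots: user -> post IDs seen in the feed
    # over the same time window (from an extension or a data export).
    snapshots = {
        "user_a": {"p1", "p2", "p3", "p7"},
        "user_b": {"p2", "p4", "p5", "p7"},
        "user_c": {"p6", "p8", "p9", "p10"},
    }

    def jaccard(a, b):
        # Feed overlap: 1.0 = identical feeds, 0.0 = fully disjoint.
        return len(a & b) / len(a | b)

    for (u1, s1), (u2, s2) in combinations(snapshots.items(), 2):
        print(u1, u2, round(jaccard(s1, s2), 2))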

Jonah: Like-only input is very thin.

Aman: It makes a difference what you're optimizing for. Optimizing for likes is different from optimizing for information about a subject (e.g. information vs. agreement). But likes are low-hanging fruit, and it's a business decision.

Laura: Has anyone considered something that does the exact opposite of what Facebook does? Feed you something that's the opposite of what you believe. You'd need to opt in, but some awareness could be useful.
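
A toy version of Laura's opt-in "opposite" feed: take whatever score a ranker would normally sort by, descending, and sort ascending instead. The predicted_agreement field is a stand-in for whatever relevance model a platform actually uses; all values are invented:

    # Hypothetical model scores for how much this user is predicted
    # to agree with each post (0.0 to 1.0).
    feed = [
        {"text": "view you share",        "predicted_agreement": 0.9},
        {"text": "mildly challenging",    "predicted_agreement": 0.5},
        {"text": "opposite of your view", "predicted_agreement": 0.1},
    ]

    normal = sorted(feed, key=lambda p: p["predicted_agreement"], reverse=True)
    opposite = sorted(feed, key=lambda p: p["predicted_agreement"])

    print([p["text"] for p in normal])    # agreeable content first
    print([p["text"] for p in opposite])  # challenging content first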

Sasha: ...

Michael: It's also about facts. E.g. the president is from Kenya.

Aman: Misinformation and filter bubbles are separate issues. Ideally.

Noah: And you need an arbiter of truth in that situation.

Jonah: Is fact-checking a model that helps? Is it a model that scales up?

Sasha: See a little meter.

David: Google's fact-check metadata.
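
For context, Google's fact-check feature reads schema.org ClaimReview markup that publishers embed in their articles; a representative instance (the claim, publisher, and URLs below are invented) looks roughly like this:

    import json

    # A ClaimReview record, built as a Python dict and printed as the
    # JSON-LD a publisher would embed in the article page.
    claim_review = {
        "@context": "https://schema.org",
        "@type": "ClaimReview",
        "claimReviewed": "The president was born in Kenya.",
        "itemReviewed": {"@type": "Claim",
                         "appearance": "https://example.com/viral-post"},
        "author": {"@type": "Organization", "name": "Example Fact Check"},
        "reviewRating": {"@type": "Rating", "ratingValue": 1,
                         "bestRating": 5, "alternateName": "False"},
        "url": "https://example.com/fact-checks/kenya-claim",
    }

    print(json.dumps(claim_review, indent=2))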

Michael: Needing institutions. Flagging certain groups/people. But that goes back to not having faith in those organizations.

Cathy: Fundamentally about trust. How much does fact-checking work if people don't trust fact-checking? How do you build that?

Aman: Right fact checker.

Sasha: Coalitions/partnerships, e.g. right-wing and left-wing media.

Jonah: Has the fourth estate lost trust?

Noah: The fundamental thing we're seeing with social media and the internet is a leveling of trust/status between individuals and institutions. It's a more equal playing field.

Michael: It comes back to trust. There's a safe place for people to go: it feels safe to go to FB. You can post with confidence and anonymity.

Jonah: The question of algorithmic transparency. Transparency is how we build trust in institutions.

Sasha: ??

Aman: Algorithms and machine learning have made that very difficult. How can independent orgs review algorithms? That hasn't been feasible. Inputs and outputs... we can't dissect the system, but we can see inputs and outputs. What are the regulatory orgs? What expertise do they have? Especially compared to FB (the experts).

The world isn't really trusting experts anymore.
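
A sketch of the inputs-and-outputs review Aman describes: treat the ranker as a black box, hold the inputs fixed, and compare what comes out for different user profiles. rank_feed here is a stand-in for the opaque system an auditor would observe rather than call:

    import random

    def rank_feed(user_profile, candidate_posts):
        # Stand-in for the opaque ranker under audit; deterministic
        # per profile so repeated probes are comparable.
        rng = random.Random(user_profile)
        return sorted(candidate_posts, key=lambda _: rng.random())

    # Controlled input: the same candidate posts for two profiles.
    candidates = ["news_a", "news_b", "cat_video", "op_ed_1", "op_ed_2"]

    feed_rural = rank_feed("profile_rural", candidates)
    feed_urban = rank_feed("profile_urban", candidates)

    # Same inputs -- how differently are they ordered for each profile?
    divergence = sum(a != b for a, b in zip(feed_rural, feed_urban))
    print(feed_rural, feed_urban, divergence, sep="\n")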

Sasha: There is more scholarship in this area. Who designed it? How is it biased?

Noah: We haven't talked about media literacy at all yet. Are efforts in that direction successful?

Sasha: Next thing you know, we need to be investing in public education.

Jonah: What I was hoping we'd get to is: how do we build movements in an echo chamber? Clients are speaking to bubbles. Media literacy for communications professionals in movements. Produce two sets of content with different...