Tech is not neutral: How to make it better?


We got two great sets of notes for this session, so both are included below. :-)


Beatrice's Notes:

People bring their biases to the tools they are building. They often do not think about the consequences of the tech they build, neither the good ones nor the bad ones.

"In situations of injustice, neutrality favors the oppressor"

In librarianship there is the idea of professional neutrality. But tech is not neutral.

There are technologists building tools that are not neutral at all. See Palantir building tech used by ICE.

When we say that tech is not neutral, is the implication that it has biases?


Examples?

  • Metadata: how it is organized in search at the Library of Congress.
  • Unsatisfactory gender fields, which often assume a binary gender (see the sketch after this list)
  • Having the privilege to learn to code
  • Name fields, which often assume a US/European name format
  • Advanced encryption is not affordable: the most secure smartphones (e.g. the iPhone) are the most expensive.
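
As a minimal sketch of what less biased name and gender fields could look like (this code is not from the session; the field names and design choices are illustrative assumptions):

    # Hypothetical sign-up form model, sketched in Python.
    # Assumptions, not session recommendations: a single free-text name field
    # and an optional, self-described gender field.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Profile:
        # One free-text field instead of separate first/last fields, so names
        # that do not follow a US/European "given + family name" format still fit.
        full_name: str
        # Optional free text instead of a binary choice; None means the person
        # chose not to answer.
        gender: Optional[str] = None

    p = Profile(full_name="Nguyễn Thị Minh Khai")  # no gender required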


Are there things to think about when you work as a software developer, if you care about this? Are there best practices?

  • Bring empathy to your work
  • See concept of "radical empathy"
  • Diverse and inclusive user stories. Don't build things in a silo.
  • Prioritize the security of marginalized communities in development. Think privacy and security. A lot of tech, like Palantir's, is built by white men to target marginalized groups.
  • Share your technical knowledge to mentor less privileged folks and build new leadership in your profession. The end goal is to have many more software developers who are not part of the current privileged majority group.
  • Codify a code of ethics. It can help with buy-in.
  • See the Zine Librarian Code of Ethics
  • And keep maintaining it, e.g. put it in a repo and track changes.
  • Move from "diversity and inclusion" to "equity and justice".
  • See: "Language of Appeasement" by Dafina-Lazarus Stewart.
  • Prioritize security and privacy
  • Think about responsible data
  • With data collection: less is more. Data minimization.
  • Prioritize accessibility
  • Prioritize localization
  • Be mindful of how your tool can be weaponized. A database is just a database until you start tracking people.
  • Proper password management should be implemented for PII (personally identifiable information); see the sketch after this list.
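
As a hedged sketch of one concrete piece of this (not from the session; the parameter choices are illustrative assumptions, not a vetted recommendation): passwords guarding PII should be stored as salted hashes, never in plaintext.

    # Hypothetical example using only the Python standard library.
    import hashlib
    import hmac
    import os

    ITERATIONS = 200_000  # illustrative work factor

    def hash_password(password: str) -> tuple[bytes, bytes]:
        salt = os.urandom(16)  # fresh random salt per user
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return salt, digest    # store these; never store the password itself

    def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return hmac.compare_digest(candidate, digest)  # constant-time comparison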

What recommendations can be given to people who want to speak up on their project team? And if the technology has already been built in a harmful way, what can we do?


Kai's Notes:

Technology is not neutral

  • people bring their biases to the tools they’re building
  • “in situations of injustice, neutrality favors the oppressor”
  • metadata - library of congress centered white, christian, het males in its labels, so
    • labels don’t make sense to other peoples
    • labels enforce these categories
  • who has access to technology? who has access to the tools that use these technologies?
  • name fields assume a US / Europe name format
  • gender fields assume a binary gender
  • encrypted communication - the most secure mobile phones are the most expensive
  • Signal doesn't run without a data plan
  • the trump administration wants to develop facial recognition for accusing people of crimes
  • palantir - ICE raids
  • how to bring this awareness to the wider development community?
  • how to break developers out of the “I’m building with good intentions, why would anyone use this tool other than how i intend?” mindset
  • how can developers speak up to prevent bias?
  • how can developers fix bias in tools that have already been built?

Best practices:

  • empathy
    • what are the ramifications of collecting and organizing this subject matter? how does the impacted community feel about the way we're doing this?
    • value community health over deliverables
    • when designing a product, solicit diverse and inclusive user stories
  • mentor and invite perspectives you don’t have
    • the end goal is for there not to be a majority
  • focus on projects that promote the interests and safety of marginalized people
  • emphasize privacy and security ++
  • think about responsible collection of data. How can this data be used to hurt marginalized communities? (see the sketch after this list)
    • Responsible Data Forum
    • collect less data
    • anonymize data
  • develop a code of ethics as a company
    • keep comparing your behaviour and plans to the code.
    • having a “code of ethics” is impressive to mgmt, and helps them prioritize this stuff
    • revisit it regularly and update when you learn more about the community and your impact
  • change the framing from “diversity and inclusion”, which is hippie junk, to “equity and justice”, which is noble like a bald eagle
  • have an accountability team?
  • prioritize accessibility and localization up front, not as an accessory
  • how can your product be weaponized?
  • encrypt all personally-identifying information
  • turn data you store into data you don’t store
  • protect not just your users, but yourselves
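
A minimal sketch of the "collect less data, anonymize, turn data you store into data you don't store" ideas above (not from the session; the field names and pipeline are illustrative assumptions, and note that hashing an identifier is pseudonymization rather than true anonymization):

    # Hypothetical event logger that minimizes data at collection time.
    import hashlib

    raw_event = {
        "email": "person@example.org",
        "ip_address": "203.0.113.7",      # never stored below
        "page": "/donate",
        "timestamp": "2017-11-04T12:00:00Z",
    }

    def minimize(event: dict) -> dict:
        """Keep only what is needed; pseudonymize or coarsen the rest."""
        return {
            # Replace the email with a truncated one-way hash so visits can be
            # grouped without storing the identity itself.
            "user_id": hashlib.sha256(event["email"].encode()).hexdigest()[:12],
            "page": event["page"],
            # Coarsen the timestamp to the day to reduce re-identification risk.
            # The IP address is simply dropped.
            "day": event["timestamp"][:10],
        }

    print(minimize(raw_event))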