Decentralized Wikileaks

From DevSummit

facilitated by Niels ten Oever from Free Press Unlimited

  • Trying to help investigative journalism in repressive environments
  • Wants to keep the beginning of the discussion non-technical
  • The human question of what you are going to do with the data is often more important than the technical questions of how you acquire it and whether it's secure
    • Case in point: Wikileaks' dysfunctional relationship with corporate media

Project trying to solve these problems: Publeaks

Two goals:

  1. Don't build a central site to hold leaks, but put leakers directly in touch w/ journalists
  2. Work directly with journalistic organizations

Effects

  • A lot of journalists use Tails (so it's not perceived as a tool used only by terrorists & pedophiles)
  • Two members of parliament left office as consequence of info made public through site

Methodology

  • Designed workflow with journalists from beginning
  • Big differences between developers & journalists from jump
  • Developers wanted to implement tools that weren't user friendly
  • Technical people thought it was important that all journalists have access to same info (believe in transparency, data liberation)
  • But all journalists thought this would make the project fail (because a large part of what drives journalism is competition over exclusives, etc.)
  • Compromise: the leaker decides which outlets receive the info and when; if they pick more than three, they're prompted to give one outlet an exclusive edge
  • When leak is provided to organizations, the organizations also get a notification of who else received it
  • (Counterpoint: newspapers are actually lying about having/needing exclusives)
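The routing compromise described above can be sketched roughly as follows (a minimal sketch; all function and field names are hypothetical, not Publeaks' actual code):

```python
# Rough sketch of the routing compromise described above.
# All names here are hypothetical, not the real Publeaks implementation.

def route_submission(leak_id, chosen_outlets):
    """Build delivery records for a leak.

    The leaker picks the outlets. If more than three are chosen, the
    leaker is prompted to give one outlet an exclusive edge. Every
    recipient is told who else received the leak.
    """
    prompt_for_exclusive = len(chosen_outlets) > 3
    deliveries = []
    for outlet in chosen_outlets:
        deliveries.append({
            "leak_id": leak_id,
            "outlet": outlet,
            # each organization sees who else received the leak
            "also_sent_to": [o for o in chosen_outlets if o != outlet],
        })
    return prompt_for_exclusive, deliveries
```

Note how the co-recipient list implements the notification requirement: no outlet can pretend it holds an exclusive it doesn't have.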

Side debate

  • Problem with decentralized model is it leaves out brokering/curating role that Wikileaks had, making leakers more open to getting played by news outlets
  • Problem with centralized model: Wikileaks eventually had trouble parsing their own data because they had so much of it. By giving leaks directly to several competing news outlets, there's a natural incentive for different groups to do deep dives on different parts
  • Another problem: people often want to leak highly technical information that's difficult to understand; sending it directly to either leaking orgs or press might result in important information not being widely broadcast because its significance isn't understood by those who receive it
  • So maybe have federated model of intermediaries interpreting and publishing data?
  • In creating Publeaks, two vectors of pushback on this question:
    • Journalists wanted as few intermediaries as possible both for exclusives and security reasons
    • Project organizers wanted to trust the leakers to select recipients who they think can understand the leak
  • Problem of tinfoil hattery: how do you sort out important leaks from conspiracy theories that cranks think are really important
    • One answer: leaks submitted to all possible news outlets tend to be thin conspiracy-theory
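The rule of thumb above could be sketched like this (a hypothetical illustration of the heuristic, not any platform's actual logic):

```python
# Sketch of the rule of thumb above: a submission blasted to every
# available outlet is flagged as likely conspiracy-theory noise.
# (Function and argument names are hypothetical.)

def looks_like_crank(selected_outlets, all_outlets):
    """Flag submissions sent indiscriminately to all possible outlets."""
    return set(selected_outlets) == set(all_outlets)
```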

Two main platforms right now

  • GlobaLeaks (Publeaks is an instance of this)
  • SecureDrop (Strongbox is an instance of this)
  • URL for side-by-side comparison spreadsheet: is.gd/Mwjkvt

Comparison

SecureDrop

  • Far higher security: more air gapped, more customized, takes more time & resources to maintain
  • Lots of scripts, every package needs to be compiled to run
  • Each participating organization needs to run its own instance on a single computer in a single location
  • Makes integration with several organizations very difficult (they have a paid staffer who travels around installing it)
  • Relies very heavily on physically separating computers, leaving computers disconnected from internet, heavily encrypting files to be readable only to specific devices & users (assumes people will be aggressively trying to break into data)
  • Weak point: data transferred between computers with thumb drives that might themselves be infected

GlobaLeaks

  • Workflow that guides the leaker through a LONG set of disclaimers about risks and tools for mitigating them
  • Connection goes through Tor: the platform is hosted as a Tor hidden service, with a tor2web proxy in front so users can reach it without running Tor themselves (the server sits inside the Tor network, acts as a proxy, and can't tell where you're submitting from)
  • User is strongly encouraged to install the Tor Browser rather than relying on the proxy alone; if they don't, they have to answer a quiz proving they understand the implications (the site won't allow them to submit unless they answer correctly)
  • i.e.: at three different points, pressing for use of Tor and/or providing education on risk
  • (Third option: encourage people to install Tails)
  • There's a heavy protocol for protecting access to the leak once it's in journalists' hands (transfer the file to a separate laptop to decrypt it, use partitioned USB sticks to read it, etc.) and a big training protocol for journalists using it
  • Selling point to journalists is that they really care about protecting source secrecy
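The hidden-service hosting described above is typically done with a short torrc fragment like the following (a sketch using standard Tor directives; the directory and port values are illustrative, not GlobaLeaks' actual configuration):

```
# torrc sketch: expose a local web app as a Tor hidden service
# (directory and port values are hypothetical)
HiddenServiceDir /var/lib/tor/leaksite/
HiddenServicePort 80 127.0.0.1:8082
```

Tor generates the .onion address from keys in HiddenServiceDir, and inbound connections arrive via the Tor network, so the server never learns the submitter's IP address.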