Inconvenient Truth

Having your assumptions and beliefs challenged by a group of seasoned law enforcement professionals and subject matter experts can be a rather disturbing experience. That is exactly what happened to me this week during a workshop on the role of technology in combating human trafficking, organized by Rane Johnson-Stempson, danah boyd, and the good folks from the Microsoft Digital Crimes Unit (special thanks go to Cynthia Dwork for encouraging me to participate).

Recognizing the opportunities technology presents for addressing human trafficking, Microsoft solicited proposals and awarded six grants to a diverse group of researchers. The workshop was an initial checkpoint, with plenty of time for brainstorming, exchanging ideas, and identifying shared interests and viewpoints.

Human trafficking, narrowed for the purposes of this workshop to the sexual exploitation of minors in the US, is mainly a real-world phenomenon. It has, however, embraced numerous technologies that increase supply, fuel demand, and lower the transaction costs of criminal activity. For example, digital photo and video cameras removed the barriers to producing child pornography that existed when one had to develop film in a photo lab or set one up at home. Online advertising and disposable cellphones bought with prepaid cards facilitate pimping and soliciting the services of prostitutes. Chat rooms and forums connect criminals who would otherwise have a hard time finding each other.

On the positive side, there are already several success stories of sophisticated technology and excellent research being used to fight child pornography. I was excited to learn about the Microsoft-led PhotoDNA project (co-developed with Dartmouth and deployed in partnership with the National Center for Missing and Exploited Children), an image-matching technology that is part of a broader program to help identify and disrupt the spread of child pornography online. It is currently used by Microsoft in several of its products, including Bing, SkyDrive, and Hotmail, and, since last year, by Facebook.
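PhotoDNA’s actual algorithm is not public, but the core idea of robust image matching (computing a compact signature that survives resizing and recompression, then comparing it against a database of known images) can be conveyed with a toy “average hash”. The sketch below is purely illustrative and is not PhotoDNA; it assumes the Pillow imaging library is installed, and both function names are my own:

```python
# Purely illustrative "average hash", NOT PhotoDNA (whose algorithm is
# proprietary); it only conveys the flavor of robust image matching.
# Assumes the Pillow imaging library is installed.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to size x size grayscale pixels; each bit of the
    hash records whether a pixel is brighter than the mean. Near-duplicate
    images (rescaled, recompressed) yield hashes differing in few bits."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming_distance(h1: int, h2: int) -> int:
    """Bits in which two hashes differ; a small distance suggests a match
    against a database of known offending images."""
    return bin(h1 ^ h2).count("1")
```

Real systems use far more robust signatures and carefully curated hash databases, but the workflow (hash, compare, flag) is the same in spirit.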

RoundUp, another set of tools for digital intelligence and forensics on peer-to-peer file-sharing networks, developed at the University of Massachusetts Amherst, is used by hundreds of law-enforcement agencies across the country, resulting in the execution of approximately 150 search warrants per month.

These tools can be very effective against many forms of electronic distribution of offensive materials. We also know, however, that mature, easily accessible systems for anonymous communication exist that can disrupt investigations of electronic crimes. The most popular such system by far is the Tor network. Originally designed by the U.S. Naval Research Laboratory, it now relies on individual volunteers and supporting organizations to run a network of routers that use cryptography and other security mechanisms to enable its clients to communicate anonymously and in real time. It is not surprising that criminals find Tor useful. Fortunately, only a small fraction of them use Tor clients at all, and many fail to set them up correctly or use them consistently. According to one workshop participant, in her many years of service at the DA’s office she could recall only five instances of Tor usage (four of which happened over the last year).

As a card-carrying cryptographer (not literally – the IACR is hyper-vigilant about protecting the privacy of its members), I am strongly attached to the view that anonymity and secrecy of communications are essential to freedom of speech and a functioning civil society. The benefits, the argument goes, are worth the price of abuses, which include hate speech and the distribution of illicit materials. Indeed, many past encroachments on the freedom of electronic communications were justified by the threats posed by terrorists, drug traffickers, and child pornographers, while the actual outcomes were increased censorship and/or unchecked surveillance of the general public. The latest episode in this struggle was Russian Wikipedia going dark three weeks ago in protest of a “blacklisting” bill. In a novel twist on the web’s usual suspects, the bill, which passed with minor changes shortly after the protest, allows blocking access to sites that “promote suicide”.

The workshop was a reality check for my beliefs. Its participants were experienced social workers, researchers, and law enforcement officials who are confronted daily with unimaginable crimes against minors. To this audience, the anonymity and lack of accountability afforded by Tor are more than a nuisance – they are a license to commit more crimes and further degrade their victims. Some of the arguments and ideas bandied about set off my alarm bells, but I could see where they were coming from. Examples include statements along the lines of “if you’ve done nothing wrong, you have nothing to hide”, or a suggestion that digital cameras embed traceable watermarks into all videos and photographs, enabling a forensic ballistics of sorts for digital images. These proposals were balanced with discussion of how we can protect legitimate free speech and privacy online and still effectively address this horrendous crime. Is it possible to do both?

Modern cryptography gives us a clear set of goals and a self-consistent value system. The workshop was a wake-up call, prompting me to consider life outside this intellectual bubble. I have yet to find satisfying answers to the more difficult philosophical questions, but others may have technological or social-engineering solutions.

Should there be self-imposed limits on research in cryptography in the name of social responsibility? Constructing a faster, more secure file-sharing network based on minimal assumptions (hopefully not requiring random oracles) can be an interesting research question and a good CRYPTO paper. But what about balancing its capabilities with societal norms? Can we build a Tor-like system that supports anonymity for political dissidents yet is hostile to peddlers of child pornography? A simple answer – texts are OK, but images are not – won’t do. An image or a video clip can spark a revolution or give a face to a movement (think of the tanks in Tiananmen Square, or the death of Neda in Tehran in the summer of 2009).

I now sketch a solution that separates speech that is acceptable to at least one public entity from materials that no one is willing to defend openly. The hypothetical socially responsible network would allow anonymous sharing only if, at the time of upload, the client designates one party, from a list supported by the system, that can later remove the material or break the anonymity seal. The list of acceptable parties should consist of organizations with a strong record of defending freedom of electronic communications, and be inclusive enough to placate even the most ardent Tor supporters. This is just a sketch, and an imperfect one at that. A comprehensive solution should draw on the expertise of law practitioners, legal scholars, and social and political scientists, in a process similar to the one I witnessed at the workshop.
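To make the idea a little more concrete, here is a rough sketch of what the upload step might look like, assuming PyNaCl for the cryptography; the AUTHORITIES registry, the seal_upload function, and the record format are inventions of this example, not a real protocol:

```python
# A minimal sketch of the upload step of the hypothetical
# "designated-unsealer" scheme described above. Assumes PyNaCl;
# AUTHORITIES, seal_upload, and the wire format are all invented here.
import hashlib
from nacl.public import PrivateKey, SealedBox

# Stand-ins for the published public keys of organizations willing to
# defend free speech (in reality these would be well-known, vetted keys).
AUTHORITIES = {
    "org-a": PrivateKey.generate().public_key,
    "org-b": PrivateKey.generate().public_key,
}

def seal_upload(material: bytes, uploader_id: bytes, designee: str) -> dict:
    """Package material for anonymous upload. Only the designated
    authority can later open the seal and learn the uploader's identity
    (or order the material removed)."""
    authority_key = AUTHORITIES[designee]
    # SealedBox gives sender-anonymous public-key encryption: the network
    # learns nothing about the uploader from the seal itself.
    seal = SealedBox(authority_key).encrypt(uploader_id)
    # Bind material, designee, and seal together so none can be swapped.
    binding = hashlib.sha256(material + designee.encode() + seal).hexdigest()
    return {"material": material, "designee": designee,
            "seal": seal, "binding": binding}
```

The designated party, and only it, could later decrypt the seal with its private key (SealedBox(private_key).decrypt(seal)) to break anonymity, or simply order the material removed; the binding hash keeps the seal from being detached from the material. A real design would need far more, starting with defenses against uploaders who lie about their identity inside the seal.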

I’d like to conclude by reminding us that no science is an island. Cryptography prides itself on deep and fruitful connections with many disciplines, ranging from abstract algebra to chip design. A single conference talk may go from circuit complexity to number theory to applications in browser security, and that is before it gets really interesting. We are fully capable of taking the last step: recognizing our responsibility for how the tools we make are used, and designing them with appropriate safeguards in place.

5 thoughts on “Inconvenient Truth”

  1. I think your idea, though well-motivated, is a non-starter. As soon as this responsible network contains anything of interest, the CIA, Mossad and other oppressive state actors will infiltrate your “trusted organisations” in order to have access to the goodies (if they have not done so already).

    Free, safe, choose one.

    1. I wouldn’t be as pessimistic. The “breaking the seal” protocol may be designed in such a way as to make any seal-breaking visible and transparent.

      After all, defending against a resourceful organization is extremely difficult even under current circumstances (think of bugged computers or sneak-and-peek warrants). Auditing and tamper-evident design are powerful deterrents against abusing or subverting the process, as the toy sketch below suggests.
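      As a toy illustration only (standard library, hypothetical AuditLog class), a hash-chained log makes any silent alteration or deletion of a seal-breaking record detectable:

      ```python
      # Toy hash-chained audit log illustrating "tamper-evident design":
      # every seal-breaking event extends a chain, so a silently altered
      # or deleted entry breaks verification. Illustration only.
      import hashlib, json, time

      class AuditLog:
          def __init__(self):
              self.entries = []
              self.head = "0" * 64  # genesis hash

          def append(self, event: dict) -> str:
              """Chain a new event to the current head and advance it."""
              record = json.dumps({"prev": self.head, "time": time.time(),
                                   "event": event}, sort_keys=True)
              self.head = hashlib.sha256(record.encode()).hexdigest()
              self.entries.append((record, self.head))
              return self.head

          def verify(self) -> bool:
              """Recompute the chain; any tampering breaks a link."""
              prev = "0" * 64
              for record, digest in self.entries:
                  if json.loads(record)["prev"] != prev:
                      return False
                  if hashlib.sha256(record.encode()).hexdigest() != digest:
                      return False
                  prev = digest
              return prev == self.head
      ```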

  2. Interesting post. While the point that privacy and anonymity technologies often aid malfeasors is one that is made quite often, I can believe that attending a child trafficking workshop gives one a more visceral appreciation of this fact.

    Ed Felten has a post where he expresses skepticism of technological solutions to these difficult moral questions: https://freedom-to-tinker.com/blog/felten/free-internet-if-we-can-keep-it/

    That said, your proposal is something I haven’t seen before.

    1. Thank you for the link. Ed Felten frames the dilemma nicely by quoting Hillary Clinton, who defers to technologists, and then pointing out that technology can’t be expected to miraculously balance our ideals and security requirements.

      I, for one, understand the limitations of technological solutions and don’t believe in the fair application of law on a global scale. The key to making the Internet better is social engineering: designing solutions that encourage good behavior and penalize misconduct. Of course, this applies equally well to individuals and governments.
