If you build it and they don’t come, it’s only a failure if you don’t know why. The best and brightest of Europe’s tech warriors meet in Valencia this week, and reading the programme of ‘self-organised’ sessions, there’s clearly a will to address a recurring problem: why do people under threat pass on technology specifically intended to help them?
The dominant theme at the Internet Freedom Festival (1–6 March) in Valencia is security, privacy and circumvention of surveillance, the abiding concern of the techies who convene there annually. Yet despite the global impact of the Snowden revelations and the agenda-changing achievements of WikiLeaks and other whistleblowers, the people they aim to help have sometimes been slow to adopt, even accept, their products and priorities.
The Festival’s aim is to “promote hands-on collaboration and synergies between those working on anti-surveillance and anti-censorship initiatives”. Attendees recognise that a journalist or an advocacy group will eventually go public with their protected information – otherwise there’d be no point in collecting it. But that’s not really their end of the deck.
By focusing on privacy and surveillance, their technical solutions respect the essential line between private and public information. Journalists focus on the need to cross that line more simply, seamlessly and securely than they can at present.
One of Valencia’s ‘self-organised’ sessions is led by the Committee to Protect Journalists, which regards security “as a continuum running from body armour to DDoS attack shields”. The same applies to security for advocacy groups who, like journalists and often alongside them, collect evidence of rights abuses, corruption and denial of information.
In the West, there’s understanding of the need to respect journalistic sources in pursuit of transparency and accountability, from this year’s Oscar-winning best picture Spotlight to followers of Edward Snowden’s enduring personal sacrifice. So why the reluctance to take up these tools?
Younger users are smarter with tech and highly engaged by the opportunities of easy access to channels with mass media impact. Older users are powerfully aware of security threats, having worked under surveillance and with endangered information sources for decades. Yet the tech often fails to meet the needs of either group.
The current issue of the academic journal IDS Bulletin may have the answer. It features a look at the challenges of adding new technology to transparency and accountability (T&A) initiatives. Written by Indra de Lanerolle and Christopher Wilson, it looks at how T&A tech tools were developed, chosen and finally deployed in South Africa and Kenya.
Their conclusion, no doubt recognisable to some tech advocates in Valencia, is that the tools are often chosen and applied with only limited understanding of their intended users in their intended contexts. The clue to their advice is in the title: Test It and They Might Come: Improving the Uptake of Digital Tools in Transparency and Accountability Initiatives.
The title references the dogged faith of farmer Ray Kinsella, hero of the magical realist book and movie Field of Dreams. De Lanerolle and Wilson, the latter co-founder of the engine room, one of the more innovative NGOs crossing the tech and media development divide, in turn reference the academic M. Lynne Markus. Two decades ago Markus, an expert in technological change in business, urged people to abandon their faith in “The Magic Bullet Theory in IT-Enabled Transformation,” and reassess their “failure-promoting” belief in the power of IT to guarantee change.
The quality of the work was irrelevant, she argued, if the end result was not designed for ‘implementability’. Developers’ unaided intuitions about users were often wrong, but relying on users to conceptualise tech alone was like building a house without an architect. Yet the alternative was as bad: to let the architect decide what kind of tech ‘house’ you should live in. “You’d do well to remember,” Markus added, that the ‘architect’ “isn’t planning to live there”.
De Lanerolle and Wilson say ‘build it and they will come’ has become a trope for failure to anticipate user needs and realities in software development. They highlight “failure of uptake,” the wasteful, not to say humiliating, rejection of digital tools declined by the very people the tools were intended to assist.
Rosemary McGee and Ruth Carlitz’s Learning Study on the Users in Technology for Transparency and Accountability echoes Markus’ recognition of the damaging effect of basing strategies on honestly researched but untested assumptions about intended users.
The processes by which the tools are selected involve complicated decisions – “What kind of tool? Build or buy? Open source or proprietary?” – and different decision-making models – “Top-down or bottom-up? With what degree of research, consultation or preparation?”
In fairness to the Valencia agenda, its drafters speak to journalists and activists who come to the process already committed to technical solutions and equipped to engage with them. De Lanerolle and Wilson’s research covered projects closer to the grassroots. Yet the scale of dysfunction is sobering. Almost half the cases they tracked effectively failed their users.
“These included the production of social media reporting systems which did not receive reports (and) SMS scoring platforms which did not receive SMS messages”. In a quarter of cases the NGO had little or no information regarding tool use. In two cases the tools were abandoned because of their complexity or the costs of deploying them.
Trialling during project planning helps organisations understand the limitations of tools in context and identify obstacles to user uptake. Shared documentation of trials and trialling methods could help organisations develop best, or at least better, practice. ICT project leaders need to ask: is there an existing alternative to developing new tools? The costs of failure in projects that adopt ‘off-the-shelf’ tools are much lower than with bespoke tool development.
An issue that could be addressed in Valencia this week is the “communication problem” between tech researchers and practitioners. The event, say organisers, “cross-pollinates information and knowledge with the goal of improving the services, strategies, and tools offered to the most vulnerable individuals on the front lines.”
De Lanerolle and Wilson also advise reaching out to peers, identifying existing initiatives with similar objectives, or simply searching the web to find tools ready to use. Trialling two or three tools is an obvious but effective means of identifying hidden challenges to implementation, and the earlier in the process the better.
And be prepared for failure. Some of the groups de Lanerolle and Wilson tracked learned from initial failure and improved their tools and projects in subsequent iterations. Yet only one project team had prepared itself for managing risk through iterative and adaptive design, the standard modus operandi of new tech start-ups elsewhere.
Ironically, several of the teams recognised the value of iterative development and perfectly understood the reasons behind the shortcomings of their new tools. But they lacked the capacity, authority, financial resources or time to invest in further development. This was time, money and opportunity wasted, and responsibility abandoned.
An understanding of the importance of planning for an iterative cycle of versions that succeed over time through testing should be the lasting takeaway from this week’s discussions in Valencia.
Wilson, C. and de Lanerolle, I. (2016) ‘Test It and They Might Come: Improving the Uptake of Digital Tools in Transparency and Accountability Initiatives’, IDS Bulletin 47(1). DOI: http://dx.doi.org/10.19088/1968-2016.110