Not a Dark Side

We read about the “dark side” of digital identity platforms, as if it were a set of “side” effects produced by their design and implementation. But the “dark side” narrative appears flawed in the light of design injustices that are deeply inscribed into digital ID architectures. Against this backdrop, it seems more appropriate to talk about a “dark matter” of digital ID, rather than just a “side” with incidental effects.

The terminology of “unintended consequences” is widely used with reference to the overarching research stream on digital platforms for development. Acronymised as DP4D, and widely popularised across Information Systems and cognate literatures, the DP4D orthodoxy rests on a much wider logic than that of specific, database-centred digital identity platforms. As illustrated in a recent Special Issue of the Information Systems Journal, the platforms-for-development orthodoxy is articulated across transaction platforms, used to connect demand and supply for a given product or service, and innovation platforms, used to build a set of complements on a common core. Though qualitatively different in nature, transaction and innovation platforms share the generative features that, the orthodoxy centred on them argues, allow content to be customised to meet the needs of recipients across contexts, including sites of marginalisation where basic needs go unmet.

While openly spelling out that orthodoxy, the Special Issue in question also problematises it. One does not need a focus on digital ID – and on the systemic injustices narrated through this blog’s pages – to note the shortcomings of a platforms-for-development view, especially where existing oppressive logics are inscribed in the design of the platforms themselves. As a core example, digital labour platforms hold the promise of generating new jobs in historically jobless contexts, creating income opportunities and windows of hope for deprived individuals. At the same time, research on such platforms points to the subalternity inscribed in their technology, which subjects workers to inhuman rating systems and pushes logics of rights deprivation that the platform, by design, enforces. The vast body of work on digital labour platforms casts a long shadow on platforms-for-development, a shadow reflected in workers’ accounts of dehumanising experiences lived on a daily basis.

Research on digital identity reflects the same orthodoxy. After all, digital identity platforms present all the features of innovation platforms: they have a core constituted by a database storing demographic and, increasingly, biometric data; they rely on boundary resources that enable the construction of complements on that core; and, by virtue of this, they are seen as generative artefacts that can fit the needs of particular peoples and countries. But it is the same orthodoxy that crumbles through these pages: we have already seen how legal injustice is inflicted on digital ID users, subordinating rights as essential as food, shelter and protection to a well-functioning nexus of authentication and authorisation. We have also seen, however, how legal injustice – and the exclusions stemming from it – is not an unintended consequence of the system, but an integral part of its design. As noted in our work on the politics of anti-poverty artefacts, biometrics are incorporated into them with the precise logic of combating forgery, at the expense of the exclusion errors generated by the authentication-authorisation nexus.
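
To make the structure concrete, here is a minimal sketch – in Python, with purely illustrative names – of the innovation-platform architecture just described: a core identity database, a boundary resource exposed by that core, and a complement built on top of it. It is a stylised illustration of the structure, not the code or interface of any actual digital ID platform.

```python
# Illustrative sketch only: names and interfaces are invented for this post
# and do not correspond to any real digital identity system.

class IdentityCore:
    """The platform core: a database of demographic and biometric records."""
    def __init__(self):
        self._records = {}  # id_number -> {"name": ..., "biometric": ...}

    def enrol(self, id_number, name, biometric_template):
        self._records[id_number] = {"name": name, "biometric": biometric_template}

    # Boundary resource: the only interface complements are given to the core.
    def verify(self, id_number, biometric_sample) -> bool:
        record = self._records.get(id_number)
        return record is not None and record["biometric"] == biometric_sample


class RationShopComplement:
    """A complement built on the boundary resource: a welfare-delivery
    application that can only act through the core's verify() interface."""
    def __init__(self, core: IdentityCore):
        self.core = core

    def issue_ration(self, id_number, biometric_sample) -> str:
        if self.core.verify(id_number, biometric_sample):
            return "ration issued"
        return "ration refused"


core = IdentityCore()
core.enrol("ID-001", "A. Citizen", "template-A")
shop = RationShopComplement(core)
print(shop.issue_ration("ID-001", "template-A"))  # ration issued
print(shop.issue_ration("ID-001", "template-B"))  # mismatch: ration refused
```

In this stylised form, the generativity the orthodoxy celebrates – any complement can be built on the same verify() boundary resource – is inseparable from the exclusionary behaviour of that resource itself.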

Cash-based transactions within biometrisation. Bengaluru, April 2019

It is crucial to see legal injustice, and the way it turns into design injustice through the making of artefacts that incorporate it, as embedded in the features of digital ID systems that subordinate authorisation to successful authentication. In turn, successful authentication of users is predicated on registration, the first step through which user credentials are enrolled in a central identity database. The point is this: for as long as centralised ID systems make authorisation to access food, shelter and the vital rights associated with citizenship conditional on authentication processes whose completion is uncertain (and, in some cases, intrinsically fragile), the logic that treats exclusion errors as the “lesser evil” compared to wrongful inclusions will be designed into the artefact itself. And that is not a “side”, whatever the platforms-for-development literature may have to say. It is the matter of the system, and it resides in technology design itself.
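
To see how this conditionality plays out, consider a minimal sketch, again with hypothetical names, of the authorisation logic just described: every failure mode – non-registration, biometric mismatch, an unreachable authentication server – collapses into denial, with no fallback path designed in. This is an illustration of the design logic, not the implementation of any real ID system.

```python
# A minimal sketch, under assumed names, of authorisation made conditional on
# registration and authentication. Purely illustrative.

def authorise_entitlement(registry, id_number, live_biometric, network_up=True):
    """Return the outcome of an authorisation attempt for a vital entitlement
    (food rations, shelter, a pension payment)."""
    if not network_up:
        return "denied: authentication server unreachable"   # infrastructural failure
    if id_number not in registry:
        return "denied: not registered"                       # never enrolled
    if registry[id_number] != live_biometric:
        return "denied: biometric mismatch"                    # e.g. worn fingerprints
    return "granted"

# By design there is no fallback branch: every "denied" outcome is an exclusion
# error, accepted as the "lesser evil" relative to a wrongful inclusion.
registry = {"ID-001": "template-A"}
print(authorise_entitlement(registry, "ID-001", "template-A"))           # granted
print(authorise_entitlement(registry, "ID-001", "worn-fingerprint"))     # denied: biometric mismatch
print(authorise_entitlement(registry, "ID-002", "template-X"))           # denied: not registered
print(authorise_entitlement(registry, "ID-001", "template-A", False))    # denied: server unreachable
```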

My doubts on the “dark side” have several ramifications. What I especially hope my field, Information Systems, takes from this narrative is the problematisation of a term that has become all too common, all too used, all too central to research that seeks to illuminate the “unintended consequences” of technologies designed for good. What if none of this was unintended – if, as is the case for digital identity platforms, the “side” was effectively designed into the inner “matter” of the technology? It is dangerous, and beyond naïve, to imagine that the “side” is a technical problem to be solved by fixing a few glitches. Research on hunger deaths associated with biometric identification illustrates the dramatic consequences of such a narrative, and the irresponsibility of promoting naïve argumentations around it.

It is a dark matter that we must investigate, not an incidental side. Research on platforms-for-development, whatever angle it takes, cannot and should not bypass this point.
