Teaching old tricks to brand new dogs
It is not so much that new technology poses novel challenges to established governance paradigms, but that it poses challenges to those who must implement them. The tools are the same, but their application will be different. Their implementation, therefore, requires an open and progressive mindset, and a willingness to ask “Why not?” instead of “Why?”.
Take, for example, the crucible of corporate existence (in both the pre- and post-Distributed Ledger Technology worlds): the contract.
Historically the epitome of human interaction, the contract is now increasingly an arrangement achieved by algorithmic means. Given contract’s legal foundation as a “meeting of minds”, it would be easy to jump to the conclusion that the law of machine contracts must be very different to the human kind. Not so. As the Law Commission of England and Wales found in its recent work on smart contracts and digital assets, conventional contract law is easily capable of accommodating both smart contracts and smart legal contracts. As long as the lawyers are on board. Machines are, after all, simply vehicles of human expression: they will make agreements only where, when and how they have been instructed to do so. The autonomy remains with the instructing party, as has long been accepted in relation to vending and ticket-issuing machines, for example. An automated offer is no less a commitment to be bound for being made in digital form. And, more significantly, the need for such an offer to exist, alongside a corresponding acceptance, is as real as ever. It is difficult, therefore, to imagine how the rules of such engagement could change, or to make the case that they should.

There is no question that the board looks different and the pieces move in an unfamiliar way. But when IBM developed Deep Blue, allowing artificial intelligence to take on, and beat, humans (even Grandmasters) at chess, nobody suggested that the rules of the game needed to change. To have done so would, of course, have defeated the whole purpose of the exercise. It was the players who had to adapt their preparation, strategy and behaviour, often by playing deliberately sub-optimal moves that an artificial intelligence would, at least initially, find more difficult to anticipate and counter. This is the principal challenge to all aspects of governance in a world in which tech holds increasing sway: teaching old tricks to brand new dogs. Or doges.
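To return to the offer-and-acceptance point: purely by way of illustration (the names, prices and scenario below are invented for this post, not drawn from any real system), a few lines of Python show how an automated offer simply executes terms fixed in advance by the instructing party, just as a ticket machine does.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class StandingOffer:
    item: str
    price_pence: int  # terms fixed in advance by the instructing party

def accept(offer: StandingOffer, payment_pence: int) -> str:
    """Acceptance by conduct, as with a vending machine: tendering the
    price concludes the contract on the pre-authorised terms."""
    if payment_pence >= offer.price_pence:
        return f"Contract formed: {offer.item} at {offer.price_pence}p"
    raise ValueError("No contract: payment does not meet the offer's terms")

# The machine never decides to deal; its principal already has.
offer = StandingOffer(item="ticket", price_pence=250)
print(accept(offer, 250))  # -> Contract formed: ticket at 250p
```

The code makes no choices of its own: every term on which it will “agree” was settled by a human before it ran, which is precisely why the conventional analysis of offer and acceptance still applies.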
That is not to deny that change is both necessary and inevitable in governance terms: but what is required is of a different order to the fundamental restructuring that is often expected (and dreaded) in response to technological development, and to distributed ledger technology in particular. The thing that will shift in a world of greater automation is the topography of the contractual landscape; the rules will stay the same, but the patterns of their use will change. There is, for instance, no such thing as a recalcitrant computer (although it might sometimes feel like there is), so the enforcement of performance is less likely to be a pressing issue. The flipside, however, is that defective performance is likely to be more widespread, meaning that remedies aimed at correction and restoration will be sought more and more often. Rectification, in particular, is likely to be called on increasingly; or, rather, a form of rectification, in which a new smart contract is coded to alter the effects of a previous version, since it will not be possible to alter the original code itself.
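A deliberately simplified sketch (the ledger structure and function names below are hypothetical, not any real platform’s API) captures that corrective mechanism: rather than editing an immutable record, a later entry is appended to reverse the effect of the defective one.

```python
# Append-only "ledger": entries are never mutated or removed, mirroring the
# immutability of deployed smart-contract code and its recorded effects.
ledger: list[dict] = []

def append_entry(payer: str, payee: str, amount: int) -> int:
    ledger.append({"payer": payer, "payee": payee, "amount": amount})
    return len(ledger) - 1  # index stands in for an immutable transaction id

def rectify(defective_index: int) -> int:
    """'Rectification' here: append the mirror image of a defective entry,
    correcting its net effect without touching the original record."""
    bad = ledger[defective_index]
    return append_entry(bad["payee"], bad["payer"], bad["amount"])

mistake = append_entry("Alice", "Bob", 100)  # coded terms performed wrongly
rectify(mistake)                             # net effect corrected; history intact
print(ledger)
```

The original entry survives, but its consequences are unwound by the later one: rectification of effect rather than of record.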
The locus of liability will also spread. Achieving automation will often mean adding another party to the chain of contractual command in the form of a coder. And whilst successful automation reduces the risk of error in (properly instructed) contractual performance, coders are no more insulated from the risk of error than any contracting party or legal adviser. Mistake, misrepresentation and negligence, for instance, will not change, but will cast their net wider (subject to the interesting wrinkle of public interactions with publicly-deployed code).
These challenges are all germane to the administration and governance of Decentralised Autonomous Organisations (DAOs), which are essentially collections of automated instructions, agreements and potential agreements. This is, of course, smart contracting, smart legal contracting and contracting in aggregate, and the concerns of users are aggregated too: they seek not individual recognition, enforcement and protection, but the assurance that their co-ordinated endeavours will be treated in a legally effective and coherent way. But here, once more, what is needed is selection rather than invention. In setting out the conditions under which DAOs function, and the protections afforded to their creators, the law needs to draw analogies with existing organisational paradigms and identify the closest fit. Or, more likely, fits. Because there is no reason to suppose that DAOs will be any more homogeneous as a class than conventional organisations.
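Again purely as illustration (every name and threshold below is invented), “contracting in aggregate” can be sketched in a few lines: individual members’ coded votes feed a single collective rule that executes only when a pre-agreed threshold is met.

```python
class MiniDAO:
    """A toy 'organisation as an aggregate of coded agreements'."""

    def __init__(self, members: set[str], threshold: float = 0.5):
        self.members = members
        self.threshold = threshold   # constitutional rule fixed at creation
        self.votes: set[str] = set()

    def vote(self, member: str) -> None:
        if member in self.members:
            self.votes.add(member)   # each vote is an automated commitment

    def execute(self, action) -> bool:
        """Run the collective action only if aggregated votes pass the threshold."""
        if len(self.votes) / len(self.members) > self.threshold:
            action()
            return True
        return False

dao = MiniDAO(members={"a", "b", "c"})
dao.vote("a")
dao.vote("b")
dao.execute(lambda: print("Collective action executed by aggregate agreement"))
```

Even in this toy form, the familiar organisational questions arise: who set the constitutional rule, who may vote, and what happens when the coded rule and the members’ expectations diverge.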
The last decade has seen technological developments that are nothing short of tectonic in terms of their implications for human interaction on a collective, distributed and permissionless basis. In responding to this, common lawyers will not need to abandon what they know. But they will need to adapt that knowledge: when playing against the artificially intelligent, that is the really intelligent move to make.
-------------------------------------------
Professor Sarah C Green, Law Commissioner for Commercial and Common Law at the Law Commission of England and Wales.
The ECGI does not, consistent with its constitutional purpose, have a view or opinion. If you wish to respond to this article, you can submit a blog article or 'letter to the editor'.