Abstract

Alternative dispute resolution (ADR) systems are becoming a mainstay of legal systems around the world, especially within justice systems suffering from significant backlogs and delay. While arbitration was once the bastion of most commercial law disputes, today mediation is more widely used in both public and private justice systems. The growth of mediation has prompted some to consider the wider use of online dispute resolution (ODR) platforms. However, ADR remains a newer mechanism for providing justice. Because many ADR systems are in fact reducing case backlogs, the focus has been on speed of resolution and not necessarily on procedural protections and the provision of justice. This demands that these systems not merely be replicated: as ADR moves online, lessons must be learned from prior implementations to ensure continued vigilance in protecting essential procedural safeguards. Much as ADR did at its inception, ODR providers often lack appropriate funding and procedural safeguards. One means of addressing the former is to reduce cost by automating portions of the system. Indeed, some argue that significant cost savings could be realized (and justice may be better served) by removing human neutrals from the equation; in other words, by fully automating justice. As ADR gains wider use, many commentators hypothesize that the next generation of ADR will be an ODR platform that uses an algorithm and has no neutral human decision maker. Assuming this is true (artificial intelligence dispute resolution systems already exist that not only use an algorithm but learn from prior actors), we must begin to ask: should a private provider of ODR be permitted to use an algorithm to dispense justice?
What public policy and ethical issues demand consideration? This Article seeks to respond to these issues by: (1) exploring current needs for improving access to justice; (2) analyzing existing systems that use online platforms to facilitate dispute resolution; (3) using case examples to highlight the potential for the wider use of ODR; (4) considering whether these systems increase access to justice in low-value disputes; and (5) suggesting potential pitfalls that may arise if ODR is not regulated in a manner that ensures fair and impartial systems. Ultimately, we argue that an effective and ethical ODR platform requires the use of algorithms to settle the more common disputes, but that due process protections are needed to guard against bias and improve access to justice.
