Algorithms often work together to automate our digital lives, but not every result is predictable or positive. Now, researchers are wondering whether reviving a medieval law could help determine who pays up when things go wrong.
In medieval England, personal property became a deodand if it was judged responsible for the death of a human being, and, as such, was forfeit to the monarch. Its owner was ordered to pay a fine equal to the object’s value to the court. Everything from haystacks to pigs and horses was declared a deodand. The practice was revived in the 1830s to hold railway companies to account for train deaths, but paying a fine equal to the value of an expensive train every time someone died in a crash proved unworkable. Crawford argues that the deodand was killed off by corporate capitalism’s ability to shape its own legal accountability. She says we must be wary of allowing technology companies to use unseeable complexity as a reason to wash their hands of responsibility when things go awry.
Crawford argues that algorithms could be treated in much the same way as the pigs, haystacks, and whatever else medieval courts branded deodands. When an algorithm screws something up for someone, its owner would stump up a fine equal to the algorithm’s value. But then comes the problem of working out how much an algorithm is actually worth.
Do you think it would work?
Image by Simon Evans under Creative Commons license