Judges have a brand-new bag—an algorithmic accessory in criminal
adjudication. It scores criminal defendants, aiming to inform judges which
defendants are likely reoffenders or flight risks and which are not. The
downsides, however, include that the algorithms score defendants primarily
on the basis of other defendants’ (mis)conduct and that defendants of certain
races effectively receive worse scores than others. This article explores these
algorithmic developments in criminal courts across the country and makes
four contributions: (1) a survey and preliminary application of judicial ethics
to this development; (2) a preliminary moral argument, informed by related
judicial ethics and legal standards, suggesting that judges should use these
algorithmic tools only to help, not hurt, individual defendants; (3) an
approach to judicial decision-making in the shadow of structural injustice that
promises to inflict less algorithmic harm on defendants and their family
members; and (4) a technical constraint on algorithmic design that ensures
equal (indeed, better than equal) protection on the basis of race.