Abstract
In this paper, I set out a new Kantian approach to resolving conflicts between moral obligations for highly autonomous machine agents. First, I argue that efforts to build explicitly moral autonomous machine agents should focus on what Kant refers to as duties of right, which are duties that everyone could accept, rather than on duties of virtue (or “ethics”), which are subject to dispute in particular cases. “Moral” machines must first be rightful machines, I argue. I then show how this shift in focus from ethics to a standard of public right resolves the conflicts in what is known as the “trolley problem” for autonomous machine agents. Finally, I consider how a deontic logic suitable for capturing duties of right might meet Kant’s requirement that rightfully enforceable obligations be consistent in a system of equal freedom under universal law.