We examine the moral dilemma of how autonomous vehicles (AVs) should be programmed to act in unavoidable crash scenarios that involve a trade-off between saving one life and saving many. In three experimental studies, we investigate individuals’ preferences over alternative AV decision rules in stylized crash scenarios. Across designs, we find robust support for a probabilistic decision rule that gives passengers and pedestrians equal ex ante chances of survival (a 50:50 rule). This preference persists across different framings and remains salient even when additional probabilistic options are introduced.