Moral judgment about human and robot agents in personal and impersonal dilemmas
CHU Hua-dong 1,2, LI Yuan-yuan 1, YE Jun-hui 3, HU Feng-pei 1, HE Quan 4, ZHAO Lei 1*
1 School of Management, Zhejiang University of Technology, Hangzhou, 310023, China
2 Zhijiang College of Zhejiang University of Technology, Shaoxing, 312030, China
3 Zhejiang Police College, Hangzhou, 310053, China
4 School of Politics and Public Administration, Zhejiang University of Technology, Hangzhou, 310023, China
Abstract: In the current study, two experiments investigated people's moral judgments about human and robot agents in personal and impersonal dilemmas. The results are as follows: (1) In the impersonal dilemma (an autonomous-vehicle dilemma), people applied the same moral norms to human and robot agents: they held the same expectation about which action the agent should take (the utilitarian one) and gave the same moral evaluations (permissibility, rightness, and blame) of the agent's actual action. (2) In the personal dilemma (the footbridge dilemma), people applied different moral norms to human and robot agents. Although overall people expected both human and robot agents to choose the deontological action, robot agents were more strongly expected than humans to take the utilitarian action, and when they did so they received more favorable evaluations (higher permissibility and lower blame) than their human counterparts.
CHU Hua-dong, LI Yuan-yuan, YE Jun-hui, HU Feng-pei, HE Quan, ZHAO Lei. Moral judgment about human and robot agents in personal and impersonal dilemmas [J]. Chinese Journal of Applied Psychology, 2019, 25(3): 262-271.