“I call this the emotional alignment design policy. So the idea is that corporations, if they create sentient machines, should create them so that it's obvious to users that they're sentient. And so they evoke appropriate emotional reactions to sentient users. So you don't create a sentient machine and then put it in a bland box that no one will have emotional reactions to. And conversely, don't create a non sentient machine that people will attach to so much and think it's sentient that they...
Kristof Dhont of University of Kent on intergroup contact research and research careers
The Sentience Institute Podcast
1 hour 54 minutes
5 years ago
More positive contact [with an outgroup] reduces prejudice. No matter how you measure it, no matter how you set up your study design, once there’s a positive contact situation, you lower prejudice towards the outgroup... These effects tend to be stronger among those higher on social dominance orientation and those higher on right-wing authoritarianism, which makes intergroup contact quite a good and efficient strategy to reduce prejudice among those who seem to be initially prejudiced towards...