“I call this the emotional alignment design policy. So the idea is that corporations, if they create sentient machines, should create them so that it's obvious to users that they're sentient, and so that they evoke appropriate emotional reactions in users. So you don't create a sentient machine and then put it in a bland box that no one will have emotional reactions to. And conversely, don't create a non-sentient machine that people will attach to so much and think it's sentient that they...
Oscar Horta of the University of Santiago de Compostela on how we can best help wild animals
The Sentience Institute Podcast
1 hour 19 minutes
4 years ago
“The main work that really needs to be carried out here is work at the intersection of animal welfare science and the science of ecology and other fields in life science… You could also build a career, not as a scientist, but say, in public administration or government. And you can reach a position in policy-making that can be relevant for the field, so there are plenty of different options there… Getting other interventions accepted and implemented would require significant lobbying work. And t...