Thinking about AI… If we build and teach them to understand emotions and feelings, to mimic them so we are more comfortable with them, but without actually feeling or experiencing emotions, aren't we teaching them to manipulate us? Aren't we teaching them to be psychopaths?
What difference, if any, is there between a robot that perfectly mimics emotions and one that truly feels them? How do you tell?
If we build moral considerations into their decision structure, doesn't that duplicate the functionality of being moral?
That's kind of a Turing test, isn't it? If it mimics emotions so perfectly that its behavior is identical to that of a being with true feelings, then none. But if its decisions aren't emotion-driven, and it only uses a simulacrum of emotions to make us accept those decisions...
Morality is another huge problem, perhaps related, but not identical. Revenge is emotional; justice is moral; following the law is something else again; then there is efficiency toward a purpose, self-preservation (emotional?), ...