ChatGPT has feelings about you. Or, at least, it pretends to.
ChatGPT is an artificial intelligence (AI) language model that provides conversation-like responses to inquiries by drawing on a vast database of written text. And it has been designed to express emotions when it talks to you.
If you ask ChatGPT, it will explain that “As an artificial intelligence, I don’t have feelings or emotions. I don’t experience the world the way humans do.” At the same time, it happily admits that it can simulate all kinds of sentiments, from joy to frustration, to better engage users in “a realistic interaction”.
Mimicking human feeling goes deeper than this, though. It has important political and ethical implications, problems that go beyond the by-now well-rehearsed errors people have discovered in ChatGPT’s model. In a recently published research note in Sociology, I sat down to talk to ChatGPT about itself, reflexivity, AI ethics, and what it means for knowledge work that ChatGPT seems to feel the way it does.