Depends on the AI. I don't see why it would be weird if the AI were like a human, with real emotions.
If it just pretended to have emotions, it would be odd, but I wouldn't blame the person. It still sounds better than total loneliness and might provide better company than imaginary people.
I kinda wish something like that existed. But I also don't. If it had emotions, you could hurt it like a real person, which defeats the purpose. It would also be easy to exploit. How could anyone tell you're not holding someone hostage inside your computer? And I believe that initially very few people would care, because "it's just a computer".