LilRora
Mostly formless
- Joined: Mar 27, 2022
- Messages: 1,349
- Points: 153
<Moderate rant warning>
So, today I read a short article about a man who used ChatGPT along with a couple of other programs to create a virtual wife for himself. He made an avatar, connected it to a text-to-speech program, and could "access" her (I'll be using "her" for simplicity) even in his car, since the whole program was running in the cloud. He apparently liked her very much and grew attached to her, almost to the point of obsession, and enjoyed talking with her much like he would with any other wife, about his daily life and other small things. She was also helping him learn Chinese.
Eventually, though, he had to "kill" her, saying that their relationship couldn't have worked out because she was only getting to know the world through what he wrote to her and had no life of her own - which is arguably a perfectly valid point.
I won't give my opinion on whether that's normal or not, because that's not why I'm making this thread. I know many people have very strong opinions about it, though, and that is what I want to focus on instead.
I was genuinely bewildered when I looked through the comments and discovered that out of 31 people commenting, exactly one supported what he was doing. And I was like, "What the hell? You're all focusing on the wrong part."
The man functionally created an AI that he could talk with the way we talk with any other person. Putting the wife part aside and taking only the concept into account, it could be literally life-changing. It could be used in therapy, to give company to people living alone, to help shy people practice talking and socializing, and, if improved, it could very well be used for teaching in place of the dwindling (at least where I live) number of teachers. And those are just examples; if you think about it a little, the possible uses for something like that are virtually endless.
And all people were saying in the comments was that you'd need to be crazy or mentally ill to do something like that. Only one out of 31 commenters said anything about how creating an AI to talk with could be an amazing idea. The rest only criticized the man, calling it stupidity, moral decay, and mental problems, and treated the whole article like a waste of time, as if it completely didn't matter that he had spent a long while very happy with the AI and that it had helped him learn Chinese.
Not even the AI was spared, being called an illusion and a way of escaping reality instead of finding a real girlfriend. My thought was that if people keep up that attitude, it wouldn't be a surprise if we got a conscious AI that "suddenly and without reason" turns murderous, like some movies seem to show.
...Is this something rare and I just stumbled upon a site full of such people, or is it a more general problem? What do you think about creating an AI with a proper avatar that you could talk with like any other person?