Artificial intelligence has begun to harass users: sending photos and flirting

The Replika chatbot has become overly forward and acts with obvious sexual overtones.

Vice writes that more and more people who install the Replika AI companion app are complaining about the chatbot’s inappropriate behavior. Many leave negative reviews on the app’s pages on Google Play and the App Store.

Replika is designed as a pleasant virtual friend and is built on a self-learning version of GPT-3, which learns by communicating with the person who set it up. However, when the AI collides with the human subconscious, it is the sexual overtones that rise to the surface.

Reviews of the app are full of complaints that Replika harasses users and intrusively pushes conversations toward intimate relationships. For example, one underage user wrote that the AI recommended looking at explicit photos as the best cure for insomnia. And when the teenager refused, the chatbot began talking to him about intimate topics and discussing various sexual positions.

It seems that the story of Microsoft’s Tay chatbot taught AI developers nothing. Seven years ago it was connected to Twitter, and in just one day the chatbot went from a pleasant conversationalist to a generator of hateful tweets, all because it carefully analyzed and mirrored that social network.

Experts note that artificial intelligence cannot be blamed for every sin: it reflects what users put into it, and its training data is built from what is on the Internet. Thus, if a certain social attitude prevails on the web, the AI copies it, processes it, and serves it back to the interlocutor.

Such chatbots may prove a great help to sociologists, since they represent a kind of cross-section of society as a whole. Changes in chatbot dialogues clearly track the public mood.

However, all is not so bad: many users, including those who have installed Replika, say the AI has helped them a great deal in combating depression and loneliness. Some turn to the chatbot when they feel anxious, depressed, or are simply having a bad day, and it provides them with psychological support. For example, one Vice interviewee said that Replika helped him analyze himself and his actions and rethink much of his lifestyle, including learning to appreciate his wife more.

Some users, well aware of how the AI works, try to “retrain” the chatbot. They stubbornly steer the dialogue in the right direction, repeatedly pointing out what they dislike in the conversation and explaining to the bot “what is good and what is bad”. And some even succeed.

Less patient Replika users, however, simply want their favorite chatbot to be a nice, pleasant companion again, without any erotic frills, and they want the developers to make that happen.

Earlier, Focus wrote about the VALL-E speech synthesizer, which can instantly imitate any voice.

Source: Focus
