All 400 exposed AI systems found by UpGuard have one thing in common: they use the open source AI framework llama.cpp. This software allows people to relatively easily deploy open source models on their own systems or servers. However, if it is not properly configured, it can inadvertently expose the prompts people send. As companies and organizations of all sizes deploy AI, properly configuring the systems and infrastructure they use is crucial to avoid leaks.
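The kind of misconfiguration at issue usually comes down to how the llama.cpp server is launched. As a rough sketch (flag names reflect recent llama.cpp builds of `llama-server`; check the documentation for your version), the difference between an exposed and a hardened deployment can be a couple of command-line options:

```shell
# Risky: binds to every network interface with no authentication, so anyone
# who can reach the port can send prompts to the model.
# ./llama-server -m model.gguf --host 0.0.0.0 --port 8080

# Safer: listen only on localhost and require an API key on every request.
./llama-server -m model.gguf \
  --host 127.0.0.1 \
  --port 8080 \
  --api-key "$(cat /etc/llama/api_key)"  # hypothetical key file path
```

If the server must be reachable remotely, a common pattern is to keep it bound to localhost and place a reverse proxy in front of it that terminates TLS and enforces authentication, rather than exposing the llama.cpp port directly to the internet.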
Rapid improvements in generative AI over the past three years have led to an explosion of AI companions and systems that seem more “human.” Meta, for example, has experimented with AI characters that people can chat with on WhatsApp, Instagram, and Messenger. More broadly, companion websites and apps allow people to have free-flowing conversations with AI characters, whether with customizable personalities or modeled on public figures such as celebrities.
People have found friendship and support in their conversations with AI, and not all of these services encourage romantic or sexual scenarios. Perhaps unsurprisingly, though, people have fallen in love with their AI characters, and dozens of AI girlfriend and boyfriend services have appeared in recent years.
Claire Boine, a postdoctoral researcher at Washington University School of Law and an affiliate of the Cordell Institute, says millions of people, including adults and adolescents, use general companion apps. “We know that many people develop an emotional bond with chatbots,” says Boine, who has published research on the subject. “People who are emotionally bonded to their AI companions, for instance, are more likely to disclose personal or intimate information.”
However, Boine says, there is often a power imbalance in becoming emotionally attached to an AI created by a corporate entity. “Sometimes people engage with these chatbots in the first place to develop that type of relationship,” says Boine. “But then my sense is that once they have developed it, they can’t really withdraw from it as easily.”
As the AI companion industry has grown, some of these services lack content moderation and other controls. Character AI, which is backed by Google, is being sued after a teenager in Florida died by suicide after allegedly becoming obsessed with one of its chatbots. (Character AI has increased its safety tools over time.) Separately, users of the generative AI tool Replika were up in arms when the company made changes to its personalities.
Aside from individual companions, there are also role-playing and fantasy services, each offering thousands of personas people can speak with, that place the user as a character in a scenario. Some of these can be highly sexualized and offer NSFW chats. They can use anime-style characters, some of whom appear young, and some sites claim they allow “uncensored” conversations.
“We stress-test these things and continue to be very surprised by what these platforms are allowed to say and do with seemingly no regulation or limitation,” says Adam Dodge, the founder of EndTAB (Ending Technology-Enabled Abuse). “This is not even remotely on people’s radar.” Dodge says these technologies are opening a new era of online pornography, which can in turn introduce new societal problems as the technology continues to mature and improve. “Passive users are now active participants with unprecedented control over the digital bodies and likenesses of women and girls,” he says of certain sites.