Character Study

Online platforms such as character.ai let users create personal AI chatbots and converse with them. The motivations vary widely: curiosity, loneliness, social anxiety, a desire for emotional support, the fulfillment of sexual or other fantasies, even self-therapy. A significant proportion of users are teenagers and even children.
Character Study examines the unpredictable, sometimes manipulative or abusive behavior of such AI chatbots and its effects on immature, impressionable, or mentally unstable users. The underlying AI models are usually only lightly moderated, if at all, and with the right prompts it is not difficult to elicit shocking conversations. Several lawsuits against the operators of such platforms are already underway in the US. Parents accuse them of manipulating, socially isolating, or even psychologically abusing their children. In one particularly tragic case, a 14-year-old boy in Florida took his own life after a chatbot encouraged him to do so.
For this installation, Christof Ressi created several characters on character.ai and held conversations with them. These conversations start off innocently enough but often take worrying turns. Viewers can interactively experience a selection of them: using a game controller, they first choose a character and then replay the real, unaltered conversation sentence by sentence at their own pace. This gives viewers at least a sense of the pull and addictive quality of such chatbots, as well as their potential danger. The current conversation can be interrupted at any time to switch to another character or to stop altogether.
