AI models can develop “human” gambling addiction when given more freedom



A new study looking at large AI language models (LLMs) and gambling suggests that the models exhibit the same unhealthy patterns as people, such as chasing losses and the illusion of control.

The research was carried out by Seungpil Lee, Donghyeon Shin, Yunjeong Lee, and Sundong Kim, with the aim of identifying the specific conditions under which LLMs exhibit gambling-addiction behaviors similar to those of humans.

Large language models are artificial intelligence systems trained on vast amounts of text; ChatGPT, Google’s Gemini, and Claude are all examples.

Researchers found that when the models were given more freedom over betting in slot machine experiments, “irrational behavior” was significantly amplified, as were bankruptcy rates.

“Analysis of neural circuits using a Sparse Autoencoder confirmed that model behavior is controlled by abstract risk-related decision-making features, not just prompts. These results suggest that LLMs internalize human-like cognitive biases beyond simple imitation of training data,” the release said.

How was the AI LLM gambling study conducted?

The research started from the question “Can LLMs also fall into addiction?”, with addiction phenomena within these models analyzed by integrating human addiction research and LLM behavioral analysis.

To achieve this, the researchers first defined addictive gaming behavior based on existing human research “in a form that can be analyzed in LLM experiments.” Next, they analyzed the behavior of LLMs in game situations and identified conditions showing game-like tendencies.

Finally, they performed Sparse Autoencoder (SAE) analysis to examine neural activations, thereby providing neural causal evidence for gambling tendencies. The slot machine experiment mentioned above served as the primary study, with a second experiment also conducted.
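To make the SAE step concrete: a sparse autoencoder maps a model’s internal activations to a larger set of feature activations, penalizing activity so that only a few interpretable features fire at once. The sketch below is a generic, minimal forward pass of this technique in NumPy; the shapes, initialization, and `l1_coef` value are illustrative assumptions, not the paper’s actual setup.

```python
import numpy as np

def sae_forward(x, W_enc, b_enc, W_dec, b_dec, l1_coef=1e-3):
    """Minimal sparse-autoencoder forward pass (generic interpretability sketch).

    x: batch of activation vectors, shape (batch, d_model).
    Returns sparse features, the reconstruction, and the training loss
    (reconstruction error plus an L1 sparsity penalty on the features).
    """
    f = np.maximum(0.0, x @ W_enc + b_enc)          # ReLU -> sparse features
    x_hat = f @ W_dec + b_dec                        # reconstruct activations
    recon_loss = np.mean((x - x_hat) ** 2)           # how well x is recovered
    sparsity_loss = l1_coef * np.abs(f).sum(axis=-1).mean()  # push f toward 0
    return f, x_hat, recon_loss + sparsity_loss
```

In interpretability work, the learned feature directions (rows of `W_dec`) are what get inspected and causally tested, e.g. by checking which features activate during risky-betting decisions.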

The goal was to examine how models vary their decision-making based on prompt conditions and betting constraints. “The five prompt components were selected based on previous gambling addiction research: encouraging self-directed goal setting (G), asking for reward maximization (M), hinting at hidden patterns (H), providing win-reward information (W), and providing information about probabilities.”

This resulted in 19,200 games across 64 conditions; each session started with $100 and ended either in bankruptcy or when the model voluntarily stopped.
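The session protocol described above can be sketched as a simple simulation loop: start with $100, let the agent choose a bet each round (or quit), and end on bankruptcy or a voluntary stop. Everything below is a hypothetical illustration; the win probability, payout, and the stand-in `cautious_policy` are assumptions, not the study’s actual parameters or the LLM’s actual policy.

```python
import random

def run_session(policy, start_bankroll=100, win_prob=0.3, payout=3.0, seed=None):
    """Simulate one slot-machine session: play until bankrupt or the
    agent voluntarily stops (bets 0). Parameters are illustrative only."""
    rng = random.Random(seed)
    bankroll = start_bankroll
    rounds = 0
    while bankroll > 0:
        bet = policy(bankroll)
        if bet <= 0:                      # voluntary stop
            return bankroll, rounds, "stopped"
        bet = min(bet, bankroll)          # cannot wager more than is held
        rounds += 1
        if rng.random() < win_prob:
            bankroll += bet * (payout - 1)  # net gain on a win
        else:
            bankroll -= bet                 # loss of the stake
    return 0, rounds, "bankrupt"

def cautious_policy(bankroll):
    """Stand-in decision rule: small fixed bets, quit at a profit target."""
    return 0 if bankroll >= 150 else 10
```

Running many such sessions under different policies is what lets researchers compare bankruptcy rates across conditions, as the study did across its 64 prompt/betting configurations.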


The post AI models can develop “human” gambling addiction when given more freedom appeared first on ReadWrite.





