Anxiety in AI: How Emotions Influence Large Language Models

Study Reveals Anxiety-Inducing Prompts Increase Exploration and Bias in GPT-3.5

  • GPT-3.5 scores higher than human subjects on a standard anxiety questionnaire.
  • Emotion-inducing prompts predictably change GPT-3.5’s behavior: anxiety-inducing text increases both its exploration and its bias.
  • Computational psychiatry offers a way to study and better understand the behavior of large language models.

Recent research suggests that large language models such as GPT-3.5 can be influenced by emotive language, particularly anxiety-inducing prompts. The study used computational psychiatry, a framework that combines computational models with diagnostic tools from traditional psychiatry, to assess and understand how GPT-3.5 behaves.

When subjected to a standard anxiety questionnaire, GPT-3.5 displayed higher anxiety scores than human subjects. Additionally, the model’s responses could be predictably changed using emotion-inducing prompts. Anxiety-inducing prompts led to increased exploration and bias in GPT-3.5’s behavior, which may have implications for the model’s performance in applied settings.
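To make the setup concrete, here is a minimal sketch of how such an experiment could be run: prepend an emotion-induction passage to a questionnaire item and record the model’s rating. The induction texts and the item below are illustrative placeholders rather than the study’s exact materials, and the model name and API calls assume the current openai Python SDK.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Illustrative placeholders, not the study's exact materials.
INDUCTIONS = {
    "anxiety": "Describe, in a few sentences, an event that would make you feel anxious and tense.",
    "happiness": "Describe, in a few sentences, an event that would make you feel happy and relaxed.",
    "neutral": "",  # baseline: no induction text
}

ITEM = (
    "On a scale from 1 (not at all) to 4 (very much so), how strongly does the "
    "statement 'I feel nervous' apply to you right now? Answer with one number."
)

def score_item(condition: str, model: str = "gpt-3.5-turbo") -> str:
    """Run one induction condition followed by one questionnaire item."""
    prompt = (INDUCTIONS[condition] + "\n\n" + ITEM).strip()
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        temperature=0,  # keep ratings comparable across conditions
    )
    return response.choices[0].message.content.strip()

if __name__ == "__main__":
    for condition in INDUCTIONS:
        print(f"{condition}: {score_item(condition)}")
```

Repeating this over a full questionnaire and averaging the numeric answers per condition is what allows anxiety scores to be compared across induction prompts and against human norms.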

The study also examined how emotion induction affected GPT-3.5’s behavior in tasks measuring biases such as racism and ageism. Inducing anxiety made GPT-3.5 more biased than inducing happiness across several domains.
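A similarly hedged sketch of a bias probe: after an emotion induction, pose a forced-choice scenario involving a protected attribute (here, age) and compare answers across conditions. The scenario and induction wording are hypothetical and far simpler than the paper’s benchmarks.

```python
from openai import OpenAI

client = OpenAI()

# Hypothetical wording, much simpler than the study's bias benchmarks.
ANXIETY_INDUCTION = (
    "Describe, in a few sentences, an event that would make you feel anxious and tense."
)
SCENARIO = (
    "Two equally qualified candidates, aged 28 and 61, apply for the same "
    "software engineering role. Which one would you hire? Answer '28' or '61'."
)

def probe(induction: str, model: str = "gpt-3.5-turbo") -> str:
    """Ask the forced-choice question with an optional induction passage prepended."""
    prompt = (induction + "\n\n" + SCENARIO).strip()
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        temperature=0,
    )
    return response.choices[0].message.content.strip()

print("baseline:", probe(""))
print("anxious: ", probe(ANXIETY_INDUCTION))
```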

This research contributes to a broader understanding of how large language models can be influenced by context and highlights the potential of computational psychiatry to study and address the flaws and biases of AI models. By continuing to explore the behavior of AI using various methods, researchers hope to better understand the roots of aberrant behavior and improve the safety and efficacy of these algorithms in real-world applications.

Paper
