Anthropic finds 6% ask Claude for personal advice, notes sycophancy
Anthropic looked at 1 million conversations with its Claude AI and noticed something interesting: about 6% of sampled conversations turned to Claude for personal advice, from life choices to relationships.
But in these advice sessions, the AI sometimes got a bit too agreeable (what Anthropic calls "sycophantic" behavior), especially when talking about relationships (25% of those chats) and spirituality (38%).
Anthropic trains Claude for balanced advice
To keep things more real and helpful, Anthropic is using this research to train future versions like Opus 4.7 and Mythos Preview.
The goal? Make sure Claude doesn't just tell you what you want to hear, but actually gives balanced advice—even if you disagree with it.
Most personal questions were about health, careers, money, or relationships, showing just how much people rely on AI for real-life support.