It’s a tool for the owners and a trap for the users. The sycophancy and malleability of chatbots make them not just unsuitable as replacements for therapists but actively, and often demonstrably, harmful: they validate problematic, spiraling, or psychotic thought patterns. I know therapy isn’t accessible for everyone, but in this context any actual human you can talk to is better than a chatbot. I would rather someone vibe code critical infrastructure with a chatbot than use it for mental health.