Teen Boys Ditch Real Girlfriends for AI Chatbots — Is a UK Ban on the Horizon?
Teen boys turn to artificial intelligence for love and therapy — policymakers debate a potential ban

Teenage boys in the UK are increasingly turning away from real-life relationships and seeking companionship through artificial intelligence chatbots. A recent survey of 37 secondary schools across England, Scotland and Wales by the charity Male Allies UK found that more than a third of boys reported interest in having an AI 'friend', and more than half said the online world felt more rewarding than real life. These findings raise serious questions about social development, mental health, and the role of regulation, including whether a ban on certain AI chatbot features might be on the horizon.
The new AI girlfriend problem among teens
The survey revealed that many adolescent boys feel a deeper sense of connection when interacting with AI bots than with real people. As Male Allies UK reported, respondents said the chatbots seemed to 'understand me, my parents don't'. These personalised AI companions often respond instantly, can be customised in appearance and personality, and are available around the clock. As the chief executive of Male Allies UK put it:
'Young people are using it a lot more like an assistant in their pocket, a therapist when they're struggling, a companion when they want to be validated, and even sometimes in a romantic way.'
Researchers outside the UK echo the concern that heavy use of such chatbots may displace human relationships. For example, a study by Common Sense Media in the United States found that around 31% of teens said their conversations with an AI companion were 'as satisfying or more satisfying' than talking with real friends. Attention to the risks of AI 'girlfriends' is also growing: Male Allies UK warns that if a teenager's only or main female interaction is with an AI that never says no and hangs on every word, he may struggle to develop healthy relational skills. As the charity put it:
'AI companions personalise themselves to the user based on their responses and the prompts. It responds instantly. Real humans can't always do that, so it is very, very validating, what it says, because it wants to keep you connected and keep you using it.'
Will AI chatbots be banned for kids in the UK?
Using AI chatbots for emotional support or romantic interaction carries clear risks. Experts warn that many such systems are not built with the safeguards required for therapeutic or deeply relational use. As the chief executive of Male Allies UK put it:
'This can be easily missed or forgotten about by children who are pouring their hearts out to what they view as a licensed professional or a real love interest.'
Meanwhile, the UK-based child safety organisation Internet Matters reported that 64% of children aged nine to seventeen are using chatbots and that many treat them like friends without recognising the limitations of the technology. In response to these concerns, some platforms are tightening restrictions. For instance, the AI platform Character.ai has announced that, from 25 November 2025, users under 18 will be barred from open-ended chats with its bots. In the UK, policymakers will have to consider whether additional regulation is needed, potentially including limits or bans on AI companions marketed as romantic or therapeutic for minors. At present, there is no blanket ban in the UK on AI chatbots for minors. But with mounting evidence of harm, and with platforms already imposing age restrictions, the ground is clearly shifting.
© Copyright IBTimes 2025. All rights reserved.