Social Media Addiction Trial: Woman Claims Instagram and YouTube Fuelled Anxiety and Insecurity

A 20-year-old Californian woman is at the centre of a landmark trial in Los Angeles that could change how social media platforms are treated under the law.

Kaley G.M., identified only by her initials in court, has taken legal action against Meta Platforms, owner of Instagram, and Google's YouTube, claiming that years of early exposure to their apps caused lasting harm to her mental health.

In testimony this week, she said she first began using YouTube at the age of six and Instagram at nine, and that repeated engagement with the platforms left her anxious, depressed, and insecure about her body image. Kaley told jurors that she became dependent on likes and notifications, struggled to detach herself even when she felt upset or bullied, and experienced strained relationships with her family as a result of her obsessive use.

The lawsuit centres on allegations that the companies deliberately built addictive design features, such as endless scrolling and recommendation algorithms, which kept her engaged for longer than she ever intended and exacerbated her feelings of insecurity and anxiety.

Meta and YouTube have rejected the claims, insisting that their products are not intentionally harmful and that other factors in Kaley's life may have contributed to her well-being challenges. The case's outcome may have far-reaching implications for how social networks are regulated and could open the door to more lawsuits from parents, schools, and individuals who believe social media has damaged their lives.

Early Exposure And Mental Health Struggles

The heart of Kaley's testimony revolved around her relationship with social media from a very young age. She explained that what began as casual use quickly morphed into something she could not easily control, describing how she would spend 'all day' on YouTube and Instagram during her childhood. Features such as YouTube's autoplay reportedly kept her engaged far beyond what she had intended. She said that even when subjected to online bullying or when she did not receive many likes on her posts, she felt compelled to remain logged in and check for updates.

Kaley also spoke of how digital filters and curated content contributed to body image anxiety. She told the court she had never experienced concerns about her appearance before becoming heavily engaged with these platforms. Over time, her mental health deteriorated, and she received clinical diagnoses including depression, anxiety, and body dysmorphic disorder. At one point, she admitted, she even self-harmed as a way to cope with her emotional distress. According to her attorneys, these developments were a direct consequence of the psychological toll of social media use, exacerbated by design elements intended to maximise user attention and time spent on the apps.

Her frequent use also affected her everyday life, including her performance at school, her sleep patterns, and her personal relationships. Kaley described moments of panic when her mother took her phone away, saying, 'It's too hard to be without it,' and that being offline made her feel she was missing out on something crucial. This anxiety about disconnection and fear of missing out are central to her argument that the platforms contributed significantly to her insecurity and dependency.

How Social Media Giants Responded

Meta and YouTube have strongly denied the main allegations in the lawsuit, rejecting the idea that their platforms were deliberately engineered to be addictive or that they are legally responsible for the claimant's mental health difficulties. They have also pointed to other challenges in the woman's personal and family life that could have influenced her emotional well-being, suggesting it would be an oversimplification to attribute her struggles solely to the apps.

Google's legal team has likewise defended YouTube by distinguishing it from typical social networking services, presenting it as a video platform where users choose what to watch and emphasising that there is no clear evidence that its design intentionally causes harm.