In recent years, YouTube has emerged as a dominant force in social media, transforming the way we consume entertainment and share information. With more than 2.6 billion monthly active users and over 1 billion hours of video watched daily, it has played a significant role in shaping our digital culture.

A double-edged sword

YouTube’s influence on mental health is a topic of growing concern and interest. The platform can have both positive and negative effects on mental health, depending on its design, use, and regulation.

One key finding is the potential harm caused by YouTube’s ability to foster parasocial relationships. These one-sided connections between content creators and viewers can exacerbate feelings of loneliness and detachment from real-life social interaction, especially among vulnerable young people. However, there are strategies for developing a healthier relationship with social media, most of which emphasise mindful usage.

On the positive side, YouTube can serve as a valuable tool for education, social connection, and emotional support. Watching informative videos on mental health issues, sharing personal stories, and participating in online communities and support groups can help reduce feelings of loneliness, anxiety, and depression. YouTube’s potential as a resource for mental well-being is evident, but that potential is shaped by factors such as its algorithms and policies.

The recommendation algorithm

YouTube’s recommendation algorithm plays a crucial role in shaping users’ experiences and, in turn, their mental health. Over the years, the algorithm has undergone significant changes, from prioritising views and clicks to favouring shares and likes, and eventually focusing on safety and content moderation. While these efforts have been aimed at reducing harmful content and promoting safety, they have also led to unintended consequences.

The algorithm, while attempting to keep viewers engaged, can create filter bubbles and echo chambers, reinforcing users’ existing beliefs and biases. This can contribute to polarisation and misinformation, which have negative consequences for mental health. In response to these challenges, there are calls for increased transparency and improvements in how recommendation algorithms operate, including the use of collaborative AI.

Beyond YouTube

The impact of problematic social media use on mental health extends beyond YouTube. Platforms like Facebook and TikTok pose their own challenges, such as cyberbullying and harmful content. A survey conducted by Headspace in Australia revealed that one in three young Australians experienced problematic social media use, driven primarily by a fear of missing out on news, culture, or social interaction. These findings support the view that unhelpful or harmful content can have a detrimental impact on mental health.

In response to growing concerns, Australia has taken a lead in social media regulation, particularly in addressing issues related to algorithms and misinformation. A draft bill seeks to hold social media platforms accountable for the spread of misinformation and disinformation, with significant penalties for non-compliance.

Striking a balance in the digital age

Ultimately, striking a balance between harnessing the benefits of technology and safeguarding mental health remains a crucial endeavour. While AI can enhance mental health care, the human touch and ethical considerations must guide its development and deployment. The evolving landscape of psychological manipulation calls for vigilance in countering bias, errors, and misinformation in the digital realm. In this complex interplay between psychology and technology, the well-being of individuals and society at large hangs in the balance.

Useful mental health resources
  • Head to Health
  • GP referral or community centres, such as Neami National, for assessment
  • AI chatbots such as Woebot or Wysa, which may assist with anxiety or depression
  • Lifeline, for crisis assistance
  • The eMHprac website, which provides resources for treatment and can help patients and practitioners navigate the complex mental health landscape

Author

Dr Luke Balcombe is a digital mental health (DMH) expert and a researcher at the Australian Institute for Suicide Research and Prevention (AISRAP). He researches and consults across the applied psychology and informatics disciplines, and leads projects on the current state and future trends of DMH. He analyses themes and provides insights on the challenges and opportunities of designing and using technology-enabled solutions in mental health care, presenting human-centred insights and findings on the impact of YouTube on loneliness and mental health, the use and regulation of AI, and safety and quality standards. His studies include systematic reviews, qualitative research, expert comments, blogs and articles.