To say that social media has had a big impact on our lives is a sizeable understatement. It gives us the opportunity to be in constant communication with one another, speak to family across the globe, gain access to 'celebrities' who were once unattainable, share valuable knowledge among communities and raise awareness of key societal issues. Take the recent world record-breaking egg on Instagram, which has helped to raise awareness of mental health - and currently boasts over 50 million likes.

But as social media becomes ever more pervasive, so too do questions around the accountability of tech giants and the responsibility they have for their users. Most recently, those questions have centred on the safety of children on these platforms, particularly with regard to the content they are able to access - and the content that is actively pushed to them based on previous searches or interactions.

The NSPCC is one of the organisations leading the charge, and this week launched its 'Wild West' campaign. The charity is pushing for a robust regulator to enforce a legal duty of care to children on social networks. The regulator would have legal powers, including the ability to issue steep fines to tech firms, and would require social networks to meet a set of minimum safeguarding standards.

While this sounds like a (relatively) straightforward solution, the complexities here are vast. Antony Walker, deputy CEO of TechUK, has commented on these difficulties, specifically the challenge of finding a solution that addresses the risks without being overly restrictive: "Solutions must be found that are effective and proportionate, taking into account the very real differences between content that is illegal and content that is legal but might be harmful to some people in some contexts." Regardless, campaigns for a statutory 'duty of care' are widespread and aren't going anywhere any time soon.

There's no question that social media companies should be doing more to ensure the safety of their users - and many are still looking for practical ways to do that. Yubo is one example of a platform already innovating in this space, using AI-powered age-estimation technology to help protect young people online.

Certainly, social networks present a huge opportunity for good. But to get there, we need to figure out what to do about the bad.