Should tech giants take responsibility for our children’s safety?

Nathan Baranowski

Cyber Safety

The pressure is mounting on technology firms to take responsibility for children’s welfare and mental health, with stricter controls on how young people are able to view content online. Cases such as the death of Molly Russell, who took her own life after viewing graphic images of self-harm on Instagram, have re-ignited the call for social media companies in particular to do more. In February, the NSPCC published proposals to monitor networks and introduce a duty of care to protect children, with heavy fines for those who breach guidelines. The UK Parliament’s Science and Technology Committee made similar recommendations. As a result, the government is proposing an independent watchdog, which will create a new code of conduct and have the power to impose penalties on those who don’t follow the rules.


There is no question that the issue needs to be re-examined and that better fail-safes need to be in place to protect the younger generation. Following Molly’s death, Instagram has committed to removing and banning images of self-harm, and social media companies are getting better at targeting disturbing content. According to the European Commission (EC), 89 per cent of hate speech flagged on leading platforms last year was reviewed and removed – if appropriate – within 24 hours. But is it possible to monitor everything we see online?

As well as considering safety, law and policy-makers should think about the impact this trend towards regulation will have on technology companies. Firms cannot take full responsibility for the content young people view online, because it simply isn’t possible to protect them 100 per cent. We know that many children lie about their age – more than 80 per cent, in fact – in order to access social media platforms. And with technology so readily available in our homes, it’s easy for a child to pick up and use another person’s device, bypassing any safety controls set up on their own.

There are many things we can do to tighten security. Take the porn industry as an example. Just a few years ago, if you typed the wrong word by mistake into a search bar, you’d find link after link directing you to websites with adult content. This all changed after Google altered its SafeSearch filter to ensure that users couldn’t see sexually explicit images unless they were actively searching for them. More recently, in India, major companies have started blocking keywords relating to child porn and sexual violence in order to limit, and eventually eradicate, the circulation of such videos and photographs. However, it is still possible to bypass these filters.

Businesses operating online are doing a number of things to make the internet safer for young children, including moderating and taking down content, setting age verifications and creating child-friendly terms and conditions. However, a House of Lords select committee found that although these measures benefit young children, ‘there are no such services for children as they grow older’. As young people grow, become more proficient at using technology and begin exploring the internet, they become harder to safeguard.

Regulation remains a double-edged sword. The global nature of the internet, which crosses national borders, makes laws difficult to implement. Some, like Apple CEO Tim Cook, think wider regulation of digital companies is inevitable. But is this possible when technology is changing at such a speed that regulators may not keep up?

Instead, the answer lies in educating parents, carers and teachers, who hold the responsibility of ensuring that their children are using the internet safely. Admittedly, this can be difficult if your child knows more about computers than you do, which is why it’s important for parents to be able to access support through resources like the UK Safer Internet Centre. To paraphrase the House of Lords report mentioned previously, parents need to raise children ‘who can flourish in a digital world’.

The NSPCC is calling for named tech firm directors to be personally responsible for safeguarding children on their platforms. With nine out of 10 parents agreeing that social networks should be forced to offer more protection, it’s clear that there is a growing voice calling for action on this issue.

My worry is whether this will be enough; can legislation account for all possible changes that technology will undergo in the future? Instead, a multifaceted approach is needed to moderate harmful content, increase internet safety awareness for children and empower parents to establish control over what their family is able to access.

You and your organisation have the opportunity to truly make a difference by using technology in the right way. If you’re looking for help in taking the next step, get in contact with us. 
