The internet is something that we have come to rely on for our daily business, for education, for information and for leisure. However, concerns about protecting children have rightly been raised, most recently in a Westminster Hall debate held in May on the impact of smartphones and social media on children. This included a call for the Government to ban smartphones for under-16s.
I know that there are strongly held views both for and against this. I very much acknowledge the dangers of social media for young people and endorse the Online Safety Act, which introduced measures to protect young people. Through this legislation the Government is committed to ensuring that there are sufficient protections in place for children of all ages online, with the aim of making the UK the safest place to be a child online.
However, I do not agree with the proposal to ban young people from having smartphones. I believe that this is a decision for parents and not one for legislation. If we legislate on such matters, ways will always be found to work around it. So rather than potentially criminalising people, I think that, alongside the Online Safety Act, we should be looking to educate our young people on the risks and harms of social media and helping them to access the many privacy features available on social media platforms to help keep them safe. These include setting an account to private, turning off location settings, and blocking new friend requests. In addition, there may be the option of parental controls, which can offer features such as monitoring usage time, scheduling breaks, and viewing blocked contacts.
Further, research by the charity ‘Young Minds’ has shown that young people want to be protected but do not want an outright ban. The charity says that the voice of the young people concerned has not been heard in the debate. Both Young Minds and the NSPCC offer guidance to help families with these issues.
As the Minister said in summing up the debate, we live in a digital age and many parents want their children to have a smartphone. There can be advantages to using social networks, such as staying connected to friends and family. There are also educational benefits, such as learning how to create their own videos, as well as potential mental health benefits for those young people who may find it easier to discuss their problems online. However, most importantly, it is vital that children are - and feel - safe online.
Smartphones in Schools
Whilst supporting this open approach to the use of smartphones, I do agree that, within a school context, they can be a distraction for both pupils and their teachers. I am therefore pleased that the Department for Education has published new guidance which backs head teachers in banning mobile phone use throughout the school day, including at break times, to tackle disruptive behaviour and online bullying while boosting attention during lessons.
The new guidance says that schools should prohibit the use of mobile phones, but they will have autonomy over how to do this. Schools will be supported with examples of different approaches, including banning phones from the school premises, requiring phones to be handed in on arrival, and keeping phones securely locked away during the school day.
The move brings England in line with other countries that have already implemented a ban, including France, Italy, and Portugal. It follows warnings from the United Nations on the risks of smartphones in schools, and government data that found that around a third of secondary school pupils reported mobile phones being used when they were not supposed to in most, or all, lessons.
The Online Safety Act
The Act has introduced significant improvements to child safety online. As well as protecting children from illegal material, all services likely to be accessed by children will need to provide additional protection for those children. The legislation puts duties on user-to-user and search service providers to tackle illegal third-party content accessed on or via their service. They must take proactive action against online offending, such as the illegal sale of drugs, and must protect children from other legal content which could cause them harm. This includes cyberbullying, and online abuse and harassment, which have been designated as ‘priority content that is harmful to children’.
Companies must assess the risk to children from this kind of content and implement proportionate and age-appropriate protections. All in-scope companies must tackle illegal content, including illegal abuse and harassment. Additional measures have been introduced to prevent children from accessing online pornography.
Further to this, the communications regulator, Ofcom, has published draft codes of practice. It has said that tech firms will need to configure their algorithms to filter out the most harmful content from children's feeds, and to reduce the visibility and prominence of other harmful content.
Other proposed measures include forcing companies to perform more rigorous age checks if they show harmful content, and making them implement stronger content moderation, including a so-called "safe search" function on search engines that restricts inappropriate material.
These new measures will come into force in the second half of 2025; in the meantime, the regulator is consulting on the draft codes. Ofcom expects to publish final versions of the codes within a year, and companies will then have three months to carry out risk assessments of how children could encounter harmful content on their platforms, and of their mitigations, taking Ofcom's guidance into account.