The ICO has announced three investigations into how TikTok, Reddit and Imgur protect the privacy of their child users in the UK.
TikTok
The ICO's investigation into TikTok is considering how the platform uses the personal information of 13–17-year-olds in the UK to make recommendations to them and deliver suggested content to their feeds. This is in light of growing concerns about social media and video sharing platforms using data generated by children's online activity in their recommender systems, which could lead to young people being served inappropriate or harmful content.
Imgur and Reddit
The ICO’s investigations into Imgur and Reddit are considering how the platforms use UK children’s personal information and their use of age assurance measures.
The investigations are part of the ICO’s efforts to ensure companies are designing digital services that protect children. At this stage, it is investigating whether there have been any infringements of data protection legislation.
The ICO says that it has driven significant change in the way companies approach children’s online privacy since the Children’s code came into force in 2021. The ICO says that in the past year, it has focused on facilitating improvements in how social media and video sharing platforms protect children’s personal information online. This has included the following changes:
- X has stopped serving adverts to users under 18; removed the ability for under 18s to opt in to geolocation sharing; improved the public transparency materials available for under 18s; and created a dedicated help centre for child users and parents.
- Sendit has stopped automatically including geolocation information in children’s profiles, while BeReal has stopped allowing children to post their precise location online. These changes can help keep children safer in the physical world.
- Dailymotion has implemented new privacy and transparency measures encouraging children not to share personal information.
- Viber has committed to turn off personalised advertising for children, ensuring that children’s default advertising experience is not based on their behavioural data or profiles.
The ICO says that it will continue to push for further changes where platforms do not comply with the law or conform to the Children's code. In addition, it will be working closely with Ofcom, which has responsibility for enforcing the Online Safety Act, to ensure that the two regulators' efforts are coordinated. In January, Ofcom fined a company under the outgoing legislation governing video sharing platforms for not using effective age assurance measures, and published its own guidance on age assurance. Ofcom will publish its Protection of Children Codes and children's risk assessment guidance in April 2025.