
Sometimes, age is not just a number. When it comes to social media platforms, legislators across Europe are putting them under the magnifying glass. Social media platforms like Instagram, TikTok, and Snapchat attract children despite policies requiring users to be at least 13 years old. The ease with which children bypass these age restrictions highlights a significant challenge for regulators and platforms alike. Research indicates that over 40% of children aged 8-12 use social media, underscoring the need for more robust age verification measures.

The age verification dilemma

Age verification is crucial in protecting minors from harmful content. Platforms such as Instagram, TikTok, and Snapchat have minimum age requirements, but enforcement is challenging. The loophole lies in the registration process, where manipulating the birth year allows children to create accounts easily. This has led to a continuous cat-and-mouse game between platforms and underage users. European regulators, educators, and tech companies are striving to protect children, but enforcing the minimum age requirement remains a tricky task.

Why this is important:

More and more children are joining social media platforms before they are old enough to do so, highlighting the critical need for robust age verification to protect them from harmful content.

This loophole has led to widespread underage use of social media, prompting a need for more robust verification methods. However, the push for stronger age verification is coming more from legislation than from the companies themselves. In fact, the EU is known for fining big companies it deems insufficiently compliant in protecting children's safety.

Just recently, TikTok was fined €345 million for making children’s accounts public by default and failing to verify the relationship between linked adult and child accounts.

European Union initiatives

The European Union is working on a comprehensive code for age-appropriate design and digital identity solutions. The EU eID proposal aims to enhance age verification through certification and interoperability. Additionally, the euCONSENT project is developing a browser-based age verification method. These initiatives seek to standardize age verification across Europe and create a safer digital environment for children.

Spain’s R-rated passport and France’s legislative measures

Last week, Spain announced it will introduce the Cartera Digital Beta app, an innovative approach to preventing minors from accessing pornographic websites. Like a digital passport, this app aims to restrict access to porn by enforcing age verification. Users must authenticate their age using electronic IDs or qualified certificates. The app also incorporates a double authentication system to ensure that minors cannot access adult content through adults’ devices.
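The two-factor flow described above can be sketched in a few lines. This is an illustrative assumption of how such a gate might work, not the actual Cartera Digital Beta implementation: the first factor derives age from a birth date carried in a verified ID certificate, and the second is a per-session confirmation so a minor cannot piggyback on an adult's already-unlocked device.

```python
from datetime import date

# Hypothetical double-check age gate, loosely modelled on the flow
# described in the article. All names and thresholds are illustrative.

MIN_AGE = 18

def age_from_certificate(birth_date: date, today: date) -> int:
    """Derive age in whole years from the birth date in a verified ID certificate."""
    years = today.year - birth_date.year
    # Subtract one year if this year's birthday has not yet passed.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def grant_access(birth_date: date, session_code_ok: bool, today: date) -> bool:
    """First factor: certified age check. Second factor: per-session code,
    so unlocking once on a shared device is not enough."""
    return age_from_certificate(birth_date, today) >= MIN_AGE and session_code_ok

# A 17-year-old is refused even with a valid session code;
# an adult without the second factor is refused as well.
print(grant_access(date(2007, 1, 1), True, date(2024, 6, 1)))   # False
print(grant_access(date(1990, 1, 1), True, date(2024, 6, 1)))   # True
print(grant_access(date(1990, 1, 1), False, date(2024, 6, 1)))  # False
```

The point of the second factor is that the age proof and the access moment are checked separately, which is what stops a child from using a device an adult has already verified.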

In France, a new law mandates social media platforms to verify the age of users and obtain parental consent for those under 15. This legislation aims to reduce children’s screen time and protect them from cyberbullying and other online risks. Platforms that fail to comply with the law could face fines of up to 1% of their global revenues. This legislative move highlights the growing emphasis on safeguarding minors in the digital space.

AI’s take on the challenge

Tech companies are exploring AI-driven solutions to tackle the age verification challenge. AI algorithms can estimate a user’s age based on their activity patterns and content interactions. For instance, facial recognition technology can be employed during the registration process for websites requiring a specified minimum age, such as gambling websites. However, these technologies raise concerns about privacy and data security. Ensuring these AI-driven methods are effective and privacy-conscious is a key challenge for tech companies. Another option is behavioral analytics, which TikTok already uses to flag and remove accounts suspected of belonging to underage users; the platform removed over 76 million such accounts worldwide in 2023.
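To make the behavioral-analytics idea concrete, here is a minimal sketch of how activity signals might be combined into an underage-risk score. The feature names, weights, and threshold are all invented for illustration; production systems are trained on far richer signals and are not public.

```python
import math

# Hypothetical behaviour-based age screening. Every signal is a value in
# [0, 1]; the weights and bias are hand-picked for this sketch only.
WEIGHTS = {
    "follows_child_creators": 1.5,    # share of followed accounts aimed at children
    "school_hours_activity": 1.0,     # share of activity during school hours
    "stated_age_mismatch": 2.0,       # 1.0 if profile details contradict stated age
}
BIAS = -2.0

def underage_risk_score(signals: dict) -> float:
    """Linear combination of signals squashed through a logistic function,
    yielding a risk score between 0 and 1."""
    z = sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS) + BIAS
    return 1.0 / (1.0 + math.exp(-z))

def flag_for_review(signals: dict, threshold: float = 0.5) -> bool:
    """Accounts above the threshold are queued for human review, not
    removed automatically."""
    return underage_risk_score(signals) >= threshold

# An account with strong underage signals is flagged; a neutral one is not.
print(flag_for_review({"follows_child_creators": 1.0,
                       "school_hours_activity": 1.0,
                       "stated_age_mismatch": 1.0}))  # True
print(flag_for_review({}))                            # False
```

Routing flagged accounts to human review rather than automatic removal is one way such a system could limit the privacy and false-positive risks the article mentions.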

Striking a balance

While robust age verification methods are necessary to protect children, they must not infringe on users’ privacy or make platforms cumbersome to use. Achieving this balance requires careful consideration of both technological capabilities and user experience.

As regulators tighten the rules, social media platforms must adapt to protect young users while maintaining usability. Collaborative efforts between governments, tech companies, and educators are essential to developing effective age verification systems. Integrating innovative technologies and legislative measures makes it possible to create a safer online environment for children.