The agreement sealed in December last year by the European Commission raises the so-called 'digital age of consent' threshold to 16 years old, though individual Member States may lower it to no less than 13.

This means that companies providing online services to children will need to deal with different minimum age thresholds in different Member States.
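As a sketch of what handling that fragmentation might look like in practice, the snippet below keeps a per-country lookup table that falls back to the regulation's default of 16. The individual national values shown are illustrative assumptions for the example, not authoritative legal data:

```python
# Hypothetical per-country 'digital age of consent' lookup.
# The regulation sets a default of 16; Member States may lower it to 13.
# The national values below are illustrative assumptions, not legal advice.
DEFAULT_AGE_OF_CONSENT = 16

NATIONAL_THRESHOLDS = {
    "IE": 13,  # assumption, for illustration only
    "UK": 13,  # assumption, for illustration only
    "DE": 16,  # assumption, for illustration only
}


def digital_age_of_consent(country_code: str) -> int:
    """Return the minimum age at which a user may consent on their own,
    falling back to the default threshold for unlisted countries."""
    return NATIONAL_THRESHOLDS.get(country_code, DEFAULT_AGE_OF_CONSENT)


def needs_parental_consent(age: int, country_code: str) -> bool:
    """True if consent must instead be obtained from a parent or guardian."""
    return age < digital_age_of_consent(country_code)
```

A service would typically run this check per user at sign-up, using the country inferred from the account details rather than hard-coding a single threshold.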

When collecting personal data, it is important that the data subject properly understands how the data they provide will be used. That can be difficult when the data being collected concerns children.

In the UK, for example, the Data Protection Act 1998 does not specify the age at which children are legally able to consent to the processing of their personal data. Current guidance from the Information Commissioner's Office recommends that consent be sought from a parent or guardian before collecting information from children up to the age of 12, but notes that there may be cases where parental consent is necessary for children older than 12.

Best practices to protect minors within the new age-restricted content framework

But it’s not just about data protection itself; it is also about safeguarding the audience the content is destined for, which becomes especially important when that audience includes minors.

In a recently updated guide for advertisers, publishers and, by extension, anyone offering a product or service on the Internet, the British Government highlights that 12% of online 9-16 year olds say they have seen sexual images online. Likewise, the same proportion of online 11-16s (12%) said they’d seen websites where people talk about taking drugs, and 17% had seen sites where people discuss ways of physically hurting themselves. 4% said they’d seen websites where people discuss ways of committing suicide, and the same proportion (4%) said they had received “sexting” messages.

In addition, with the takeoff of social networks such as Snapchat, which clearly caters to a younger audience than the more established Twitter or Facebook, and the ever-growing appeal of Instagram, keeping up to speed with regulations on the minimum age at which audiences may consume certain types of content or digital goods has become a priority.

For example, while Facebook has created a Facebook Guide for Educators with tips and advice on how to use Facebook within the classroom, Twitter offers advice for families, reminding them that the service is a public space and highlighting the importance of media literacy and critical thinking when using the internet.

Merchants can avoid penalties by implementing more efficient age verification processes

Another stimulus for advertisers to get up to speed with the new regulation, which will come into force in 2018, is enforcement: businesses can be fined up to 4% of global turnover for failing to comply.

Even if a Member State legislates to apply the lowest permitted age threshold, its policy makers will still need to change their approach, probably by introducing more prescriptive requirements around the steps data controllers must take to ensure they have obtained appropriate consent.

A good way for online merchants to cope with the strengthened data protection requirements is to couple traditional approaches to age verification, such as requesting the user’s date of birth, with more technologically advanced verification methods such as facial recognition.
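The traditional date-of-birth check described above amounts to computing a user's age in full years and comparing it against the applicable threshold. A minimal sketch, with the 18-year threshold used here purely as an example:

```python
from datetime import date


def age_from_dob(dob: date, today: date) -> int:
    """Compute full years elapsed since the date of birth."""
    years = today.year - dob.year
    # Subtract one year if this year's birthday has not yet occurred.
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years


def passes_age_gate(dob: date, minimum_age: int, today: date = None) -> bool:
    """First-line age check based on a self-declared date of birth.

    A self-declared date is easy to falsify, which is exactly why the
    article recommends corroborating it with a stronger method such as
    ID-document or facial-recognition verification."""
    return age_from_dob(dob, today or date.today()) >= minimum_age
```

In a real checkout flow for age-restricted goods, a pass on this check would trigger the second, stronger verification step rather than grant access on its own.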

For example, Mitek's Mobile Verify leverages the camera in mobile devices so mobile and online merchants of age-restricted goods can accurately verify a customer’s age using ID verification via facial recognition.

 
