Google to tighten privacy for teens on search engine and YouTube
Technology and social media companies in America are under pressure from regulators and lawmakers to do more to protect children from harm
Alphabet, the owner of Google, on Tuesday announced a series of changes aimed at improving privacy protections for teenagers, including limits on targeted advertising aimed at users under 18.
In a blog post, the Mountain View, California-based company said videos uploaded to YouTube by 13- to 17-year-olds would be private by default, allowing the content to be seen only by the uploaders and the people they designate. It will also start to allow anyone under 18, or a parent or guardian, to request the removal of that minor’s images from Google Images search results.
“Of course, removing an image from search doesn’t remove it from the web, but we believe this change will help give young people more control of their images online,” the company said.
“Location History is a Google account setting that helps make our products more useful. It’s already off by default for all accounts, and children with supervised accounts don’t have the option of turning Location History on. Taking this a step further, we’ll soon extend this to users under the age of 18 globally, meaning that Location History will remain off (without the option to turn it on),” the blog post said.
The company will roll out the changes in the “coming weeks”, it said.
Instagram, owned by social media giant Facebook, recently announced its own restrictions on targeting users under 18. Among the changes, accounts created by children under 16 will be private by default.
Though both Google and Facebook have said they will take steps to protect children, their approaches differ. Facebook said advertisers would be able to target under-18s based only on their age, gender and location, and not on their interests or their activity on other apps and websites. Google said it would block personalised ads based on the age, gender or interests of people under 18, but would allow ads based on context, such as a person’s search requests.
Technology and social media companies in America are under pressure from regulators and lawmakers to do more to protect children from harm. In the last few months, lawmakers have introduced two bills, one in the House of Representatives and one in the Senate, that seek to update the Children’s Online Privacy Protection Act. The 1998 law restricts the tracking and targeting of children under 13, and the bills would extend those protections to teenagers.
Last week Apple announced new protections against explicit images in its Messages app and safeguards against the uploading of explicit or abusive images of children to its iCloud libraries.