Google currently does not allow children under the age of 13 to create a Google account. However, it has no mechanism to detect whether an underage user has lied about their age.
Google is ramping up its policies to make the internet a safer place for young users. The search giant has announced that it will give minors more control over their digital footprint. In simple terms, Google will now let users under the age of 18 request the removal of their images from Google Search results. If a young user is unable to file the request themselves, a parent or guardian can submit it on their behalf.
Google, however, made it clear that it would not remove the images from the web itself. “In the coming weeks, we’ll introduce a new policy that enables anyone under the age of 18, or their parent or guardian, to request the removal of their images from Google Image results. Of course, removing an image from Search doesn’t remove it from the web, but we believe this change will help give young people more control of their images online,” the search giant said in a blog post.
Keeping these shortcomings in mind, Google is making changes across its apps, including YouTube, the Google Search app, Google Assistant and more.
The search giant said that it will not display mature content that young users haven’t explicitly searched for. Google’s SafeSearch protection helps filter out explicit results when enabled, and it is already on by default for all signed-in users under 13 whose accounts are managed by Family Link. The company says it will turn on SafeSearch for existing users under 18 and make it the default setting for teens setting up new accounts. Google will also apply its SafeSearch technology to the web browser on smart displays.
Google is launching a new safety section on the Google Play Store that will let parents know which apps follow its family guidelines. “Apps will be required to disclose how they use the data they collect in greater detail, making it easier for parents to decide if the app is right for their child before they download it,” Google said.
Apart from making changes to its apps, Google will also let parents set screen-time limits and reminders on their kids’ supervised devices. In the coming months, Google will roll out new Digital Wellbeing filters that allow people to block news, podcasts, and access to webpages on Assistant-enabled smart devices.