Google has recently announced product and policy changes across its entire suite of offerings, introduced with children and minors in mind. With these changes, young people can browse in a safer, more secure manner with reduced ad targeting.
According to a blog post by Google, “We’re committed to building products that are secure by default, private by design, and that put people in control”. The company added, “As kids and teens spend more time online, parents, educators, child safety and privacy experts, and policymakers are rightly concerned about how to keep them safe. We engage with these groups regularly and share these concerns”.
The company said it will allow any user under 18, or their parents or guardians, to request the removal of a specific image from Google Image search results. Other restrictions include blocking ads targeted at minors based on location, gender, or interests.
The tech giant will also automatically enable ‘SafeSearch’ to filter out explicit content for kids. Further, under the company’s new guidelines, application owners will be required to disclose how they use data.
YouTube, the most popular video streaming platform, also announced a set of guidelines under which it can remove commercial advertisements from the YouTube Kids section, specifically commercials that solely encourage kids to buy a particular product. Moreover, the company decided to add protections like ‘Take a Break’ and default bedtime reminders for users aged 13–17.
Recently, TikTok and Instagram implemented similar guidelines to safeguard minors using their apps.