Google is releasing new privacy features that give consumers more control over unwanted personal photos online and ensure that graphic or explicit images do not surface as often in search results. The latest update to Google's policies on personal explicit imagery means users can now request the removal of explicit and non-consensual images of themselves that they no longer want accessible in search.
Under the new policy, even someone who created and uploaded explicit content to a website can request that it be removed from Google Search if they no longer want it to be searchable, and Google has simplified the forms for submitting such requests. The policy does not apply to content that the user is currently and actively commercializing. It also covers web pages that contain personally identifiable information.
Google will notify users when their names appear in search results
Google will launch a new dashboard, initially available in the United States in English, that lets users see search results containing their contact information and easily submit a request to Google to have those results removed. The tool will also send a notification when new results containing the user's information appear in search. In addition, Google will make a new blurring setting in SafeSearch the default for users who do not already have SafeSearch filtering enabled.
When explicit photos or other adult or graphically violent content appears in search results, the new setting will blur it by default. The setting can be turned off at any time, unless the person is a supervised user or is on a network where an administrator has kept the setting on and locked it. For example, a Google image search for "injury" would return blurred results, shielding users from graphic content. Google first announced this protection in February, and it will roll out globally in August.