Back in 2007, Google documented a mechanism for SEOs and site owners to verify that Googlebot is who it says it is, using reverse DNS checks. Now Google has also decided to publish a list of the IP addresses Googlebot will use to crawl your site.

Google published two different JSON files with the lists of IP addresses its crawlers can use:

(1) You can identify Googlebot by IP address by matching the crawler's IP address to the list of Googlebot IP addresses in this JSON file.

(2) For all other Google crawlers, match the crawler's IP address against the complete list of Google IP addresses in this JSON file.

I assume these IP addresses may change from time to time, so it may make sense for you to check the JSON files daily for updates, that is, if you automate any scripts that use this IP list.

It is nice to have these officially published by Google. Keep in mind, there are other methods to verify Googlebot listed here.

I asked John Mueller of Google why they did this now, and why reverse DNS is not good enough. John responded on Twitter saying "it makes it a bit easier for some sites (CDNs, etc.), and the old concerns/risks around cloaking seem to have mostly disappeared, so…"

It makes it a bit easier for some sites (CDNs, etc.), and the old concerns/risks around cloaking seem to have mostly disappeared, so pic.twitter.com/PgsGYBzn6i

— 🧀 John 🧀 (@JohnMu) November 10, 2021

Forum discussion at Twitter.
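If you do automate a check against the published JSON files, the matching step is straightforward. Here is a minimal Python sketch; the embedded prefixes and the `prefixes`/`ipv4Prefix`/`ipv6Prefix` field names follow the shape of Google's published IP range files, but treat the exact ranges here as illustrative placeholders, not a current copy of the file:

```python
import ipaddress
import json

# Illustrative sample of the JSON structure used by Google's IP range
# files; the real Googlebot file is much longer and changes over time,
# so fetch it fresh rather than hard-coding ranges like this.
SAMPLE_RANGES = json.loads("""
{
  "prefixes": [
    {"ipv4Prefix": "66.249.64.0/27"},
    {"ipv4Prefix": "66.249.64.32/27"},
    {"ipv6Prefix": "2001:4860:4801:10::/64"}
  ]
}
""")

def is_googlebot_ip(ip: str, ranges: dict) -> bool:
    """Return True if `ip` falls inside any prefix in the ranges data."""
    addr = ipaddress.ip_address(ip)
    for entry in ranges.get("prefixes", []):
        prefix = entry.get("ipv4Prefix") or entry.get("ipv6Prefix")
        # Version-mismatched comparisons (v4 address vs v6 network)
        # simply return False, so one loop handles both families.
        if prefix and addr in ipaddress.ip_network(prefix):
            return True
    return False
```

Usage: `is_googlebot_ip("66.249.64.5", SAMPLE_RANGES)` returns `True` because that address sits inside `66.249.64.0/27`; an unrelated address returns `False`.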
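For comparison, the older reverse-DNS method John refers to can be sketched as follows: reverse-resolve the visitor's IP, check the hostname is under googlebot.com or google.com, then forward-resolve that hostname and confirm it maps back to the same IP. This is a minimal sketch of that flow, not Google's own implementation:

```python
import socket

# Domains Google's documentation associates with its crawlers.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def has_google_hostname(hostname: str) -> bool:
    """True if a reverse-DNS hostname belongs to a Google crawler domain."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip: str) -> bool:
    """Forward-confirmed reverse DNS check for a claimed Googlebot IP."""
    try:
        # Step 1: reverse lookup (PTR record) for the IP.
        hostname, _, _ = socket.gethostbyaddr(ip)
        # Step 2: the hostname must be under a Google crawler domain,
        # otherwise anyone could fake the User-Agent string.
        if not has_google_hostname(hostname):
            return False
        # Step 3: forward-resolve the hostname and confirm it maps
        # back to the original IP (blocks spoofed PTR records).
        return ip in socket.gethostbyname_ex(hostname)[2]
    except (socket.herror, socket.gaierror):
        # No PTR record, or the forward lookup failed.
        return False
```

Note the suffix check rejects lookalike hostnames such as `fake-googlebot.com.attacker.net`, since the match is anchored to the end of the name.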
