Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
How do I complete a reverse DNS check when completing log file analysis?
-
I'm doing some log file analysis and need to run a reverse DNS check to ensure that I'm analysing logs from Google and not any imposters. Is there a command I can use in terminal to do this?
If not, what's the best way to verify Googlebot?
Thanks
-
That's awesome! Glad to know there's a bulk tool out there!
-
Hi Tyler,
Thanks for your reply. I managed to get down to 98 unique IPs and ran a bulk reverse DNS/IP Look-up using this tool:
https://www.infobyip.com/ipbulklookup.php
Thanks for your help though!
-
Hey Daniel,
If you want to verify that a user-agent is actually Googlebot, you'll want to use a log file analysis tool to aggregate all of the IP addresses associated with Googlebot. Once you have a list of IP addresses, you can perform a reverse DNS lookup to verify whether the IP addresses are actually associated with Googlebot or not.
If you're on Windows/PC, these steps should work:
https://www.serverintellect.com/support/dns/reverse-dns/
If you're on Mac, try these steps:
1. Open Terminal
2. Type "host" followed by the IP address
For example: "host 66.249.66.1"
3. Hit Enter
4. View the results. For example: "1.66.249.66.in-addr.arpa domain name pointer crawl-66-249-66-1.googlebot.com"
If the results are from google.com or googlebot.com, you can be sure it's actually Google crawling your site. Unfortunately, I don't know of any faster ways to achieve these results. I'm sure there's a tool out there, I just haven't found it yet.
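If you're comfortable in Terminal, here's a rough sketch of the same check with the forward-confirmation step from Google's documentation added on top. It assumes a Unix-like system with the standard host, awk and sed utilities, and simply reuses the example IP from above:

#!/bin/sh
# Sketch: forward-confirmed reverse DNS check for one IP
ip="66.249.66.1"
# Reverse lookup: IP -> hostname
name=$(host "$ip" | awk '/domain name pointer/ {print $NF}' | sed 's/\.$//')
echo "Reverse DNS: $name"
case "$name" in
  *.googlebot.com|*.google.com)
    # Forward lookup: hostname -> IP, and confirm it matches the original IP
    back=$(host "$name" | awk '/has address/ {print $NF}')
    if [ "$back" = "$ip" ]; then
      echo "$ip verified as Googlebot ($name)"
    else
      echo "$ip failed forward confirmation"
    fi
    ;;
  *)
    echo "$ip does not resolve to a Google hostname"
    ;;
esac

Wrap that in a loop over a file of unique IPs and you have a rough bulk checker.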
This might also be a good resource for you: https://support.google.com/webmasters/answer/80553?hl=en
Good luck!
-Tyler
Related Questions
-
Unsolved: Temporary redirect from 302 to 301 for PNG File?
#302HTTP #temporaryredirect
Technical SEO | Damian_Ed
Hi everyone, I've recently run into a crawl issue with the media images on my website. For example, this page URL https://intreface.com/wp-content/uploads/2022/12/Horion-screen-side-2.png returns a 302 HTTP status, and the recommendation is to change it to a 301. I have read the article on temporary redirects here:
https://moz.com/learn/seo/redirection?_ga=2.45324708.1293586627.1702571936-916254120.1702571936
but it doesn't explain how to redirect a single image URL rather than the landing page.
I have messaged Moz Support, but they recommended asking the Moz Community! Could you assist me with this issue, please? I can reach the HTML of the page in question and change what I need for a permanent redirect, but first I need to understand how to do that correctly.
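If the site runs on Apache, the usual first step is to find whatever is currently issuing the 302 (often a plugin or CDN rule) and change or remove it. If you end up writing the rule yourself in .htaccess, a minimal sketch looks like this - the destination path here is only a hypothetical placeholder:

# 301 the old image URL to wherever it should now live (destination is hypothetical)
Redirect 301 /wp-content/uploads/2022/12/Horion-screen-side-2.png https://intreface.com/wp-content/uploads/2022/12/new-image-name.png
-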
Removing CSS & JS Files from Index
Hi, Google has indexed a few .CSS and .JS files that belong to our WordPress plugins and themes. I had them blocked via robots, but realized this doesn't prevent indexation (and can likely hurt us since Google wants to access these files). I've since removed the robots instructions, submitted a removal request via Search Console, but want to make sure they don't come back. Is there a way to put a noindex tag within .CSS and .JS files? Or should I do something with .htaccess instead?
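A noindex tag can't go inside .css or .js files themselves, but on Apache the equivalent can be sent as an HTTP header from .htaccess - a rough sketch, assuming mod_headers is enabled:

# Send a noindex header for CSS and JS files (requires mod_headers)
<IfModule mod_headers.c>
  <FilesMatch "\.(css|js)$">
    Header set X-Robots-Tag "noindex"
  </FilesMatch>
</IfModule>

Unlike a robots.txt block, this still lets Googlebot fetch and render the files.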
Technical SEO | kirmeliux
-
Is there a limit to how many URLs you can put in a robots.txt file?
We have a site that has way too many URLs caused by our crawlable faceted navigation. We are trying to purge 90% of our URLs from the indexes. We put noindex tags on the URL combinations that we do not want indexed anymore, but it is taking Google way too long to find the noindex tags. Meanwhile we are getting hit with excessive URL warnings and have been hit by Panda. Would it help speed up the process of purging URLs if we added the URLs to the robots.txt file? Could this cause any issues for us? Could it have the opposite effect and block the crawler from finding the URLs, but not purge them from the index? The list could be in excess of 100 million URLs.
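For what it's worth, robots.txt rules are pattern-based, so individual URLs wouldn't need to be listed - but once a URL is disallowed, Googlebot can no longer crawl it to see the noindex tag, so a block can leave already-indexed URLs stranded. A rough sketch, using hypothetical facet parameters:

# Hypothetical facet parameters - block crawling of faceted combinations
User-agent: *
Disallow: /*?color=
Disallow: /*?size=
Disallow: /*&sort=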
Technical SEO | kcb8178
-
How to check if an individual page is indexed by Google?
So my understanding is that you can use site: [page url without http] to check if a page is indexed by Google - is this 100% reliable, though? Just recently I've worked on a few pages that have not shown up when I've checked them using site:, but they do show up when using info: and also show their cached versions. Also, the rest of the site and the pages above them (the URL I was checking was quite deep) are indexed just fine. What does this mean? Thank you. P.S. I do not have WMT or GA access for these sites.
Technical SEO | linklander
-
PageSpeed Insights DNS Issue
Hi, Anyone else having problems with Google's PageSpeed tool? I am trying to benchmark a couple of my sites but, according to Google, my sites are not loading. They will work when I run them through the test at one point, but if I try again, say 15 minutes later, they will present the following error message: "An error has occurred. DNS error while resolving DOMAIN. Check the spelling of the host, and ensure that the page is accessible from the public Internet. You may refresh to try again. If the problem persists, please visit the PageSpeed Insights mailing list for support." This isn't too much of an issue for testing page speed, but I am concerned that if Google is getting this error on the speed test, it will also get the error when trying to crawl and index the pages. I can confirm the sites are up and running. The sites are pointed at the server via A records and haven't been changed for many weeks, so it cannot be a DNS propagation issue. I am at a loss to explain it. Any advice would be most welcome. Thanks.
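One quick way to sanity-check intermittent DNS failures like this is to query a couple of public resolvers directly and compare the answers - a minimal sketch from a terminal, with example.com standing in for the real domain:

# Ask two public resolvers and the system resolver for the A record
dig +short example.com A @8.8.8.8
dig +short example.com A @1.1.1.1
dig +short example.com A
# If any of these intermittently return nothing, the problem is likely on the DNS side rather than with PageSpeed itself.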
Technical SEO | daedriccarl
-
How to check readability in testing mode
Hi, Is there a way to test whether my content is readable while it is still in testing mode (meaning there obviously won't be a cache)? Thanks!
Technical SEO | theLotter
-
301 Redirect on PDF, DOCX files?
Hi, I have to rename many PDF and DOCX files. How can I implement 301 redirects on them, as they are linked from any number of places? Regards, Shailendra Sial
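If the site runs on Apache, one common approach is a one-line permanent redirect per renamed file in .htaccess - a rough sketch with hypothetical old and new filenames:

# One permanent redirect per renamed file (paths are hypothetical examples)
Redirect 301 /downloads/old-whitepaper.pdf /downloads/new-whitepaper.pdf
Redirect 301 /downloads/old-brief.docx /downloads/new-brief.docx

Links pointing at the old filenames will then follow through to the new ones, wherever those links live.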
Technical SEO | IM_Learner
-
.htaccess file format for Apache Server
Hi, My website has a canonical issue on the home page. I have written the .htaccess file and uploaded it to the root directory, but I still don't see any changes on the home page. I am copying the syntax I have written in the .htaccess file below. Please review it and let me know what to change.

Options +FollowSymlinks
RewriteEngine on
#RewriteBase /

### re-direct index.htm to root / ###
RewriteCond %{THE_REQUEST} ^.*/index.htm\ HTTP/
RewriteRule ^(.*)index.htm$ /$1 [R=301,L]

### re-direct IP address to www ###
### re-direct non-www to www ###
### re-direct any parked domain to www of main domain
RewriteCond %{http_host} !^www.metricstream.com$ [nc]
RewriteRule ^(.*)$ http://www.metricstream.com/$1 [r=301,nc,L]

Is there any specific .htaccess file format for the Apache server? Thanks, Karthik
Technical SEO | karthik-175544