How do I run a reverse DNS check when doing log file analysis?
-
I'm doing some log file analysis and need to run a reverse DNS check to ensure that I'm analysing logs from Google and not any imposters. Is there a command I can use in terminal to do this?
If not, what's the best way to verify Googlebot?
Thanks
-
That's awesome! Glad to know there's a bulk tool out there!
-
Hi Tyler,
Thanks for your reply. I managed to get down to 98 unique IPs and ran a bulk reverse DNS/IP Look-up using this tool:
https://www.infobyip.com/ipbulklookup.php
Thanks for your help though!
-
Hey Daniel,
If you want to verify that a user-agent is actually Googlebot, you'll want to use a log file analysis tool to aggregate all of the IP addresses associated with Googlebot. Once you have a list of IP addresses, you can perform a reverse DNS lookup to verify whether the IP addresses are actually associated with Googlebot or not.
If you're on Windows/PC, these steps should work:
https://www.serverintellect.com/support/dns/reverse-dns/
If you're on a Mac, try these steps:
1. open Terminal
2. type "host" followed by the IP address
for example: "host 66.249.66.1"
3. hit enter
4. view results. For example: "1.66.249.66.in-addr.arpa domain name pointer crawl-66-249-66-1.googlebot.com"
If the results are from Google.com or Googlebot.com, you can be sure it's actually Google crawling your site. Unfortunately, I don't know of any faster ways to achieve these results. I'm sure there's a tool out there, I just haven't found it yet.
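If it helps, the lookup in the steps above can also be scripted. The sketch below (Python, with a hypothetical `is_verified_googlebot` helper name) does the reverse lookup and then forward-confirms the hostname, which is what Google's own verification documentation recommends, since a reverse DNS result alone can be spoofed:

```python
import socket

def is_verified_googlebot(ip):
    """Forward-confirmed reverse DNS check for a single IP address:
    reverse-resolve the IP to a hostname, check the hostname is under
    googlebot.com or google.com, then forward-resolve that hostname and
    confirm the original IP appears in the results."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False  # no PTR record for this IP
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        _, _, forward_ips = socket.gethostbyname_ex(hostname)
    except socket.gaierror:
        return False  # hostname does not resolve back
    return ip in forward_ips
```

You could loop this over the list of unique IPs pulled from your logs and print only the ones that fail, instead of running "host" by hand for each one.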
This might also be a good resource for you: https://support.google.com/webmasters/answer/80553?hl=en
Good luck!
-Tyler
Related Questions
-
My 'complete guide' is cannibalising my main product page and hurting rankings
Hi everyone, I have a main page for my blepharoplasty surgical product that I want to rank. It's a pretty in-depth summary for patients to read all about the treatment and look at before and after pictures, and there are calls to action in there. It works great and is getting lots of conversions. But I also have a 'complete guide' PDF which is for patients who are really interested in discovering all the technicalities of their eye-lift procedure, including medical research, clinical stuff and risks. Now my main page is at position 4 and the complete guide is right below it in 5. So I tried to consolidate by adding the complete guide as a download on the main page. I've looked into rel canonical but don't think it's appropriate here as they are not technically 'duplicates' because they serve different purposes. Then I thought of adding a meta noindex but was not sure whether this was the right thing to do either. My report doesn't get any clicks from the SERPs; people visit it from the main page. I saw in WordPress that there are options for the link: one says 'link to media file', 'custom URL' and 'attachment'. I've got the custom URL selected at the moment. There's also a box for 'link rel', which I figure is where I'd put the noindex. If that's the right thing to do, what should go in that box? Thanks.
Technical SEO | Smileworks_Liverpool
-
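On the cannibalisation question above: a meta noindex tag only works in HTML documents, so for a PDF the usual route is an X-Robots-Tag HTTP header rather than anything in WordPress's 'link rel' box. A hedged .htaccess sketch, with a placeholder filename:

```
<IfModule mod_headers.c>
  <Files "complete-guide.pdf">
    Header set X-Robots-Tag "noindex"
  </Files>
</IfModule>
```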
One robots.txt file for multiple sites?
I have 2 sites hosted with Blue Host and was told to put the robots.txt in the root folder and just use the one robots.txt for both sites. Is this right? It seems wrong. I want to block certain things on one site. Thanks for the help, Rena
Technical SEO | renalynd27
-
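On the robots.txt question above: crawlers fetch robots.txt per host, so each site should serve its own file from its own document root; one shared file cannot selectively block things on just one of the two sites. A minimal hypothetical sketch for the site that needs blocking (the paths are placeholders):

```
User-agent: *
Disallow: /private/
```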
How to check if an individual page is indexed by Google?
So my understanding is that you can use site: [page URL without http] to check if a page is indexed by Google. Is this 100% reliable, though? Just recently I've worked on a few pages that have not shown up when I've checked them using site:, but they do show up when using info: and also show their cached versions. Also, the rest of the site and the pages above it (the URL I was checking was quite deep) are indexed just fine. What does this mean? Thank you. P.S. I do not have WMT or GA access for these sites
Technical SEO | linklander
-
What's Worse - 404 errors or a huge .htaccess file
We have changed our site architecture pretty significantly and now have many fewer pages (albeit with more robust content and focused linking). My question is, what should I do about all the 404 errors (keep in mind, I am only finding these in Bing Webmaster tools, not Moz or GWT)? Is it worse to have all those 404 errors (hundreds), or to have a massive htaccess file for pages that are only getting hits by the Bing crawlbot. Any insight would be great. Thanks
Technical SEO | CleanEdisonInc
-
Link removal from search rank checking sites
I'm going through the link removal process for unnatural links to a site. While I'm able to identify the obvious link profile and seo-article links that Google wants removed, what should we do about the links that are generated by the various seo link investigation and ranking services? Example: http://www.seoprofiler.com/analyze/allamericanfencing.com This site (seoprofiler) automatically creates these links to web sites when it generates its reports. Are those links that need to be removed or disavowed, or will Google not care? I want to err on the side of caution, but don't know how to treat these types of pages. The site didn't ask for or lobby for those links, so it's "natural" in that sense, but they're not editorially earned either (except for happen to be ranking for a similar term). Does anyone have experience on this aspect of the unnatural link grooming process?
Technical SEO | CHarkins
-
How can I find my Webmaster Tools HTML file?
So, totally amateur hour here, but I can't for the life of me find our HTML verification file for webmaster tools. I see nowhere to look at it in Google Webmaster Tools console, I tried a site:, I googled it, all the info out there is about how to verify a site. Ours is verified, but I need the verification file code to sync up with the Google API and no one seems to have it. Any thoughts?
Technical SEO | healthgrades
-
Need Help writing 301 redirects in .htaccess file
SEOmoz tool shows me 2 errors for duplicate content pages (www.abc.com and www.abc.com/index.html). I believe the solution to this is writing 301 redirects. I need two 301 redirects: 1. abc.com to www.abc.com 2. /index.html to / (that is, www.abc.com/index.html to www.abc.com). The code that I currently have is:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^abc.com
RewriteRule (.*) http://www.abc.com/$1 [R=301,L]
Redirect 301 http://www.abc.com/index.html http://www.abc.com
...but this does not redirect /index.html to abc.com. What is wrong here? Please help.
Technical SEO | WebsiteEditor
-
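A note on the question above: mod_alias's Redirect directive takes a URL-path (like /index.html) as its source, not a full URL, which is one common reason a line written that way never matches. A hedged sketch of a mod_rewrite-only alternative, using the same example domain (test before deploying; the THE_REQUEST condition matches only the original client request, not the internal DirectoryIndex subrequest, which avoids a redirect loop):

```
RewriteEngine On
# Canonicalise the bare domain to www
RewriteCond %{HTTP_HOST} ^abc\.com$ [NC]
RewriteRule (.*) http://www.abc.com/$1 [R=301,L]
# Redirect explicit /index.html requests to /
RewriteCond %{THE_REQUEST} \ /index\.html[?\ ]
RewriteRule ^index\.html$ http://www.abc.com/ [R=301,L]
```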
What's the best free tool for checking for broken links?
I'm trying to find the best tool to check for broken links on our site. We have over 11k pages and I'm looking for something fast and thorough! I've tried Xenu and LinkChecker. Any other ideas?
Technical SEO | CIEEwebTeam