Can you confirm legitimate Google Bot traffic?
-
We use Cloudflare as a firewall, and I noticed a significant number of blocked bot requests. One of the things Cloudflare does is try to block bad bot traffic, but it seems to be mistakenly blocking Googlebot traffic too. If you use Cloudflare, you may want to look into this as well. Also, can you confirm whether the following IPs belong to legitimate Googlebots?
66.249.79.88
66.249.79.65
66.249.79.80
66.249.79.76
Thanks,
-
Thank you very much for all of your feedback, folks. I really appreciate it. Cloudflare has acknowledged that they are aware of the issue and are working to correct it. It also affected Bingbot. I ended up whitelisting the IP ranges, and our Google crawl rate has increased as a result.
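For reference, allow-listing a range like this can also be done programmatically. Below is a minimal sketch against Cloudflare's v4 IP Access Rules endpoint; the zone ID, API token, and CIDR here are placeholders, and the exact endpoint shape is an assumption you should verify against Cloudflare's current API docs:

```python
import json
import urllib.request

API = "https://api.cloudflare.com/client/v4"

def allow_rule(cidr: str, note: str) -> dict:
    """Build the request body for an IP Access Rule that whitelists a CIDR range."""
    return {
        "mode": "whitelist",
        "configuration": {"target": "ip_range", "value": cidr},
        "notes": note,
    }

def whitelist_range(zone_id: str, token: str, cidr: str) -> None:
    """POST the rule to Cloudflare (zone_id and token are placeholders)."""
    req = urllib.request.Request(
        f"{API}/zones/{zone_id}/firewall/access_rules/rules",
        data=json.dumps(allow_rule(cidr, "Googlebot range")).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    urllib.request.urlopen(req)
```

Only whitelist ranges you have verified actually belong to Googlebot, or you defeat the purpose of the firewall.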
-
Dammit, my mistake.
I was probably still asleep. I've checked my log files and updated it. Thanks Alex!
-
10.0.0.0/8 is a private subnet and therefore unroutable, so I don't think you would have seen traffic from that subnet from Googlebot!
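Alex's point can be checked mechanically with Python's standard `ipaddress` module (a quick illustration, not from the thread):

```python
import ipaddress

def publicly_routable(ip: str) -> bool:
    """True if the address is globally routable (i.e. not RFC 1918 private,
    loopback, link-local, or another reserved range)."""
    return ipaddress.ip_address(ip).is_global

print(publicly_routable("10.0.0.1"))      # → False: 10.0.0.0/8 is private
print(publicly_routable("66.249.79.88"))  # → True: a public address
```

Any hit from 10.0.0.0/8 in your logs is internal traffic (or a proxy that isn't passing the real client IP), never Googlebot itself.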
-
What Alex pointed out is what you need.
Also, I'd like to remind you that Googlebot crawls from several IP ranges, not only 66.xxx.xxx.xx.
EDIT:
Alex pointed out that I'd made a mistake, so I've checked the data in my server logs.
On some of the sites I work on, most of the Googlebot traffic comes from the range 66.249.6x.xxx-66.249.7x.xxx. When in doubt, a reverse DNS lookup will tell you whether it's an official Googlebot or not.
Hope it helps.
Best luck.
Gaston -
Following the advice here: https://support.google.com/webmasters/answer/80553?hl=en I can confirm that they are indeed Googlebot.
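That advice boils down to a reverse DNS lookup followed by a forward-confirm. A minimal Python sketch of the check (requires network access to your resolver):

```python
import socket

def is_googlebot(ip: str) -> bool:
    """Verify an IP the way Google recommends: reverse DNS, check the
    domain, then forward-resolve the hostname back to the same IP."""
    try:
        # Step 1: reverse lookup -- get the hostname for the IP.
        host, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False
    # Step 2: the hostname must be under googlebot.com or google.com.
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    # Step 3: forward lookup -- the hostname must resolve back to the IP,
    # otherwise the PTR record could be spoofed.
    try:
        return ip in {info[4][0] for info in socket.getaddrinfo(host, None)}
    except socket.gaierror:
        return False
```

For example, `is_googlebot("66.249.79.88")` should come back `True` if that IP's PTR record still points into `googlebot.com`, while any spoofed user agent on a random IP fails at step 2.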
If you use Fetch as Google in Search Console, are you seeing errors?
Cloudflare has an article here on the subject: