Struggling with Google Bot Blocks - Please help!
-
I own a site called www.wheretobuybeauty.com.au
After months and months, we still have a serious issue: almost all of our pages show as blocked URLs according to Google Webmaster Tools.
The 404 errors are returning a 200 header code according to the email below. Do you agree that the 404.php code should be changed? Can you do that please?
The current state:
Google webmaster tools Index Status shows:
26,000 pages indexed
44,000 pages blocked by robots.
In late March, we implemented changes recommended by an SEO expert: he provided a new robots.txt file and advised us to amend sitemap.xml, among other changes. We implemented those changes and then set up a re-index of the site by Google. The number of blocked URLs eventually dropped to 1,000 for a few days in May and June, but now the problem has rapidly returned.
A Google search on www.google.com.au for the query 'site:wheretobuybeauty.com.au' displays 37,000 pages.
This new site has been re-crawled over the last 4 weeks.
About the site
This is a Linux php site and has the following:
55,000 URLs in sitemap.xml submitted successfully to webmaster tools
robots.txt file has been modified several times:
Firstly we had none
Then we created one but were advised that it needed to have this current content:
User-agent: *
Disallow:
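The two lines above disallow nothing, which can be verified with Python's standard-library robots.txt parser (the product path below is made up for illustration):

```python
import urllib.robotparser

# Parse the exact two-line robots.txt quoted above and confirm it blocks nothing.
rp = urllib.robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow:"])

# An empty "Disallow:" rule means every path is allowed to every crawler.
for path in ("/", "/floris/some-product", "/sitemap.xml"):
    url = "http://www.wheretobuybeauty.com.au" + path
    print(path, rp.can_fetch("Googlebot", url))
```

If this file is really what the server returns at /robots.txt, it cannot be the cause of the blocked URLs reported by Webmaster Tools.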
-
No problem my friend. You are most welcome. Here at Moz you will not only be able to get almost all of your SEO-related queries addressed and solved, you will also learn a great deal about digital marketing. I highly recommend that every aspiring digital marketer be active on a community like Moz, and I bet they will save a great deal of time and money as well. Wish you all the very best.
Regards,
Devanur Rafi.
-
Thanks Devanur - trying out everything you have suggested.
-
Hi Alex,
Sorry if I was not clear in my previous post. I meant that, in general, pages with cleaner code will have an edge over similar pages with bad code when it comes to SEO.
Just an example: page A has cleaner code compared to page B, with all other SEO factors being equal. In a scenario like this, page B might not be favored by Google because of issues arising from bad code, such as poor page-loading performance, poor rendering in browsers, etc.
The issue at hand might not be because your pages fail W3C validation, but it's not a bad idea to have cleaner code on your website.
Best regards,
Devanur Rafi.
-
Hi Devanur
My understanding is that Google does not have a problem with invalid XHTML or pages that are not W3C accessible. Please see a comment on this at SEOMOZ:
-
Hi Alex,
I did a code validation check for the following URL:
It gave 238 Errors and 538 Warnings!!
Search engines like Google favor pages with cleaner code, so I strongly recommend having the code cleaned up on the website.
Here you go for validation check:
Best regards,
Devanur Rafi.
-
Hi Alex,
If the underscores constitute only 4% of the total URLs, then this can safely be set aside for the purposes of the current issue.
The same goes for the keyword repetition in the page titles and URLs. However, if it is possible for you to revisit your URL structure and have it like the following, you should go for it:
www.wheretobuybeauty.com.au/<brand name>/<product name>, e.g.
http://www.wheretobuybeauty.com.au/floris/royal-arms-diamond-edition-eau-de-parfum-spray-100ml-34oz
Same thing with the Page titles also.
Now we are left with two things, the page performance and URL canonicalization. Please have them fixed as early as possible.
Also, I checked your IP address, and you are on shared hosting. This is not at all recommended if you are a serious online business owner. Your IP, 103.9.170.75, is shared by at least 250 other domains, including some bad websites.
Though there are different views on bad IP neighborhoods and their impact on SEO, I have always been an advocate of a clean IP and recommend it to all my clients. You can move to a dedicated IP, which is very cheap these days, or better yet a VPS.
For more about this, please check out the "Oops, your IP is either dirty or virtual" section on the following page:
http://www.bruceclay.com/in/seo-tech-tips/techtips.htm
And also, this section, "A Strong Foundation for Your Site to Operate On" on the following page:
http://www.bruceclay.com/blog/2011/04/the-seo-bucket-list-3-things-to-do-before-your-site-dies/
Lastly, I checked your domain's DNS health and here you go for the results:
http://intodns.com/wheretobuybeauty.com.au
Though these might not be causing the current issue, it's good to sort everything out, as we should leave no stone unturned in making the website a better one.
Best regards,
Devanur Rafi.
-
Hey Devanur
please see our responses below:
Hi Alex,
Thanks for the info. Here are a few issues that I observed with the website, and I am very confident that if you address and fix these, you will come out of this issue with flying colors:
1. URL canonicalization issue: Both the www and non-www versions of your website's URLs return an HTTP header status code 200. You should redirect all non-www URLs to their respective www versions via a 301 permanent redirect immediately.
**Response: We are asking the developer to correct this. **
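For the developer: the thread only says this is a Linux PHP site, so assuming Apache with mod_rewrite enabled (an assumption about the hosting stack), the 301 redirect can be sketched in .htaccess as:

```apache
# Sketch only: redirect non-www requests to the www host with a 301.
# Assumes Apache with mod_rewrite; adjust for nginx or another stack.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^wheretobuybeauty\.com\.au$ [NC]
RewriteRule ^(.*)$ http://www.wheretobuybeauty.com.au/$1 [R=301,L]
```

After deploying, both `curl -I http://wheretobuybeauty.com.au/` (expect 301) and `curl -I http://www.wheretobuybeauty.com.au/` (expect 200) are worth checking.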
2. Inconsistent URL structure: Your website is still using underscores (_) in URLs as word separators, alongside the recommended hyphens (-). This inconsistent usage can sometimes lead to issues, so please replace all the underscores with hyphens.
Response: This problem only occurs in a few pages where special characters have been replaced with underscores – probably in 4% of product pages. I can’t see that this has an impact on the SEO?
3. Google PageSpeed check: When I ran the Google PageSpeed test on some of your URLs, along with the ones that you gave, the score varied between 28 and 60. Please look at the recommendations the PageSpeed tool gives and try to address the issues (especially ones like "Reduce blocking resources"; for more: https://developers.google.com/speed/docs/best-practices/rtt#PreferAsyncResources)
I suggest you run the Google PageSpeed check on some of the URLs.
Note: The URLs from your website that are present in Google's index may show similar issues when run through the PageSpeed test. This should not stop you from addressing these issues.
Response: We will ask the developers to improve performance specifically with the highest value things that are showing up in Google PageSpeed check.
4. Heavy pages leading to higher page loading times and response times:
Many of the pages that I checked are more than 1.3 MB in size, which is very large. This can be a big problem: it not only hurts your website from the search engines' perspective but also leads to a bad user experience, which ultimately affects your site's SEO. You can use tools like gtmetrix.com and fix the issues they report.
Response: We will ask the developers to improve performance specifically with the highest value things that are showing up in gtmetrix.com suggestions.
5. Repetition of keywords or phrases in page titles and URLs:
This issue might look like an over optimization effort and should be fixed as early as possible.
For example: www.wheretobuybeauty.com.au/acqua-di-parma/acqua-di-parma-acqua-di-parma-collezione-barbiere-shaving-cream-75ml_25oz
If you look at the above page, the phrase 'acqua-di-parma' is present twice in both the URL and the page title. This is something that you need to review seriously, as it looks like keyword repetition, which is not good from an SEO standpoint.
Response: This occurs with approx. 300 product pages out of 40,000, so a very small percentage. We will clean this up when we update our data. I can't see that this has any impact on SEO considering the small number? Note however that every product page is constructed as follows:
http://www.wheretobuybeauty.com.au/floris/floris-royal-arms-diamond-edition-eau-de-parfum-spray-100ml_34oz
Is there some risk that this will look like over optimisation?
By the way, your robots.txt file is clean and it should not be causing these issues.
Please have the issues mentioned above fixed as soon as possible, and you should be out of the woods soon after that.
I wish you good luck Alex.
Best regards,
Devanur Rafi.
-
Hi Alex,
Thanks for the info. Here are a few issues that I observed with the website, and I am very confident that if you address and fix these, you will come out of this issue with flying colors:
1. URL canonicalization issue: Both the www and non-www versions of your website's URLs return an HTTP header status code 200. You should redirect all non-www URLs to their respective www versions via a 301 permanent redirect immediately.
2. Inconsistent URL structure: Your website is still using underscores (_) in URLs as word separators, alongside the recommended hyphens (-). This inconsistent usage can sometimes lead to issues, so please replace all the underscores with hyphens.
3. Google PageSpeed check: When I ran the Google PageSpeed test on some of your URLs, along with the ones that you gave, the score varied between 28 and 60. Please look at the recommendations the PageSpeed tool gives and try to address the issues (especially ones like "Reduce blocking resources"; for more: https://developers.google.com/speed/docs/best-practices/rtt#PreferAsyncResources)
I suggest you run the Google PageSpeed check on some of the URLs.
Note: The URLs from your website that are present in Google's index may show similar issues when run through the PageSpeed test. This should not stop you from addressing these issues.
4. Heavy pages leading to higher page loading times and response times:
Many of the pages that I checked are more than 1.3 MB in size, which is very large. This can be a big problem: it not only hurts your website from the search engines' perspective but also leads to a bad user experience, which ultimately affects your site's SEO. You can use tools like gtmetrix.com and fix the issues they report.
5. Repetition of keywords or phrases in page titles and URLs:
This issue might look like an over optimization effort and should be fixed as early as possible.
For example: www.wheretobuybeauty.com.au/acqua-di-parma/acqua-di-parma-acqua-di-parma-collezione-barbiere-shaving-cream-75ml_25oz
It could have been like: www.wheretobuybeauty.com.au/acqua-di-parma/collezione-barbiere-shaving-cream-75ml-25oz
If you look at the above page, the phrase 'acqua-di-parma' is present twice in both the URL and the page title. This is something that you need to review seriously, as it looks like keyword repetition, which is not good from an SEO standpoint.
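A minimal sketch of that clean-up (the `clean_slug` helper and its brand-stripping rule are hypothetical, not part of the site's code; it just applies the two suggestions above, hyphens instead of underscores and the brand name only once):

```python
def clean_slug(brand: str, slug: str) -> str:
    """Normalise a product slug: use hyphens as word separators and
    drop any leading repeats of the brand name."""
    slug = slug.replace("_", "-")
    brand_prefix = brand.replace("_", "-") + "-"
    # Strip the brand from the front as many times as it repeats.
    while slug.startswith(brand_prefix):
        slug = slug[len(brand_prefix):]
    return slug

# The example URL from above:
print(clean_slug(
    "acqua-di-parma",
    "acqua-di-parma-acqua-di-parma-collezione-barbiere-shaving-cream-75ml_25oz",
))  # collezione-barbiere-shaving-cream-75ml-25oz
```

Any such rewrite of live URLs would of course need 301 redirects from the old slugs to the new ones so existing index entries are not lost.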
By the way, your robots.txt file is clean and it should not be causing these issues.
Please have the issues mentioned above fixed as soon as possible, and you should be out of the woods soon after that.
I wish you good luck Alex.
Best regards,
Devanur Rafi.
-
Thanks Devanur
I put this to my partner and he said he is addressing it, but that the main issue still remains.
This is the critical issue: only a few pages are visible to Google search, as almost all are blocked to the Google bot. I am re-stating the problem in this email for you.
Can you please take a look at the whole problem and see if you can see what is causing this.
Is robots.txt causing this? It is the only change that we have made, and at one point the problem was corrected, but it has now returned. I have read everything that I can about robots.txt on the Google site and in forums.
Here are two examples (out of 44,000) that are blocked. It is easy to find other examples – simply test any of the product pages – only 200 out of 44,000 return any result.
Try searching using www.google.com.au and using the search query
Abercrombie & Fitch 1892 Cobalt Eau De Cologne Spray 50ml/1.7oz site:wheretobuybeauty.com.au
Second example:
Try searching using:
Acqua Di Parma Collezione Barbiere Shaving Cream 75ml/2.5oz site:wheretobuybeauty.com.au
The current state:
Google webmaster tools Index Status shows:
26,000 pages indexed
44,000 pages blocked by robots.
In late March, we implemented changes recommended by an SEO expert, Harmeen: he provided a new robots.txt file and advised us to amend sitemap.xml, among other changes. We implemented those changes and then set up a re-index of the site by Google. The number of blocked URLs eventually dropped to 1,000 for a few days in May and June, but now the problem has rapidly returned.
This new site has been re-crawled over the last 4 weeks.
About the site
55,000 URLs in sitemap.xml submitted successfully to webmaster tools
robots.txt file has been modified several times:
Firstly we had none, then we created one but were advised that it needed to have this current content:
“User-agent: *
Disallow:
Sitemap: http://www.wheretobuybeauty.com.au/sitemap.xml”
I put this into robots.txt but was then advised yesterday that there should be no blank lines between these lines, and I removed them yesterday.
-
Hi Alex,
Without diving into the increased number of 404 errors being reported by your Webmaster Tools account, let us first look at the core issue: 404 pages (non-existent resources) that return an HTTP header status code 200. These are called 'soft 404 errors'. Ideally, every non-existent resource on the website should return an HTTP header status code 404 or 410 as the situation requires, not a status 200, which is very confusing for search engines and a bad practice. This should be fixed immediately: please have all such pages return a 404, not a 200, as soon as possible.
Here is more about soft 404 errors:
https://support.google.com/webmasters/answer/181708?hl=en
and here is more about when to return a 404 status code:
https://support.google.com/webmasters/answer/2409439?hl=en
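Since a 404.php is mentioned earlier in the thread, a minimal sketch of the fix (assuming that script is what the server renders for missing pages, which the thread implies but does not show) is to send a real 404 status before any output:

```php
<?php
// 404.php - minimal sketch (assumes this script handles missing pages).
// Emit a real 404 status before any HTML is output, so crawlers no longer
// see a "soft 404" (a 200 response for a resource that does not exist).
http_response_code(404);              // PHP >= 5.4
// header("HTTP/1.1 404 Not Found");  // equivalent on older PHP versions
?>
<!-- existing 404 page markup follows unchanged -->
```

A quick way to confirm the fix is `curl -I` against a made-up URL on the site and checking that the first response line now says 404, not 200.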
Best regards,
Devanur Rafi.