Google Cache can't keep up with my 403s
-
Hi Mozzers,
I hope everyone is well.
I'm having a problem with my website and 403 errors shown in Google Webmaster Tools. The problem arises because every few days we "unpublish" one of the thousands of listings on the site - the unpublished URL then returns a 403. At the same time we run some code that removes any internal links to these pages. So far so good.
Unfortunately Google doesn't notice that we have removed these internal links and so tries to access these pages again. This results in a 403.
These errors show up in Google Webmaster Tools, and when I click on "Linked From" I can verify that there are no links to the 403 page - it's just Google's cache being slow.
My questions are:
a) How much is this hurting me?
b) Can I fix it?
All suggestions welcome and thanks for any answers!
-
Hi Ray-pp,
Thanks for this. I think we will redirect to similar pages.
Much appreciated!
-
So... why return a 403 Forbidden? A 404 Not Found is what you should return - it sends a stronger "this page is gone" signal than a 403. Either way, both will eventually lead to the pages being de-indexed. If you need the pages gone faster, you can manually request removal of a URL in Webmaster Tools.
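The advice above boils down to one decision in the request handler: an unpublished listing should answer 404 (gone), not 403 (forbidden). A minimal sketch of that decision, using a hypothetical in-memory `LISTINGS` store in place of the site's real database:

```python
# Hypothetical in-memory store; a real site would query its database.
LISTINGS = {
    "/listings/villa-123": {"published": True},
    "/listings/flat-456": {"published": False},  # unpublished a few days ago
}

def status_for(path):
    """Pick the HTTP status code for a listing URL."""
    listing = LISTINGS.get(path)
    if listing is None or not listing["published"]:
        # 404 Not Found tells crawlers the page is gone - a clearer
        # de-indexing signal than 403 Forbidden, which means "you may
        # not access this" rather than "this no longer exists".
        return 404
    return 200
```

The same check would live in whatever framework the site actually runs on; the point is only that unpublished and never-existed URLs both get a 404.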
-
Hi HireSpace,
a) The negative impact depends on:
- Is there traffic landing on this page from any outside channel (organic, referral, paid marketing)?
If so, then yes it is probably hurting your site. If a visitor sees a 403 page a common response is to go directly back to the referring page, i.e. they leave your site.
- Did the 403'd page have external links pointing to it?
If yes, then a 403 error would cause the link authority to drop, since you do not redirect that page to another page on your site.
- As far as SEO is concerned, no, this isn't negatively impacting your site.
When Google sees a 403 error it handles it much like any other 4xx error. It won't penalize you; however, having a lot of 4xx errors can be an indication of poor usability, and we know how Google loves to introduce new ranking factors for the SERPs.
b) Can I fix it?
Yes. For any page removed from your site, I suggest you 301 the page to its closest related page. This tells Google that the page has permanently moved, passes any authority to the new page, and automatically redirects anyone landing on the old URL. You'll see the 403 errors decrease as Google crawls your site and recognizes the 301 redirects.
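The 301 approach above is just a lookup from each removed URL to its closest related live page, answered with a permanent redirect. A minimal sketch, where `REDIRECTS` is a hypothetical mapping the site would maintain whenever a listing is unpublished:

```python
# Hypothetical mapping from removed listing URLs to their closest
# related live pages; a real site would persist this when unpublishing.
REDIRECTS = {
    "/listings/flat-456": "/listings/flats-in-london",
}

def respond(path):
    """Return (status_code, Location header) for a request path."""
    target = REDIRECTS.get(path)
    if target is not None:
        # 301 Moved Permanently: crawlers follow it, pass link
        # authority to the target, and drop the old URL over time.
        return 301, target
    return 200, None
```

In production the same mapping would usually be expressed as server redirect rules (e.g. in the web server config) rather than application code, but the logic is identical.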