301, 404 or 410? What is the best practice?
-
Hi
I'm currently working on a project to correct some really bad practices left over from years of different SEOs.
Basically, they had created around 1,500 pages for delivery counties and towns, changing only three words on each page.
Apart from the duplicate content issues, this has really hammered the site in the latest round of Panda updates.
I've pulled the pages, but I'm torn on how best to fix this.
The pages won't ever be used again, so I'm thinking a 410 code would be best, but after reading another post (http://moz.com/community/q/server-redirect-query) I'm not sure whether I should just let them return 404s if anyone ever finds them.
Incidentally, I'm disavowing over 1,100 root domains, so it's extremely unlikely anyone will find links to these pages out there.
-
Thanks for the responses. A 410 is a lot of work for probably little gain, so I think I'll run with just leaving the 404s.
I have run an analytics check on the URLs in question and 10 of them had a tiny bit of traffic, so for those only I'll 301 to one relevant page.
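For anyone finding this later, here is roughly what that looks like in our case. This is a minimal sketch assuming an Apache server with mod_alias; the paths are placeholders rather than our real URLs:

    # 301 the handful of removed pages that still get a little traffic
    # to the single most relevant remaining page (placeholder paths):
    Redirect 301 /delivery/somecounty /delivery
    Redirect 301 /delivery/sometown /delivery

    # Everything else that was pulled needs no rule at all:
    # with the pages gone, the server returns 404 by default.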
Thanks again.
-
404 or 410, it does not really matter; you have removed the pages, and that is the main thing.
But to be strictly correct you should use a 410, as it tells search engines the pages are gone forever, while a 404 just means "not found".
-
Hi Paul,
If they are unlikely to have external links pointing at them (or at least no good ones) and they are not linked internally, then I think your best bet is just to let them 404. Either way, you should tidy up your 404 page so it lets users know there has been a site redesign (or similar) and links to the homepage and other important pages.
You could also 410 them, which is said to remove pages from the index more quickly and to be the final word that these pages no longer exist and will never come back, but that might create more overhead than it is worth in terms of setting up different 4xx header responses for different types of pages. In practice, the difference between 404 and 410 responses seems to be very small, according to most things I have read. Since there will be no links to them anyway, letting them 404 is an easy solution and should not create any problems.
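If you do decide the 410 is worth it, the overhead is usually just a single server rule rather than anything per page. A rough illustration for Apache, assuming (and this is only an assumption) that the dead pages share a recognisable URL pattern:

    # Return "410 Gone" for every removed thin delivery page in one rule;
    # mod_alias omits the destination argument for non-3xx statuses.
    RedirectMatch 410 ^/delivery-.*$

If the URLs don't share a pattern you would have to list them individually, which is exactly the extra work that makes a plain 404 the more pragmatic choice here.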
Related Questions
-
Best practices for retiring 100s of blog posts?
Hi. I wanted to ask about best practices for retiring an enterprise blog with hundreds of old posts whose subject matter won't be repurposed. What would be the best course of action to retire them while maintaining the value of any SEO authority those old blog pages have? Is it enough to move the old posts into an archive subdirectory and let Google deprioritize them over time? Would a mass redirect of the old posts to the new blog's home page be acceptable (even though the old content isn't being specifically replaced)? Or would Google say that, without 1:1 replacement URLs, these look like soft 404s and treat them as 404s?
White Hat / Black Hat SEO | David_Fisher
-
Best SEO benefit location (main page text or h1, h2)?
I have learned that h1 carries more value than h2, and h2 more than h3, but let's say I want to place my keywords somewhere: should I include them in the main body, or should I take advantage of the header tags?
White Hat / Black Hat SEO | Sam09schulz
-
What would be the best course of action to nullify the negative effects of our website's content being duplicated (negative SEO)?
Hello, everyone. About 3 months ago I joined a company that manufactures transportation and packaging items. Once I started digging into the website, I noticed that a lot of its content appeared to be "plagiarized". I use quotes because it really was not: the site seems to have been hit with a negative SEO campaign last year in which its content was scraped and posted across at least 15 different websites. Literally every page on the website had the same problem, and some of the content was even company-specific (going as far as using the company's very unique name). In all my years of working in SEO and marketing I have never seen something at this scale. Sure, there are always spammy links here and there, but this seems very deliberate. In fact, some of the duplicate content was posted on legitimate websites that may have been hacked or compromised (some examples include charity websites). I am wondering if there is anything I can do besides contacting the webmasters of these websites and nicely asking for the content to be removed? Or does this duplicate content not hold as much weight as it used to, especially since our content was posted years before the duplicates started popping up? Thanks,
White Hat / Black Hat SEO | Hasanovic
-
How to 301 redirect from high authority sites to your own website?
How do you set up 301 redirects from high-authority sites to your own website? If anyone knows, can you tell me? Such gigs are being sold on Fiverr.
White Hat / Black Hat SEO | jefjaa
-
Should I 301 redirect my old site or just add a link to my new site?
I used to offer design and web services on a site that is currently blank (no content, no links). My question is: should I add a little bit of content, maybe a brief explanation with a link to my new site, or should I just add a 301 redirect? This is purely a question of what is better for SEO and ranking for my new site (not a branding question).
White Hat / Black Hat SEO | Tyrell
-
Best Website Security Service
Having been hacked twice and, it seems, still suffering from problems as a result (file changes and alterations, etc.), I'm wondering if there is a good paid service that provides security for websites. I'm an online travel agent, so our website's up-time and search engine position are vital, but I'm spending more time trying to sort out our website's security than selling travel. It would be sensible to pay for a service that would ensure, as much as possible, that our site is secure and that any optimisation we do is not cancelled out by hacking and security problems. If anyone could recommend a good company I would be very grateful. Colin
White Hat / Black Hat SEO | NileCruises
-
301 redirect from an unwanted non-affiliated domain
Hey everyone, Our site was hit with an unnatural link penalty a few weeks ago, and it looks very much like a malicious link attack. We've never participated in any link schemes or anything resembling black hat link building. In fact, we've barely done any link building at all. Reviewing our backlink profile, there are hundreds of links from just a few (very dodgy) domains that we have no control over, and we've had no luck contacting the webmasters to get them removed. More importantly, we've seen several domains that 301 redirect straight to our domain. We don't know who they are and they're not contactable. I believe the redirects mean they're passing link value from all of their terrible links to us? How can we get the redirects removed? At worst they're really harming us, and at best they're adding no value. Thanks
White Hat / Black Hat SEO | Sarbs
-
301 Redirect ASP code
Hi, I have a script (detailed below) that 301 redirects based upon different queries:

    <% if (Request("offset") = "") Then %>
    <% if Request("keywords") = "" AND Request("s") <> "" AND Request("j") <> "" then 'Sector and Location NOT NULL %>
    <% if (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) Then
      Response.Status = "301 Moved Permanently"
      Response.AddHeader "Location", "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
      Response.End
    End If %>
    <% End if %>
    <% if Request("keywords") = "" AND Request("s") <> "" AND Request("j") = "" then 'Sector NOT NULL and Location NULL %>
    <% if (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(SiteDetails.Fields.Item("JBSRegion"))) Then
      Response.Status = "301 Moved Permanently"
      Response.AddHeader "Location", "/" & LCase(SEOFriend(replaces.Fields.Item("JBCategoryLabel"))) & "-jobs-in-" & LCase(SEOFriend(SiteDetails.Fields.Item("JBSRegion")))
      Response.End
    End If %>
    <% End if %>
    <% if Request("keywords") = "" AND Request("s") = "" AND Request("j") <> "" then 'Sector NULL and Location NOT NULL %>
    <% if (Request.ServerVariables("HTTP_X_REQUEST_URI")) <> "/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation"))) Then
      Response.Status = "301 Moved Permanently"
      Response.AddHeader "Location", "/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))
      Response.End
    End If %>
    <% End if %>
    <% End if %>

But this still allows both the www and non-www versions of these pages to render in the browser, which is resulting in duplicate content. On my home page I use:

    <% If InStr(Request.ServerVariables("SERVER_NAME"), "www") = 0 Then
      Response.Status = "301 Moved Permanently"
      Response.AddHeader "Location", "http://www." & Request.ServerVariables("HTTP_HOST") & "/"
      Response.End
    End if %>

Is there a good way to combine these, so that I still get all of the rules of the first script whilst also redirecting any non-www version to the www version? In other words,

    domain.com/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))

would redirect to

    www.domain.com/jobs-in-" & LCase(SEOFriend(replacej.Fields.Item("JBLocation")))

Thanks in advance.
White Hat / Black Hat SEO | TwoPints