Benefit of using 410 Gone over 404?
-
It seems like it takes Google Webmaster Tools forever to realize that some pages, well, are just gone.
Truth is, the 30k-plus pages showing 404 errors were due to a big change in the site's URL architecture.
I wonder: is there any benefit to using 410 Gone as a temporary measure to speed things up in this case?
Or, when would you use a 410 Gone?
Thanks
-
I had the (mis)fortune of trying to deindex nearly 2 million URLs across a couple of domains recently, so I had plenty of time to play with this.
Like CleverPhD, I was not able to measure any real difference in the time it took to remove a page that had been 410'd vs. one that had been 404'd.
The biggest factor governing the removal of the URLs was getting all the pages recrawled. Don't underestimate how long that can take. We ended up creating crawlable routes back to that content to help Google keep visiting those pages and updating the results.
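For what it's worth, here's a minimal sketch of that "crawlable route" idea: a throwaway HTML page linking to the dead URLs so the bot keeps revisiting them and sees the 404/410. The file names (dead_urls.txt, removed-urls.html) are just placeholders I made up; at 2 million URLs you'd want to split this across many paginated pages.

```python
from pathlib import Path

def build_recrawl_page(url_file: str, out_file: str) -> None:
    """Write a bare HTML page linking to every removed URL so crawlers revisit them."""
    urls = [u for u in Path(url_file).read_text().splitlines() if u.strip()]
    links = "\n".join(f'<li><a href="{u}">{u}</a></li>' for u in urls)
    Path(out_file).write_text(
        f"<!doctype html>\n<html><body><ul>\n{links}\n</ul></body></html>\n"
    )

# Hypothetical input: one removed URL per line.
build_recrawl_page("dead_urls.txt", "removed-urls.html")
```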
-
The 410 is supposed to be more definitive:
http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html
404 is "not found" vs 410 is "gone
10.4.5 404 Not Found
The server has not found anything matching the Request-URI. No indication is given of whether the condition is temporary or permanent. The 410 (Gone) status code SHOULD be used if the server knows, through some internally configurable mechanism, that an old resource is permanently unavailable and has no forwarding address. This status code is commonly used when the server does not wish to reveal exactly why the request has been refused, or when no other response is applicable.
10.4.11 410 Gone
The requested resource is no longer available at the server and no forwarding address is known. This condition is expected to be considered permanent. Clients with link editing capabilities SHOULD delete references to the Request-URI after user approval. If the server does not know, or has no facility to determine, whether or not the condition is permanent, the status code 404 (Not Found) SHOULD be used instead. This response is cacheable unless indicated otherwise.
The 410 response is primarily intended to assist the task of web maintenance by notifying the recipient that the resource is intentionally unavailable and that the server owners desire that remote links to that resource be removed. Such an event is common for limited-time, promotional services and for resources belonging to individuals no longer working at the server's site. It is not necessary to mark all permanently unavailable resources as "gone" or to keep the mark for any length of time -- that is left to the discretion of the server owner.
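If you want to actually serve the 410 for the removed paths (and let everything else fall through to a 404), here's a minimal sketch using a Flask catch-all route. This is just an illustration, not anyone's production setup, and the GONE_PATHS set is a placeholder:

```python
from flask import Flask, abort

app = Flask(__name__)

# Paths known to be permanently removed; placeholder values for illustration.
GONE_PATHS = {"/old-category/widget-1", "/old-category/widget-2"}

@app.route("/<path:subpath>")
def catch_all(subpath):
    if f"/{subpath}" in GONE_PATHS:
        abort(410)  # permanently gone, no forwarding address
    abort(404)      # not found; could be temporary (RFC 2616 10.4.5)
```

In practice you'd more likely do this in your web server config or from a database of retired URLs rather than a hard-coded set, but the status-code logic is the same.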
That said, I had a similar issue on a site with a couple thousand pages and went with the 410; I'm not sure it really made things disappear any faster than the 404 (that I noticed).
I just found a post from John Mueller of Google:
https://productforums.google.com/forum/#!topic/webmasters/qv49s4mTwNM/discussion
"In the meantime, we do treat 410s slightly differently than 404s. In particular, when we see a 404 HTTP result code, we'll want to confirm that before dropping the URL out of our search results. Using a 410 HTTP result code can help to speed that up. In practice, the time difference is just a matter of a few days, so it's not critical to return a 410 HTTP result code for URLs that are permanently removed from your website, returning a 404 is fine for that. "
So, go ahead and use the 410; with 30k pages, even a difference of a few days may be noticeable.
All of that said, are you sure that with a site that big you would not need to 301 some of those pages? If you have a bunch of old news items or blog posts, would you not want to redirect them to the new URLs for those same assets? Seems like you should be able to recover some of them, at least your top-traffic pages.
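As a rough sketch of that approach (again illustrative only; the URL map here is made up), you could keep a map of old URLs to new ones, 301 anything recoverable, and 410 the rest:

```python
from flask import Flask, abort, redirect

app = Flask(__name__)

# Illustrative old-to-new URL map; in practice you would generate this
# from the before/after states of the architecture change.
REDIRECT_MAP = {
    "/old-blog/some-post": "/blog/some-post",
    "/old-news/product-launch": "/news/product-launch",
}

@app.route("/<path:subpath>")
def legacy(subpath):
    new_url = REDIRECT_MAP.get(f"/{subpath}")
    if new_url:
        return redirect(new_url, code=301)  # permanent redirect keeps link equity
    abort(410)  # no new home for this URL
```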
Cheers