Can I use a 410'd page again at a later time?
-
I have old pages on my site that I want to 410 so they're completely removed. But if, down the road, I want to use that URL again, can I just remove the 410 response code, put new content on that page, and have it indexed again?
-
I've tested this a couple of times and each result has been: no.
If you 410 a page, Google does a bloody good job of never bringing that URL back. I've changed the response code back from a number of 410s, socially shared the page, built natural links to it, even blasted links to it (for testing's sake, of course!), and Google has never reindexed it.
It's both good and bad - good in the sense that it really uses the 410 response code properly, but bad in the sense that, if you make a mistake and want to reuse the URL, you're pretty screwed.
If you're going to use it, be absolutely 100% sure you won't have any use for that URL again (such as offering the product again in the future). If there is any doubt, leave it as a 404 or redirect it.
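For reference, here's roughly how those three options look in an Apache .htaccess file using mod_alias. This is just a sketch; the paths are placeholders, and your server setup may differ:

```apache
# 410 Gone: tells Google the URL is permanently removed (hard to undo)
Redirect gone /discontinued-product

# 404 Not Found: the softer option if you might reuse the URL later
Redirect 404 /maybe-later-page

# 301: point the old URL at a live replacement instead
Redirect 301 /old-page /new-page
```

The practical takeaway matches the answer above: reach for `gone` only when you're certain, and prefer a 404 or a 301 when there's any chance the URL comes back.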
Related Questions
-
How can I provide titles and descriptive text for our list of USPs on the same page, optimized both for usability and SEO?
I am rebuilding our website together with an agency and I am stuck with the following problem: we have a page that should give visitors a quick, convincing impression of why they should choose our company. On this page we want to show our USPs (Unique Selling Points), each with a title and a short description. My preferred way of presenting the USPs would be a list of the titles (which lets you see all USPs without having to read a lot of text), where each title can be clicked to expand its description (in case you want to know more about that specific USP); clicking another title collapses the previous description and expands the new one (similar to this page: http://www.berlin-city-immobilien.de/38.html, specifically the list in the middle of the page under the headline "Dabei profitieren Sie von folgenden Vorteilen"). Since I also want to use these descriptions as on-page SEO text, I checked whether Google might not index, or at least devalue, "click to expand" content compared to plain text in the body of the page, and I came across this article: https://www.seroundtable.com/google-hidden-tab-content-seo-19489.html. According to the article, Google will definitely discount the descriptions on my page. Does anyone have an idea how to solve this? Either a different way to show the titles and descriptions, or a workaround so Google doesn't treat the descriptions as "click to expand" text. Thank you in advance for your input.
Technical SEO | Benni
John Mueller says don't use Schema as its not working yet but I get markup conflicts using Google Mark-up
I recently watched John Mueller's Google Webmaster Hangout [Dec 5th]. In it he tells a member not to use Schema.org, as it's not working quite yet, but to use Google's own markup tool, the Structured Data Markup Helper. Fine, this I have done, and one of the tags I've used is "author". However, if you run the page through Google's Structured Data Testing Tool in GWMT, you get the following error: Page contains property "author" which is not part of the schema. Yet this is the tag generated by their own tool. Has anyone experienced this before? And if so, what action did you take to rectify it and make it work? As it stands I'm considering just removing this tag altogether. Thanks, David
Technical SEO | David-E-Carey
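One common cause of that testing-tool error is the author property being emitted outside an item type that supports it; in schema.org markup, author should nest under a parent type such as Article. A minimal JSON-LD sketch with placeholder values (this assumes an article page; your actual type and fields will differ):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example post title",
  "author": {
    "@type": "Person",
    "name": "David"
  }
}
```

This object goes in a `<script type="application/ld+json">` tag in the page head; because the author sits inside an Article, validators can see which schema the property belongs to.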
Can iFrames count as duplicate content on either page?
Hi All, Basically what we want to do is insert an iframe with some text in it onto a lot of different pages on one website. Does Google crawl the content that is in an iFrame? Thanks
Technical SEO | cttgroup
Google using descriptions from other websites instead of site's own meta description
In the last month or so, Google has started displaying a description under links to my home page in its search results that doesn't actually come from my site. I have a meta description tag in place and for a very limited set of keywords, that description is displayed, but for the majority of results, it's displaying a description that appears on Alexa.com and a handful of other sites that seem to have copied Alexa's listing, e.g. similarsites.com. The problem is, the description from these other sites isn't particularly descriptive and mentions a service that we no longer provide. So my questions are: Why is Google doing this? Surely that's broken behaviour. How do I fix it?
Technical SEO | antdesign
Pages noindex'ed. Submit removal request too?
We had a bunch of catalog pages set to "noindex,follow". Now should we also submit a removal request in WMT for these pages? Thank you! LL
Technical SEO | LocalLocal
Is there ever a time when 301 redirects aren't possible?
I have been told that 301 redirects are always possible. I've also been told that it's a very time-consuming process, so developers will sometimes say it's not possible. Is there ever a time when it is genuinely not possible? Perhaps on a specific server? I know it's doable in Apache, which is the server in question. Would it be impossible if someone were using a templated set of websites, where changes made on one website are applied across all of them? *Edit "due to a server configuration 301 redirects aren't possible" Thanks so much for any help or answers you can provide.
Technical SEO | DCochrane
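Since Apache is the server in question above, a 301 is usually a one-liner in .htaccess via mod_alias, or a rewrite rule via mod_rewrite for whole folders. A sketch with placeholder paths:

```apache
# One-to-one redirect with mod_alias
Redirect 301 /old-page.html /new-page.html

# Pattern-based redirect with mod_rewrite, e.g. an entire folder
RewriteEngine On
RewriteRule ^old-folder/(.*)$ /new-folder/$1 [R=301,L]
```

On a templated multi-site setup, the catch is usually organisational rather than technical: a shared .htaccess or config change propagates to every site, so per-site redirects may need per-site config files, not that 301s are impossible.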
Page crawling is only seeing a portion of the pages. Any Advice?
The last couple of page crawls have returned only 14 out of 35 pages. Are there any steps I can take?
Technical SEO | cubetech
How do I use the Robots.txt "disallow" command properly for folders I don't want indexed?
Today's sitemap webinar made me think about the disallow feature, which seems like the opposite of sitemaps, but both also seem to be ignored in varying ways by the engines. I don't need help semantically, I got that part. I just can't seem to find a contemporary answer about what should be blocked using the robots.txt file. For example, I have folders containing site comps for clients that I really don't want showing up in the SERPs. Is it better to not have these folders on the domain at all? There are also security concerns I've heard of that make sense: simply look at a site's robots file to see what they are hiding; it makes it easier to hunt for files when you know the directory they're contained in. Should I be concerned about this? Another example is a folder I have for my XML sitemap generator. I imagine Google isn't going to try to index this or count it as content, so do I need to add folders like this to the disallow list?
Technical SEO | SpringMountain
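To illustrate the question above, a minimal robots.txt blocking hypothetical folders like the client-comps and sitemap-generator directories might look like this (the folder names are placeholders):

```
User-agent: *
Disallow: /client-comps/
Disallow: /sitemap-generator/

Sitemap: https://www.example.com/sitemap.xml
```

Worth noting: Disallow only stops crawling, not indexing, and robots.txt is publicly readable, so for genuinely sensitive client comps the safer options are exactly the ones the question suspects: password-protect the directory or keep it off the domain entirely, rather than advertising its path in robots.txt.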