Best practices for temporary articles
-
Hello,
I would like expert input on the best way to manage temporary content.
In my case, I have a page (e.g. mydomain.com/agenda) listing temporary articles, some with a lifetime of 1 to 6 months.
My articles also have specific URLs, e.g. mydomain.com/agenda/12-02-2011/thenameofmyarticle/
As you can guess, I end up with hundreds of 404s.
I'm already using the canonical tag; should I also use a meta robots noindex tag on the listing page? I'm a bit lost here...
-
Thank you Egol
-
Thanks Richard.
I'm going to try this.
-
Thanks Aran!
-
Script to 301-redirect an expired page to its new URL:
<?php
$location = "http://www.YourSite.com/"; // destination URL
header("HTTP/1.1 301 Moved Permanently");
header("Location: {$location}");
exit;
-
We have temporary content and evergreen content.
When a page of temporary content is created it is filed in a folder according to its "expiration date". On that date the folder is 301 redirected to an appropriate destination. However, before the redirect is done we run analytics on the folder to see if any files are pulling traffic from SERPs or links from other websites. We then try to create evergreen content on the same topic that will capture that traffic and redirect the specific files to the new evergreen content.
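The folder-level redirect described above could be sketched in .htaccess like this (hedged: this assumes an Apache server with mod_rewrite enabled, and the folder name and destination URL are invented examples, not actual paths from the site):

```apache
# Hypothetical example: 301-redirect everything under an expired,
# date-based folder to an evergreen replacement page.
# "expires-2011-03" and "/evergreen/festivals/" are placeholder names.
RewriteEngine On
RewriteRule ^agenda/expires-2011-03/.* /evergreen/festivals/ [R=301,L]
```

Files that were pulling traffic or links would get their own specific RewriteRule to the new evergreen page before this catch-all fires.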
-
It seems so unnatural to want to actually remove content when we spend so long striving to create awesome content!
-
You can use the meta robots tag as you mentioned in your question; this will prevent search engines from indexing the pages. Unfortunately, we also need to tackle the human side of the issue: if anyone links to the article, that link will eventually result in a 404 page.
There is nothing wrong with a 404 page; they serve an important purpose. Since your articles are not around very long and are not being indexed by search engines, I see no reason not to simply leave the 404 in place.
Ensure you have a custom 404 page that is an informative and helpful resource rather than a simple "404 Page not found" message. Use the 404 to direct the visitor to a category-level page related to the topic of the article, and offer a simple list of links to various parts of the site that may be of interest.
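A minimal sketch of wiring up such a custom 404 on an Apache server (hedged: the file name /404.php is a placeholder for whatever helpful page you build):

```apache
# Hypothetical .htaccess directive: serve a helpful custom page
# (category links, search box, etc.) whenever a URL is not found.
ErrorDocument 404 /404.php
```

Note that the custom page itself should still send the 404 status code (in PHP, `header("HTTP/1.1 404 Not Found");` at the top of the file) so that search engines don't treat the error page as a real, indexable page.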
Check out these SEOmoz articles:
www.seomoz.org/blog/personalizing-your-404-error-pages
www.seomoz.org/blog/are-404-pages-always-bad-for-seo
Hope this helps.
-
I agree with Aran: set up an archive system that keeps the articles under the same URL but does not show them live on the website.
Alternatively, you could set up a dumping "archive" folder where you drop all old articles and use that link as your rel canonical target.
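For illustration (the archive URL here is a made-up example, not an actual page on the site), the tag in the `<head>` of an old article would point at the archive page:

```html
<!-- Hypothetical example: an expired article declaring the
     archive page as its canonical URL -->
<link rel="canonical" href="http://mydomain.com/agenda/archive/" />
```

Be aware that rel="canonical" is meant for duplicate or near-duplicate content; pointing many unrelated articles at one archive page is a hint search engines may choose to ignore.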
-
Hello Arcanis,
Yes, we have a destination URL for this content; I just don't know how to manage it when it disappears...
-
Hello Aran,
Thanks for your answers!
Unfortunately no: since the content is very "dated" (e.g. a 3-day music festival), we don't keep archives of this kind of content.
-
If you are using the canonical tag, what is the context for that tag? Do you already have a destination URL for these temporary articles?
-
Would it be possible to 'Archive' articles after the 1-6 month period?
Archive could just be a database flag that keeps the articles from appearing in the article index, thus keeping the same URL but not clogging up the main site with hundreds of links to expired articles.
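Sketched as SQL (hedged: the table and column names here are invented for illustration, not taken from the actual site's schema), the flag approach might look like:

```sql
-- Hypothetical schema change: add an "archived" flag to the articles table.
ALTER TABLE articles ADD COLUMN archived TINYINT NOT NULL DEFAULT 0;

-- The article index only lists live articles...
SELECT title, url FROM articles WHERE archived = 0;

-- ...while the page that renders a single article ignores the flag,
-- so every old URL keeps resolving instead of returning a 404.
```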