Dealing with past events
-
Hi
We have a website that lists both upcoming and past events. Currently everything is indexed by Google with no real issues (it usually finds the most up-to-date events), and we have deprioritised the past events in the sitemap.
Do I need to go one step further and noindex events that are past, or just leave it as is? They don't really hold much value, but sometimes they will have a number of incoming links and social media shares pointing to them. We want to keep the pages active for visitors; I'm just wondering about Google. (There's also no real link between past and future events, so it's difficult to 'point' to a newer version of an event.)
We have approx 1M 'past' events and growing, so it's a big change. Also, would you keep them in the sitemap with a lower priority, or just remove them?
EDIT: Just seen a Matt Cutts post from 2014 which indicates that an 'unavailable_after' meta tag might be best?
-
Hello benseb,
You mention that you have de-prioritized past events in the sitemap. You could go the noindex route, although this is a somewhat blunt way to go about it.
I think, based on what you have described, your best bet is to leave it as is (after acting on the hint Matt Cutts dropped) rather than eliminating a load of content that is sending Google positive signals. My guess is that those positive signals outweigh any negative signals that might result from aging content.
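For reference, that `unavailable_after` directive tells Google it may drop the page from the index after a given date, which fits event pages that expire naturally. Below is a minimal sketch of generating the tag at template render time. The helper name is made up, and the date format follows Google's original `unavailable_after` announcement, so verify it against current documentation before rolling anything out across a million pages.

```javascript
// Sketch: emit an unavailable_after robots meta tag for a past-event page.
// Assumption: dates formatted per Google's original announcement
// (e.g. "25-Aug-2014 15:00:00 GMT"); the function name is illustrative.
function unavailableAfterTag(eventEndDate) {
  const months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
                  "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"];
  const pad = (n) => String(n).padStart(2, "0");
  const d = eventEndDate;
  const stamp =
    `${pad(d.getUTCDate())}-${months[d.getUTCMonth()]}-${d.getUTCFullYear()} ` +
    `${pad(d.getUTCHours())}:${pad(d.getUTCMinutes())}:${pad(d.getUTCSeconds())} GMT`;
  return `<meta name="googlebot" content="unavailable_after: ${stamp}">`;
}

// e.g. for an event that ended 1 June 2014 at noon UTC:
// unavailableAfterTag(new Date(Date.UTC(2014, 5, 1, 12, 0, 0)))
```

The nice property for a site with 1M past events is that the tag is set once at publish time and never needs revisiting, unlike retroactively noindexing.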
If everything has been properly indexed and current events are showing up, I wouldn't make any big alterations - why mess with a good thing?
If you begin seeing drastic declines in traffic or user interaction, that might be the time to take a harder stance. For now though, let it be.
Best of luck!
Rob
Related Questions
-
How to deal with canonicals on dup product pages in Opencart?
So I have a seriously large number of duplicate content problems on my OpenCart site, and I've been trying to figure out the best way to fix them one by one. But is there a common, easy way of doing this? Because frankly, it's a nightmare otherwise. I bought an extension which doesn't appear to work (http://www.opencart.com/index.php?route=extension/extension/info&extension_id=20468&utm_source=ordercomplete&utm_medium=email&utm_campaign=wm), so now I'm at a loss.
Intermediate & Advanced SEO | moon-boots
-
Are 1x Event pages considered thin content? Should they be archived or redirected?
Since past event pages will become stale after the event, should they be kept alive and archived, with only a link from a couple of places (for instance the main event page and HTML sitemap)? Or should they be "retired" and redirected to the main event page if they are really no longer needed? They would probably be considered thin content because they won't have much traffic and will have very few links pointing to them. Right? Thanks. Inquiring minds want to know... 😉
Intermediate & Advanced SEO | cindyt-17038
-
Scraped Content on Foreign Language Site. Big deal or not?
Hi All, I've been lurking and learning from this awesome Q&A forum, and I finally have a question. I am working on SEO for an entertainment site that tends to get scraped from time to time. Often, the scraped content is then translated into a foreign language, and posted along with whatever pictures were in the article. Sometimes a backlink to our site is given, sometimes not. Is scraped content that is translated to a foreign language still considered duplicate content? Should I just let it go, provided a backlink is given? Thanks!
Jamie
Intermediate & Advanced SEO | MKGraphiques
-
Best practices with recurring event listings
On our client's events page there are a few recurring events that each have their own detail page. I'm trying to figure out the best practice for minimising duplicate content. For example, for the Bribie Island Markets, which repeat weekly, there are 2 (+ more) detailed event pages:
http://www.ourbribie.com/e/bribie-island-markets/1869/2013-12-07/2013-12-07
http://www.ourbribie.com/e/bribie-island-markets/1869/2013-12-14/2013-12-14
While they both contain duplicated content, they're unique in that they display the specific event date/time. My thinking is that the future events (e.g. 2013-12-14) should have a canonical link to the upcoming/next event (i.e. 2013-12-07). However, this would require constantly updating/changing the canonical links. What's the best way to deal with this from a duplicate content perspective? Any better recommendations?
Intermediate & Advanced SEO | michaelp85
-
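The rotating-canonical idea in the recurring-events question above can be sketched as a small helper that, given every occurrence date in a series, picks the one the other pages should declare as canonical: the next upcoming occurrence, falling back to the most recent past one. This is illustrative only; the function names are made up and the URL pattern is inferred from the example URLs. Many sites sidestep the constant churn entirely by canonicalising every dated occurrence to a single date-free series page instead.

```javascript
// Pick the canonical occurrence for a recurring event: the next upcoming
// date, or the latest past date once the series has ended. Dates are ISO
// yyyy-mm-dd strings, which sort correctly as plain strings.
function canonicalOccurrence(occurrenceDates, today) {
  const sorted = [...occurrenceDates].sort();
  const upcoming = sorted.find((d) => d >= today);
  return upcoming || sorted[sorted.length - 1];
}

// Build the canonical URL following the pattern in the example URLs above.
function canonicalUrl(slug, id, date) {
  return `http://www.ourbribie.com/e/${slug}/${id}/${date}/${date}`;
}

// e.g. canonicalOccurrence(["2013-12-14", "2013-12-07"], "2013-12-05")
//      -> "2013-12-07"
```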
How to deal with duplicates on an e-commerce website
Hi guys, So we have an e-commerce website and we have some products that are exactly the same but come in different colours. Let's say, for example, we have a Samsonite Chronolite, and this bag comes in 55cm, 65cm and 75cm variations. The same bag may also come in 4 different colours. The bags are the same and therefore have the same information, besides the title tag varying due to the size and colour; the descriptions are the same. How do I avoid Google thinking I am duplicating pages or have duplicated pages? Google thinks we have duplicates when the scenario is as I have explained. Any suggestions? Best regards,
Intermediate & Advanced SEO | iBags
-
How do I best deal with pages returning 404 errors as they contain links from other sites?
I have over 750 URLs returning 404 errors. The majority of these pages have backlinks from other sites; however, the credibility of those pages, from what I can see, is somewhat dubious: mainly forums and sites with low DA & PA. It has been suggested placing 301 redirects on these pages, a nice easy solution; however, I am concerned that we could do more harm than good to our site's credibility and link building strategy going into 2013. I don't want to redirect these pages if it's going to cause a Panda/Penguin problem. Could I request manual removal or something of this nature? Thoughts appreciated.
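One way to act on the triage described above is to decide per URL rather than blanket-301ing all 750: redirect only the dead URLs that have both real backlinks and a genuinely relevant destination page, and let the rest return a clean 404. A rough sketch follows; the data shape and the authority threshold are assumptions for illustration, not anything from the question.

```javascript
// Split dead URLs into "301 to the closest relevant page" and "leave as 404".
// pages: [{ url, backlinks, authority, bestMatch }], a hypothetical shape
// you might export from a backlink tool; bestMatch is the most relevant
// live page, or null if nothing on the site genuinely matches.
function redirectPlan(pages, minAuthority = 20) {
  const plan = { redirect: {}, leave404: [] };
  for (const p of pages) {
    if (p.bestMatch && p.backlinks > 0 && p.authority >= minAuthority) {
      plan.redirect[p.url] = p.bestMatch; // worth preserving with a 301
    } else {
      plan.leave404.push(p.url); // low-value link sources: a 404 is fine
    }
  }
  return plan;
}
```

The `plan.redirect` map can then be turned into whatever redirect config the server uses; the point is that a 404 for a page with only junk links is a perfectly healthy response.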
Intermediate & Advanced SEO | Towelsrus
-
Best way to deal with multiple languages
Hey guys, I've been trying to read up on this and have found that answers vary greatly, so I figured I'd seek your expertise. When dealing with the URL structure of a site that is translated into multiple languages, is it better SEO-wise to structure the site like this:
domain.com/en
domain.com/it
etc., or to simply add URL parameters like:
domain.com/?lang=en
domain.com/?lang=it
In the first example, I'm afraid Google might see my content as duplicate even though it's in a different language.
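Whichever structure is chosen, the usual way to keep Google from treating the translations as duplicates is reciprocal hreflang annotations in each page's head. A minimal sketch, assuming the path-prefix structure from the question (the domain, path, and language list are just examples):

```javascript
// Emit one <link rel="alternate" hreflang="..."> per language version of a
// page, assuming a /en, /it, ... path-prefix URL structure. Every language
// version should carry the full set, including a reference to itself.
function hreflangTags(domain, path, langs) {
  return langs
    .map((l) => `<link rel="alternate" hreflang="${l}" href="https://${domain}/${l}${path}">`)
    .join("\n");
}

// e.g. hreflangTags("domain.com", "/pricing", ["en", "it"])
```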
Intermediate & Advanced SEO | CrakJason
-
How to deal with old, indexed hashbang URLs?
I inherited a site that used to be in Flash and used hashbang URLs (i.e. www.example.com/#!page-name-here). We're now off Flash and have a "normal" URL structure that looks something like this: www.example.com/page-name-here
Here's the problem: Google still has thousands of the old hashbang (#!) URLs in its index. These URLs still work because the web server doesn't actually read anything that comes after the hash. So, when the web server sees this URL, www.example.com/#!page-name-here, it basically renders this page, www.example.com/#, while keeping the full URL structure intact (www.example.com/#!page-name-here). Hopefully, that makes sense.
So, in Google you'll see this URL indexed (www.example.com/#!page-name-here), but if you click it you are essentially taken to our homepage content (even though the URL isn't exactly the canonical homepage URL, which should be www.example.com/).
My big fear here is a duplicate content penalty for our homepage. Essentially, I'm afraid that Google is seeing thousands of versions of our homepage. Even though the hashbang URLs are different, the content (i.e. title, meta description, page content) is exactly the same for all of them. Obviously, this is a typical SEO no-no. And I've recently seen the homepage drop like a rock for a search of our brand name, which has ranked #1 for months. Now, admittedly, we've made a bunch of changes during this whole site migration, but this #! URL problem just bothers me. I think it could be a major cause of our homepage tanking for brand queries.
So, why not just 301 redirect all of the #! URLs? Well, the server won't accept traditional 301s for the #! URLs because the # seems to screw everything up (the server doesn't acknowledge what comes after the #). I "think" our only option here is to try to add some 301 redirects via JavaScript. Yeah, I know that spiders have a love/hate (well, mostly hate) relationship with JavaScript, but I think that's our only resort... unless someone here has a better way?
If you've dealt with hashbang URLs before, I'd LOVE to hear your advice on how to deal with this issue. Best, -G
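For what it's worth, the JavaScript-redirect fallback described above can be kept very small. The mapping from `#!page-name-here` to `/page-name-here` is an assumption about this site's URL scheme, and note this is a client-side hop rather than a true 301, so Google only sees it if it executes the script:

```javascript
// Map a hashbang fragment to its clean-URL equivalent, or null if the
// fragment isn't a hashbang ("#!...") fragment.
function hashbangToPath(hash) {
  if (!hash || !hash.startsWith("#!")) return null;
  return "/" + hash.slice(2); // "#!page-name-here" -> "/page-name-here"
}

// In the page itself (browser only): hop to the clean URL. replace() avoids
// leaving the hashbang URL in the back-button history.
if (typeof window !== "undefined") {
  const target = hashbangToPath(window.location.hash);
  if (target) window.location.replace(target);
}
```

Each clean destination page should then carry a rel="canonical" to itself, so whatever Google renders consolidates on the clean URL.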
Intermediate & Advanced SEO | Celts18