Duplicate Content Caused By Blog Filters
-
We are getting some duplicate content warnings from our blog. Canonical URLs can work for some of the pages, but most of the duplicate content is caused by blog posts appearing on more than one URL. What is the best way to fix this?
-
Thanks, Ryan! I'll take this to our development team. Have a good weekend.
-
Your CMS will have a database, and all links to your articles will be contained within it.
The idea is to add an extra field to the database to separate the actual URL from the public URL. Each actual URL would then be 301-redirected to its public URL.
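A minimal sketch of that idea, assuming a simple dict in place of the CMS database (all URLs below are hypothetical; a real CMS would read the mapping from the extra database field):

```python
# Hypothetical mapping from "actual" (internal) URLs to "public" (canonical) URLs.
# In a real CMS this would come from the extra column in the articles table.
URL_MAP = {
    "/blog/2013/03/my-post": "/blog/my-post",
    "/category/news/my-post": "/blog/my-post",
}

def resolve(path):
    """Return a (status, location) pair: 301 to the public URL when the
    requested path is an alternate version, else serve the path directly."""
    public = URL_MAP.get(path)
    if public and public != path:
        return 301, public
    return 200, path
```

With this in place, every alternate version of a post resolves to one public URL, so search engines only ever see a single address per article.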
-
Ryan,
Thanks for the reply.
We are using Marketpath CMS, and it doesn't support plug-ins like that.
Maybe I am confused about how you're able to set 301s on list tags, though. Since each tag or category's content is generated from the most recent entries that match that specific filter, you can't set a 301 to one specific place. So how does your plug-in accomplish that?
-
Which CMS are you using?
The preferred approach is to offer each blog article at a single URL only. This may be possible natively in your CMS.
Otherwise, you need to 301 redirect all alternate versions of the URL to the primary version.
Most CMSs offer a means to achieve this, often via an extension. I use Joomla as my preferred CMS, and AceSEF and SH404SEF are two highly rated extensions that offer this functionality.
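The normalization step behind such a redirect can be sketched like this (a rough illustration with made-up parameter names, not how AceSEF or SH404SEF actually work internally):

```python
from urllib.parse import urlsplit, urlunsplit

# Query parameters that only filter or sort the same content;
# these names are hypothetical examples.
FILTER_PARAMS = {"filter", "sort", "page"}

def primary_url(url):
    """Strip filter-style query parameters and lowercase the path so every
    alternate version collapses to one primary URL (the 301 target)."""
    parts = urlsplit(url)
    query = "&".join(
        p for p in parts.query.split("&")
        if p and p.split("=")[0] not in FILTER_PARAMS
    )
    return urlunsplit((parts.scheme, parts.netloc, parts.path.lower(), query, ""))
```

If the computed primary URL differs from the requested one, the server issues a 301 to it; otherwise the page is served normally.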
-
Thanks for the quick response. Unfortunately, we aren't using WordPress for our blog, so installing that plug-in isn't really an option. Any idea how the plug-in actually handles the tag, category, and date filters so that they end up with the correct canonical link?
-
Hey TJ,
I found that http://yoast.com/wordpress/seo/ is by far the best plug-in for fixing canonical URL issues. I know it sure fixed all of mine.
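One common approach such plug-ins take (a hedged sketch, not Yoast's actual implementation; the function names and site URL are hypothetical) is to emit a `<link rel="canonical">` element on every page: posts canonicalize to their own permalink, while tag, category, and date archives canonicalize to themselves, so the archive pages listing a post never compete with the post itself:

```python
def canonical_tag(primary_url):
    """Build the canonical link element for a page's primary URL."""
    return '<link rel="canonical" href="%s" />' % primary_url

def head_for(page_type, slug, site="http://www.example.com"):
    # Posts canonicalize to their own permalink; tag/category/date
    # archives canonicalize to the archive's own base URL.
    if page_type == "post":
        return canonical_tag("%s/blog/%s" % (site, slug))
    return canonical_tag("%s/%s/%s" % (site, page_type, slug))
```

Because the canonical is computed per page type rather than hard-coded, it works even for filter pages whose content changes as new entries are published.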