Is this duplicate content?
-
My client has several articles and pages that have 2 different URLs
For example:
is the same article as:
I was not sure if this was duplicate content or not...
Or if I should be putting "/article.cfm" into the robots.txt file or not...
If anyone could help me out, that would be awesome!
Thanks
-
Agreed - although I think a 301-redirect or canonical tag implementation would probably be OK. If there's a database lookup that can translate the DocID into a URL string, the canonical is easy (I write some CF code, so I can at least tell you it's doable). Keep in mind that "article.cfm" is only one template, so if you can find a solution that's data-driven, it's just as easy for 1,000 pages as it is for 10.
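A minimal sketch of that data-driven canonical in CFML - note the datasource, table, and column names (myDSN, articles, friendlyUrl) are placeholders, since I don't know your actual schema:

```cfml
<!--- In article.cfm's <head>: look up the friendly URL for this DocID. --->
<!--- Datasource/table/column names are hypothetical - match to your schema. --->
<cfquery name="qSlug" datasource="myDSN">
    SELECT friendlyUrl
    FROM articles
    WHERE docID = <cfqueryparam value="#url.intDocID#" cfsqltype="cf_sql_integer">
</cfquery>

<cfif qSlug.recordCount>
    <cfoutput>
        <link rel="canonical" href="https://www.example.com/#qSlug.friendlyUrl#" />
    </cfoutput>
</cfif>
```

Because article.cfm is a single template, one block like this covers every article on the site, whether that's 10 pages or 1,000.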
You could also create a dynamic 301-redirect via <cfheader> - the core logic is the same. Basically, you look up the URL from the DocID and dynamically create the tag. You just need someone who understands your CMS and data. The actual code is only a few lines, but understanding your setup is the time-consuming part.
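As a sketch (again with placeholder datasource, table, and column names), the dynamic 301 could look like this at the top of article.cfm, before any output is sent:

```cfml
<!--- 301 to the friendly URL whenever the page is requested by DocID. --->
<!--- Hypothetical schema - adjust names to your database. --->
<cfif structKeyExists(url, "intDocID")>
    <cfquery name="qSlug" datasource="myDSN">
        SELECT friendlyUrl
        FROM articles
        WHERE docID = <cfqueryparam value="#url.intDocID#" cfsqltype="cf_sql_integer">
    </cfquery>
    <cfif qSlug.recordCount>
        <cfheader statuscode="301" statustext="Moved Permanently">
        <cfheader name="Location" value="https://www.example.com/#qSlug.friendlyUrl#">
        <cfabort>
    </cfif>
</cfif>
```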
-
ATMOS, those are just the same page, so a canonical tag should do it. But since you also want to stop Google indexing the duplicate, you could detect that the page was called as article.cfm and add a noindex meta tag too - just not when it's served via the friendly URL.
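A sketch of that conditional noindex - this assumes the friendly URLs reach the template without exposing intDocID in the url scope, which may not hold depending on how the rewriting is set up:

```cfml
<!--- In the <head> of article.cfm: noindex only when the page was --->
<!--- reached via the query-string URL, not the friendly one. --->
<cfif structKeyExists(url, "intDocID")>
    <meta name="robots" content="noindex, follow">
</cfif>
```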
-
I mostly agree with kchan.
- It is considered duplicate content.
- Simplest way is to do rel canonical for the pages with ids.
However, I suspect a 301 redirect is not the best way here. In particular, if your website is using Omniture and/or Google Analytics code, you might get miscalculated traffic through them because of the 301 redirect.
Be careful if you choose the last route.
-
Awesome Chan, thanks. That was my thought as well. The most difficult part will be determining how to get that script in place.
-
Any chance you can spend a little time writing it out?
My guess is that we should be doing a rel canonical tag on all the article.cfm?intDocID=22572 type pages, which would then direct the bots to our /bc-blazes-construction-trail. But what's the easiest way to do that across the whole site?
-
Hello,
It sure is duplicate content. Putting "/article.cfm" into robots.txt won't work, because that only blocks crawling of the whole template - it doesn't consolidate the duplicate URLs. You need to do a permanent redirect instead. I had a brief look at the site and it seems like there are over 1,000 pages. This might take a while, but it is necessary; if not, your client's rankings will not perform and the site will most likely be penalised. A simple way would be adding a canonical link to /article.cfm?intDocID=22572 so you are showing Google the main article is located at /bc-blazes-construction-trail.
However, the best way would be doing a 301 permanent redirect, of course. I'm sure you could get a web dev to write a script to run through the database and output the redirects automatically, rather than manually redirecting 1,000+ pages. If it can't be done in-house, you could outsource it on freelancer.com for around $200-300.
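As a rough sketch of such a script in CFML (the table and column names are guesses and would need to match the real schema), something like this could dump a redirect map for every article in one pass:

```cfml
<!--- One-off script: walk the articles table and write out a redirect --->
<!--- map (old query-string URL -> friendly URL). Hypothetical schema. --->
<cfquery name="qAll" datasource="myDSN">
    SELECT docID, friendlyUrl FROM articles
</cfquery>

<cfsavecontent variable="redirectMap"><cfoutput query="qAll">/article.cfm?intDocID=#docID# -> /#friendlyUrl#
</cfoutput></cfsavecontent>

<cffile action="write" file="#expandPath('./redirects.txt')#" output="#redirectMap#">
```

That said, since one template serves every article, a runtime redirect inside article.cfm itself (as described earlier in the thread) is usually simpler than maintaining a 1,000-line redirect map.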
Thanks
-
That would definitely be considered duplicate content. There are a few things you can do to fix it, but rather than wasting a bunch of time writing it out here I would recommend visiting the link below for more detailed info:
Related Questions
-
Same site serving multiple countries and duplicated content
Hello! Though I browse Moz resources every day, I've decided to ask you a question directly, despite the numerous questions (and answers!) about this topic, as there are a few specific variants each time: I have a site serving content (and products) to different countries, built using subfolders (one subfolder per country). Basically, it looks like this:
site.com/us/
site.com/gb/
site.com/fr/
site.com/it/
etc. The first problem was fairly easy to solve: avoid duplicate content issues across the board, considering that both the ecommerce part of the site and the blog are replicated in each subfolder in its own language. Correct me if I'm wrong, but using our copywriters to translate the content and adding the right hreflang tags should do. But then comes the second problem: how to deal with duplicated content when it's written in the same language, e.g. /us/, /gb/, /au/ and so on?
Given the following requirements/constraints, I can't see any positive resolution to this issue:
1. The structure needs to be maintained (it's not possible to consolidate the same language within one single subfolder, for example),
2. Articles can't be canonicalized from one subfolder to another, as it would mess up our internal tracking tools,
3. The amount of content being published prevents us from producing bespoke content for each region of the world with the same spoken language. Given those constraints, I can't see a way to sort it out, and it seems that I'm cursed to live with those duplicate content red flags right up my nose.
Am I right, or can you think of anything to sort that out? Many thanks,
Ghill
Intermediate & Advanced SEO | | GhillC
-
Complicated Duplicate Content Question...but it's fun, so please help.
Quick background: I have a page that is absolutely terrible, but it has links and it's a category page, so it ranks. I have a landing page which is significantly - a bizillion times - better, but it is omitted in the search results for the most important query we need. I'm considering switching the content of the two pages, but I have no idea what that will do; I'm not sure if it will cause duplicate content issues or what will happen. Here are the two URLs: Terrible page that ranks (not well, but it's what comes up eventually): https://kemprugegreen.com/personal-injury/ Far better page that keeps getting omitted: https://kemprugegreen.com/location/tampa/tampa-personal-injury-attorney/ Any suggestions (other than just waiting on Google to stop omitting the page, because that's just not going to happen) would be greatly appreciated. Thanks, Ruben
Intermediate & Advanced SEO | | KempRugeLawGroup
-
Duplicate Content For Product Alternative listing
Hi, I have a tricky one here. Cloudswave is a directory of products, and we are launching new pages called "Alternatives to Product X". This page displays 10 products that are an alternative to product X (Page A). Let's say now you want the alternatives to a similar product within the same industry, product Y (Page B): you will have 10 product alternatives, but this page will be almost identical to Page A, as the products are similar and in the same industry. Maybe only one or two products will differ between the two listings. Now, even if the SEO tags are different, aren't those two pages considered duplicate content? What are your suggestions to avoid this problem? Thank you, guys
Intermediate & Advanced SEO | | RSedrati
-
Can I duplicate my website's content on an eBay store?
Our company is setting up a store on eBay. Is it okay to duplicate our content descriptions on our eBay store with a link going back to our website? Or would this potentially hurt us in search?
Intermediate & Advanced SEO | | hfranz
-
How to Avoid Duplicate Content Issues with Google?
We have 1000s of audio book titles at our Web store. Google's Panda de-valued our site some time ago because, I believe, of duplicate content. We get our descriptions from the publishers, which means a good deal of our description pages are the same as the publishers' = duplicate content according to Google. Although re-writing each description of the products we offer is a daunting, almost impossible task, I am thinking of re-writing publishers' descriptions using The Best Spinner software, which allows me to replace some of the publishers' words with synonyms. I have re-written one audio book title's description, resulting in 8% unique content from the original in 520 words. I did a CopyScape check and it reported "65 duplicates." CopyScape appears to be reporting duplicates of words and phrases within sentences and paragraphs. I see very little duplicate content of full sentences or paragraphs. Does anyone know whether Google's duplicate content algorithm is the same as or similar to CopyScape's? How much of an audio book's description would I have to change to stay away from CopyScape's duplicate content algorithm? How much would I have to change to stay away from Google's?
Intermediate & Advanced SEO | | lbohen
-
Duplicate content throughout multiple URLs dilemma
We have a website with lots of categories, and the problem is that some subcategories have identical content on them. So, is it enough to just add different text to those problematic subcategories, or do we need to use a "canonical" tag pointing to the main category? The same dilemma applies to our search system and its duplicate content. For example, the "/category/sports" URL would have similar-to-identical content to the "/search/sports" and "/search/sports-fitness/" URLs. Rankings are important for all the different categories and subcategories, and for individual search keywords as well. So, the question is: how do we make these pages unique/different enough to rank well across all of them? Would love to hear advice on how this can be solved using different methods and how each would affect our rankings - in particular, when we actually need to use the "canonical" tag and when a 301 redirect is better. Thanks!
Intermediate & Advanced SEO | | versliukai
-
Do you bother cleaning duplicate content from Googles Index?
Hi, I'm in the process of instructing developers to stop producing duplicate content; however, a lot of duplicate content is already in Google's index, and I'm wondering if I should bother getting it removed... I'd appreciate it if you could let me know what you'd do. For example, one 'type' of page is being crawled thousands of times, but it only has 7 instances in the index, which don't rank for anything. For this example I'm thinking of just stopping Google from accessing that page 'type'. Do you think this is right? Do you normally meta noindex,follow the page, wait for the pages to be removed from Google's index, and then stop the duplicate content from being crawled? Or do you just stop the pages from being crawled and let Google sort out its own index in its own time? Thanks FashionLux
Intermediate & Advanced SEO | | FashionLux
-
How to deal with category browsing and duplicate content
On an ecommerce site there are typically a lot of pages that may appear to be duplicates due to category browse results, where the only difference may be the sorting by price or the number of products per page. How best to deal with this? Add nofollow to the sorting links? Set canonical values that ignore these variables? Set canonical values that match the category home page? Is this even a possible problem with Panda or spiders in general?
Intermediate & Advanced SEO | | IanTheScot