How can I make it so that the various iterations (pages) do not come up as duplicate content?
-
Hello,
I wondered if somebody could give me some advice.
I have a problem with various iterations of the calendar page coming up as duplicate content.
There is a large calendar on my site for events, and each time the page is viewed it is seen as duplicate content. How can I make it so that the various iterations (pages) do not come up as duplicate content?
Regards
-
Anthony
There are a few ways to do this, and it depends a little on the specifics of how the site is set up, but the best (and easiest) may be to use the URL parameter settings in Webmaster Tools.
Log into Webmaster Tools, go to Configuration -> URL Parameters, and set it to NOT index anything beyond the ? (this may be parameters 2 and beyond, or it may be all of them).
How is this set up though? Does the calendar "start" and "end" somewhere or is it going off infinitely in either direction?
The robots.txt file won't keep the page out of the index.
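To illustrate that point (this snippet is my own example, not from the site in question): a page can only be removed from the index if crawlers can still reach it and find a noindex meta tag in its head, so the page must not also be blocked in robots.txt.

```html
<!-- Sketch only: place in the <head> of each calendar page to be dropped from the index. -->
<!-- The page must remain crawlable (not disallowed in robots.txt), or this tag is never seen. -->
<meta name="robots" content="noindex, follow">
```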
If you can let us know some more specifics that would be great!
-Dan
-
No, no, no. You don't add these to calendar pages. You need to place a file called robots.txt in your root folder.
Read more here: http://www.seomoz.org/learn-seo/robotstxt
-
Gamer07
Thanks for this, it is very helpful. I'll try it out.
Just to clarify: do I need to add this to every calendar page which comes up as a duplicate? If so, is there a quick way of doing it, as there are (I kid you not) over 5,000 calendar pages coming up as duplicates? Obviously I'd prefer not to go through all of those manually.
Going forwards, I take it that I add this to every calendar page. Is there a proactive way of stopping it in the first place, or is it just a case of remedying it (with this code) once it occurs? Ideally I'd like to stop it occurring again in the future, as opposed to remedying it again and again.
Many thanks
-
Hi,
When you change the page, check the URL; you will see parameters, such as calendar?week=18.
Open your robots.txt file and add this line:
Disallow: /calendar?
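For reference, that directive sits inside a User-agent block; a minimal robots.txt (a sketch only, using the calendar path from the example URL above) would look like:

```
# robots.txt in the site root folder
# Block crawling of every calendar URL that carries query parameters
User-agent: *
Disallow: /calendar?
```

Note this blocks crawling of the parameterised calendar URLs, which stops crawlers fetching the 5,000+ duplicate variations; as pointed out above, it does not by itself remove pages already in the index.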