Duplicate content with same URL?
-
SEOmoz is saying that I have duplicate content on:
The only difference I see in the URL is that the "content.asp" is capitalized in the second URL.
Should I be worried about this or is this an issue with the SEOmoz crawl?
Thanks for any help.
Mike
-
I am not using a rewrite rule yet -- I was asking if there is one that would resolve this issue.
-
Are you specifying the URL rewrite rule at the page level, or in your .htaccess? I had a similar issue once on a WordPress Multisite install that was rewriting
example.com/site2 -> site2.com
And:
example.com/site3 -> site3.com
The issue wasn't "real" in that users' browsers were moving to the preferred URLs specified in the HTTP headers, but our crawl tests were a nightmare of non-existent files, much like yours. Rel="canonical" will help in that case to avoid penalties, but won't do any favors for PageRank or indexation. I believe our developers created some additional page-level rewrites to deal with the phantom pages created in the crawl, but alas, I'm not sure of the details.
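That kind of subdirectory-to-domain mapping can be sketched roughly like this (hypothetical domains, assuming Apache with mod_rewrite; the actual Multisite install used its own generated rules):

```apache
RewriteEngine On
# Redirect example.com/site2/... to the standalone domain site2.com/...
RewriteRule ^site2/(.*)$ http://site2.com/$1 [R=301,L]
# Likewise for site3
RewriteRule ^site3/(.*)$ http://site3.com/$1 [R=301,L]
```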
You might post in a new thread or reach out to Chris Abernethy directly, he's far savvier with PHP than I am.
-
I have a similar problem, and I couldn't see a solution on the site that your link refers to. Maybe you can help?
In both SEOmoz reports and GWT I get duplicate meta descriptions and/or duplicate title tags on pages that do not physically (or logically) exist. I'm not talking about dynamically generated URLs. What I see is for a given page, several other appended pages that have no relationship to the first, like this:
/realpage1.php/anotherrealpage1.html
/realpage1.php/adifferentrealpage2.html
/realpage1.php/anotherrealpage3.php
/realpage1.php/directory/realpage4.html
Perhaps related to this issue, I discovered that if a trailing slash is entered after any URL typed into the browser (other than the home page), our custom 404 page appears, but with no CSS styling or active JavaScript.
I have been wondering if a rewrite rule that eliminates trailing slashes would work, but then it would never display a sub-directory's default index page, right?
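Not necessarily — a rewrite rule can strip trailing slashes only when the request is not a real directory, so index pages keep working. A minimal sketch (assuming Apache with mod_rewrite enabled in your .htaccess):

```apache
RewriteEngine On
# Only strip the trailing slash when the request is NOT a real directory,
# so /subdirectory/ still serves its default index page as usual.
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]
```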
I've searched all over for some help with this, to no avail. Any help will be much appreciated.
-
Modern search engines won't penalize you for this, but you may lose link juice if your content lives at multiple URLs and each is receiving links. Best practice is to set up a few simple Apache mod_rewrite rules in your .htaccess for the basic URL display issues (enforce a trailing slash, redirect to or away from www, etc.), as well as to declare your preferred URL in the HTML of each page using the handy rel="canonical" link element.
Here's a great tutorial on how to force lower-case URLs written by a fellow Mozzer (props, Chris! It's how I learned...), and here are 10 other useful mod_rewrites to add to your repertoire.
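For the capitalization issue specifically, one common approach (a sketch of the general technique, not the exact rules from that tutorial) is to define a lowercasing RewriteMap and 301-redirect any URL containing uppercase letters. Note that RewriteMap must be declared in the main server config or virtual host, not in .htaccess:

```apache
# In httpd.conf or the virtual host (RewriteMap is not allowed in .htaccess):
RewriteMap lc int:tolower

# Then, where the map is visible:
RewriteEngine On
# If the requested path contains any uppercase letter...
RewriteCond %{REQUEST_URI} [A-Z]
# ...301-redirect to the lowercased version of the same path.
RewriteRule (.*) ${lc:$1} [R=301,L]
```

With this in place, a request for /Content.asp would be permanently redirected to /content.asp, collapsing the duplicates to a single URL.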
-
You sir are a gentleman and a scholar.
Thanks for your help Matt.
-
Use canonicalization to resolve this common duplicate content issue.
You need to place the canonical tag pointing to your preferred URL.
See this SEOmoz guide on how to do it:
http://www.seomoz.org/learn-seo/duplicate-content
The Rel="canonical" section actually uses the example of capitalization causing one page to appear as three different pages to search engines...
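For example (a minimal sketch — the domain and path here are hypothetical, not taken from the thread), the tag goes in the head of every variant of the page, all pointing at the single preferred URL:

```html
<!-- Place this in the <head> of BOTH /content.asp and /Content.asp, -->
<!-- so search engines consolidate them into the one preferred URL: -->
<link rel="canonical" href="http://www.example.com/content.asp" />
```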
Hope this helps!