Duplicate content warning: same page but different URLs?
-
Hi guys, a friend of mine has a site, and when I tested it with Moz I noticed 80 duplicate content warnings. For instance:
Page 1 is http://yourdigitalfile.com/signing-documents.html
the warning page is http://www.yourdigitalfile.com/signing-documents.html
another example
Page 1 http://www.yourdigitalfile.com/
same second page http://yourdigitalfile.com
I noticed that nearly every page on the site has another version at a different URL. Any idea why the dev would do this? Also, the pages that received the warnings are not redirected to the newer pages; you can go to either one.
thanks very much
-
Thanks Tim. Do you have any examples of what those problems might be? With such a large catalog, managing those rel canonical tags will be difficult (I don't even know if the store allows them; it's a hosted store solution and little code customization is allowed).
-
Hi there AspenFasteners, in this instance, rather than an .htaccess rule, I would suggest applying a rel canonical tag which points to the page you deem the original master source.
Using robots.txt to try to hide things could potentially cause you more issues, as your categories may struggle to be indexed correctly.
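As an illustration, using the URLs from the original question, the canonical tag sits in the `<head>` of the duplicate page and points at the version you want indexed:

```html
<!-- Placed in the <head> of the duplicate page, e.g. http://yourdigitalfile.com/signing-documents.html -->
<!-- Tells search engines that the www version is the master copy to index -->
<link rel="canonical" href="http://www.yourdigitalfile.com/signing-documents.html" />
```

The same pattern applies to category pages reachable by multiple navigation paths: every variant carries a canonical pointing at the one master URL.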
-
We have a similar problem, but much more complex to handle as we have a massive catalog of 80,000 products and growing.
The problem occurs legitimately because our catalog is so large that we offer different navigation paths to the same content.
http://www.aspenfasteners.com/Self-Tapping-Sheet-Metal-s/8314.htm
http://www.aspenfasteners.com/Self-Tapping-Sheet-Metal-s/8315.htm
(If you look at the "You are here" breadcrumb trail, you will see the subtle differences in the navigation paths, with 8314.htm, the user went through Home > Screws, with 8315.htm, via Home > Security Fasteners > Screws).
Our hosted web store does not offer us htaccess, so I am thinking of excluding the redundant navigation points via robots.txt.
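A minimal sketch of what that robots.txt exclusion might look like, using the two example category URLs above (which path counts as "redundant" is the site owner's call):

```
# robots.txt - block the duplicate navigation path (8315)
# while leaving the master path (8314) crawlable
User-agent: *
Disallow: /Self-Tapping-Sheet-Metal-s/8315.htm
```

Note that Disallow only stops crawling; it does not consolidate ranking signals the way a canonical tag does, which is relevant to the question below.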
My question: is there any reason NOT to do this?
-
Oh ok
The only reason I was thinking it's duplicate content is the warnings I got on the Moz crawl; see below.
75 Duplicate Page Content
6 4xx Client Error
5 Duplicate Page Title
44 Missing Meta Description Tag
5 Title Element is Too Short
I have found over 80 typos, grammatical errors, punctuation errors, and pieces of incorrect information, which led me to believe the quality of the work and their attention to detail was rather poor; that's why I thought this was a possibility.
Thanks again for your time, it's really appreciated.
-
I wouldn't say that they have created two pages; it's just that because you have two versions of the domain and haven't set a preferred version, the site is getting indexed twice. .htaccess changes are under the hood of the website, and this could simply have been an oversight.
-
Hey Tim
Thanks for your answer. It's really weird. Other than laziness on the dev's part in not removing old or previous versions of pages, have you any idea why they would create multiple versions of the same page with different URLs? Is there any legit reason, like one serves mobile or something?
Just wondering, thanks for replying.
-
OK, so in this instance the only issue you have is that you need to choose your preferred start point: www or non-www.
I would add a bit of code to your .htaccess file to point to your preferred choice. I personally prefer a www domain. Something like the below would work.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
As your site is already indexed, I would also, for the time being and as more of a safety measure, add canonical tags to the pages that point to the www version of your site.
Also, if you have a Google Search Console account, you can select your preferred domain prefix in there; this will again help with your indexation.
Hopefully I have covered most things.
Cheers
Tim