Duplicate content warning: same page but different URLs?
-
Hi guys, a friend of mine has a site, and when I tested it with Moz I noticed around 80 duplicate content warnings. For instance:
Page 1 is http://yourdigitalfile.com/signing-documents.html
The page flagged in the warning is http://www.yourdigitalfile.com/signing-documents.html
Another example:
Page 1 is http://www.yourdigitalfile.com/
The same page again is http://yourdigitalfile.com
I noticed that nearly every page on the website has another version at a different URL. Any ideas why the dev would do this? Also, the pages that received the warnings are not redirected to the newer pages; you can go to either one.
Thanks very much
-
Thanks Tim. Do you have any examples of what those problems might be? With such a large catalog, managing those rel=canonical tags will be difficult (I don't even know if the store allows them; it's a hosted store solution, and little code customization is allowed).
-
Hi there AspenFasteners. In this instance, rather than an .htaccess rule, I would suggest applying a rel=canonical tag that points to the page you deem the original master source (a sketch follows below).
Using robots.txt to try to hide things could cause you more issues, as your categories may struggle to be indexed correctly.
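For instance, assuming 8314.htm is the page you treat as the master (that choice is yours to make), each duplicate such as 8315.htm would carry a tag like this in its <head>:

<link rel="canonical" href="http://www.aspenfasteners.com/Self-Tapping-Sheet-Metal-s/8314.htm" />

Both navigation paths stay usable for visitors, but the search engines are told to consolidate their signals onto the one URL.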
-
We have a similar problem, but one much more complex to handle, as we have a massive catalog of 80,000 products and growing.
The problem occurs legitimately because our catalog is so large that we offer different navigation paths to the same content.
http://www.aspenfasteners.com/Self-Tapping-Sheet-Metal-s/8314.htm
http://www.aspenfasteners.com/Self-Tapping-Sheet-Metal-s/8315.htm
(If you look at the "You are here" breadcrumb trail, you will see the subtle differences in the navigation paths: with 8314.htm, the user went through Home > Screws; with 8315.htm, via Home > Security Fasteners > Screws.)
Our hosted web store does not offer us .htaccess access, so I am thinking of excluding the redundant navigation points via robots.txt, along the lines of the sketch below.
My question: is there any reason NOT to do this?
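To be concrete, what I have in mind is something along these lines (illustrative only; 8315.htm stands in for whichever duplicate navigation path we would retire):

User-agent: *
Disallow: /Self-Tapping-Sheet-Metal-s/8315.htm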
-
Oh OK.
The only reason I was thinking it is duplicate content is the warnings I got on the Moz crawl; see below.
75 Duplicate Page Content
6 4xx Client Error
5 Duplicate Page Title
44 Missing Meta Description Tag
5 Title Element is Too Short
I have found over 80 typos, grammatical errors, punctuation errors, and pieces of incorrect information, which led me to believe the quality of the work and the devs' attention to detail were rather poor; that is why I thought this was a possibility.
Thanks again for your time.
It's really appreciated.
-
I wouldn't say that they have created two pages; it's just that because you have two versions of the domain and haven't set a preferred version, everything is getting indexed twice. .htaccess changes are under the hood of the website, and this could simply have been an oversight.
-
Hey Tim
Thanks for your answer. It's really weird. Other than laziness on the devs' part in not removing old or previous versions of pages, have you any idea why they would create multiple versions of the same page with different URLs? Is there any legitimate reason, like one serves mobile or something?
Just wondering.
Thanks for replying.
-
OK, so in this instance the only issue you have is that you need to choose your preferred start point: www or non-www.
I would add a bit of code to your .htaccess file to point to your preferred choice. I personally prefer a www. domain. Something like the below would work.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]

As your site is already indexed, I would also, for the time being and more as a safety measure, add canonical tags to the pages that point to the www. version of your site.
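For example, on the non-www duplicate from your first post, the tag would look something like this (a sketch reusing the URL you mentioned):

<link rel="canonical" href="http://www.yourdigitalfile.com/signing-documents.html" />

Placed in the <head> of both versions of the page and pointing at the www URL, it tells the engines which copy to index while the 301s take effect.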
Also, if you have a Google Search Console account, you can select your preferred domain version in there; this will again help with your indexation.
Hopefully I have covered most things.
Cheers
Tim