SEOMoz Crawl Diagnostic indicates duplicate page content for home page?
-
My first SEOMoz Crawl Diagnostic report for my website indicates duplicate page content for my home page. It lists the home page's Page Title and URL twice.
How do I go about diagnosing this?
Is the problem related to the following code that is in my .htaccess file? (The purpose of the code was to redirect any non-"www" backlink referrals to the "www" version of the domain.)
RewriteCond %{HTTP_HOST} ^whatever.com [NC]
RewriteRule ^(.*)$ http://www.whatever.com/$1 [L,R=301]
Should I get rid of the "http" reference in the second line?
Related to this is a notice in the "Crawl Notices Found" section -- "301 Permanent redirect" -- which shows my home page title as "http://whatever.com" and the redirect address as http://http://www.whatever.com/
I'm guessing this problem is again related to the redirect code I'm using.
Also...
The report indicates duplicate content for links that have different parameters added to the URL, e.g. http://www.whatever.com?marker=Blah Blah&markerzoom=13
If I set up a canonical reference for the page, will that fix this?
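By "canonical reference" I mean something like the following in the page's <head> -- a minimal sketch, with http://www.whatever.com/ standing in as a placeholder for whichever version of the page should be indexed:
<!-- Point crawlers at the one canonical URL so parameterized variants are consolidated -->
<link rel="canonical" href="http://www.whatever.com/" />
As I understand it, crawlers that honor rel="canonical" should then treat URLs like the ?marker=... example above as copies of that one page.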
Thank you.
-
I contacted the help desk as instructed and was told:
"I took a look at the campaign and it looks like our crawler can't parse the 301 redirect you have in place on the main page. The reason for this is the redirect in place, adds two https when rogerbot tries to crawl through it. Roger can’t parse the redirect as is, but it can identify it (as it did in your notice’s report on the crawl diagnostics page). This isn't a problem for browsers since they are made to ignore redirects of this nature. Crawlers on the other hand have a strict code to follow and can't follow redirects like that. When I load up your site [mywebsite.com] right now, it redirects to www.[mywebsite].com. Try creating a new campaign under the domain you are redirecting to, this should clear any issues up."
And so I did that, and it looks like it worked after the new crawl. However, I then set up another campaign for another website I manage, being sure to use the "www" in front of the domain, and got the same problem again -- the home page appears twice as duplicate content.
So I'm back to my primary question: what is the definitive redirect code to use to convert a non-"www" request to a "www" request? The same redirect code mentioned in my first post is being used on all of my sites.
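For reference, the pattern I've usually seen recommended for this looks like the following -- a sketch, assuming Apache with mod_rewrite enabled, and with whatever.com as a placeholder:
RewriteEngine On
# Match only when the Host header is exactly whatever.com (dot escaped, end anchored), case-insensitively
RewriteCond %{HTTP_HOST} ^whatever\.com$ [NC]
# 301-redirect to the same path on the www host; %{REQUEST_URI} already includes the leading slash,
# and mod_rewrite re-appends the original query string automatically
RewriteRule ^ http://www.whatever.com%{REQUEST_URI} [L,R=301]
The differences from my rule are the RewriteEngine On line, the escaped dot and "$" anchor in the RewriteCond, and %{REQUEST_URI} in place of the "$1" capture; whether any of that matters to the crawler, I can't say.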
-
Hi Perry,
In your browser, are you seeing things redirect to the double http? I think there's a bug in the crawl tools that's causing some false errors right now. Before you go work on the redirect file, could you send an email to [email protected] to first make sure we're not the ones that messed up?
Thanks!
Related Questions
-
Duplicate Content Showing up on Moz Crawl | www. vs. no-www.
Hello Moz Community! I am new to SEO and Moz, and this is my first question. My question: I have a client that is getting flagged for Duplicate Content. He is getting flagged for having two domains that have the same content, i.e. www.mysite.com & mysite.com. I read into this and set up a 301 redirect through my hosting site. I evaluated which site had a stronger Page Authority and had the weaker site redirect to the stronger site. However, I am still getting hit for duplicate pages caused by www.mysite.com & mysite.com being duplicates. How should I go about resolving this? Is this a case where a Canonical tag is needed in the head of the HTML? Any direction is appreciated. Thank You. B/R Will H.
Technical SEO | MarketingChimp100
-
Can Page Content & Description Have Same Content?
I'm studying my crawl report and there are several warnings regarding missing meta descriptions. My website is built in WordPress and part of the site is a blog. Several of these missing description warnings concern blog posts, and I was wondering if I can copy the first few lines of content of each of the posts to put in the meta description, or would that be considered duplicate content? Also, there are a few warnings that relate to blog index pages, e.g. http://www.iainmoran.com/2013/02/ - I don't know if I can even add a description to these, as I think they are dynamically created. While on the subject of duplicate content, if I had a sidebar with the same information on several of the pages, with the content coming from a WP Widget, would this still be considered duplicate content, and would Google penalise me for it? I would really appreciate some thoughts on this, please. Thanks, Iain.
Technical SEO | iainmoran0
-
Why do they rank the home page?
We are trying to rank for the keyword Motorcycle Parts. We have moved up to page 2 over the past couple of months; however, Google is ranking our home page, not our http://www.rockymountainatvmc.com/s/49/61/Motorcycle-Parts page, which is for motorcycle parts. We are working on internal linking to help send the right signals too. Any other thoughts? (We have new content written to put in as well; we just have to wait for an issue to be fixed before we can put it in.)
Technical SEO | DoRM0
-
Issue: Duplicate Page Content
Hi All, I am getting warnings about duplicate page content. The pages are normally 'tag' pages. I have some blog posts tagged with multiple 'tags'. Does it really affect my site? I am using WordPress and the Yoast SEO plugin. Thanks
Technical SEO | KLLC0
-
Shopify duplicate content issue
We recently moved our site to Shopify, but now have a duplicate content issue, as we have the same products in different collections. I have added canonical code to get rid of this, but my Webmaster Tools still shows hundreds of duplicate pages. How can I tell if the code I added is working? How long will it take for Google to recognise this and drop the duplicates from its index, and is this likely to have a significant impact on SERPs? Our web page is www.devoted2vintage.co.uk. Thanks Paul
Technical SEO | devoted2vintage1
-
SEOMoz Crawling Errors
I recently implemented a blog using WordPress on our website. I didn't use WordPress as the CMS for the rest of our site, just the blog portion. So, as an example, I installed WordPress in http://www.mysite/blog/, not in the root. My error report in SEOMoz went from 0 to 22e. The Moz bot or crawler that SEOMoz uses is reporting a ton of 4xx errors for strange links that shouldn't exist anywhere on the site. Example: Good link - http://www.mysite/products.html Bad link reported by SEOMoz - http://www.mysite/blog/my-first-post/products.html I've also noticed that my page speed has become much slower, as reported by Google. Does anybody know what could be happening here? I know that typically it's better to install WordPress in the root and use it to control the entire site, but I was under the gun to get a blog out. Thanks
Technical SEO | TRICORSystems0
-
SEOMoz report showing duplicate content pages with/without ending /
Hello, the SEOMoz report is showing that I have a lot of duplicate content, and it proceeds to list almost every page on my site twice: with a URL ending in "/" and without. I checked my sitemap and only one version is there, the one with "/". I have a WordPress site. Any recommendations? Thanks.
Technical SEO | dpaq20110
-
Duplicate Pages Issue
I noticed a problem and I was wondering if anyone knows how to fix it. I was making a sitemap for 1oxygen.com, a site that has around 50 pages. The sitemap generator came back with over 2,000 pages. Here are two of the results:
http://www.1oxygen.com/portableconcentrators/portableconcentrators/portableconcentrators/services/rentals.htm
http://www.1oxygen.com/portableconcentrators/portableconcentrators/1oxygen/portableconcentrators/portableconcentrators/portableconcentrators/oxusportableconcentrator.htm
These are actually pages somehow. In my FTP, in the first /portableconcentrators/ folder, there are about 12 HTML documents and no other folders. It looks like it is creating a page for every possible folder combination. I have no idea why those pages above actually work. Help please???
Technical SEO | chuck-layton