How do I fix twin home pages?
-
Search engine analysis is indicating that my site has twin home pages (www.mysite.com and http://mysite.com).
The error message I'm getting is: "your website resides at both www.mysite.com and mysite.com."
My uploaded index page is a .htm page (not .html). I don't know if that matters.
Can someone explain how this happened and what I can do to fix it?
Thanks!
-
Hi FinalFrontier,
I agree with setting up a 301 redirect to a single version. I also recommend doing the following:
- Set up canonical URLs to your desired version
- Ensure that your XML sitemaps use your desired version
- Add both www and non-www to Google Webmaster Tools and select one as the URL you'd like displayed in search results
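On the sitemap point: assuming the "www" version is the one you pick, every entry should use that host consistently. A minimal sketch of what that looks like (the URLs here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Every <loc> uses the same host; no mixing of www and non-www -->
  <url><loc>http://www.mysite.com/</loc></url>
  <url><loc>http://www.mysite.com/pages.htm</loc></url>
</urlset>
```

If the sitemap lists one version while the 301 redirects to the other, search engines get conflicting signals, so it's worth double-checking after the redirect goes live.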
Best of luck!
Chris
-
If you look at the redirect code the webhost provided in their instructions, I noticed there is no [NC] at the end of the RewriteCond line. I'm not sure whether that [NC] is necessary or not.
Other than that and the possible time-lag you speak of, I'm at a loss.
-
It could just be a time-lag in our data (and that wouldn't shock me), but run a header checker and make sure the 301 is working properly. For example, try this:
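(The original link to a header checker wasn't preserved here, but tools like `curl -I` show the same raw response headers.) A minimal sketch of what a header checker verifies, using a canned response — the domain and headers below are hypothetical, not from the poster's actual site:

```python
# Sketch: verify a response is a proper 301 pointing at the canonical host.
# raw_response stands in for what the server actually returns.
raw_response = (
    "HTTP/1.1 301 Moved Permanently\r\n"
    "Location: http://www.mysite.com/\r\n"
    "Content-Type: text/html\r\n"
)

lines = raw_response.split("\r\n")
# Status line looks like "HTTP/1.1 301 Moved Permanently"
status_code = int(lines[0].split()[1])
# Remaining non-empty lines are "Name: value" header pairs
headers = dict(line.split(": ", 1) for line in lines[1:] if line)

# A correct setup answers 301 (not 302) and Location names the canonical host.
assert status_code == 301
assert headers["Location"].startswith("http://www.mysite.com/")
print("301 redirect looks correct")
```

If the status comes back 200 for the non-www version, the rewrite rule isn't firing and the duplicate really does still exist server-side, regardless of what the crawler data says.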
-
Well, this isn't making any sense.
I made the following change to my .htaccess file, following the instructions given by my web host:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite.com
RewriteRule (.*) http://www.mysite.com/$1 [R=301,L]
Then I ran another SEOmoz root-domain crawl a couple of hours later and it still reported the same errors on my home page (duplicate home page content and titles).
I just checked my .htaccess file again and it did save those 301 redirect changes. So why am I still getting duplicate page errors? thx.
-
Yeah, it sounds like you're not currently having major issues. I think it's good to prevent these issues (and duplicates are a real concern), but you can ease into this one, I strongly suspect.
-
Thanks for your post.
Google is indexing all my www pages (including www.mysite.com), but (I guess this is good news?) no documents show up for the:
site:mysite.com -inurl:www
in Google.
-
Since this issue can occur site-wide, I do tend to agree with Anthony that 301 redirects are a better solution for this particular problem (although canonical tags will work, if that's your only feasible option). It is important, as implied in the comments, to make sure that your internal links are consistent and you aren't using both versions within your site (although, with "www" vs. non-www, that's pretty rare).
Practically, it depends a lot on the size of your site, whether you have links to both versions, and whether Google has indexed both versions. This is a problem in theory, but it may not currently be a problem on your site. You can check the indexed pages of the root domain and the www subdomain separately in Google with these commands:
site:mysite.com inurl:www
site:mysite.com -inurl:www
(the first pulls up anything with "www", and the second only pages without it).
If you're seeing both in play, then sorting out how to do the 301-redirects is a good bet. If you're not, then it's still a solid preventive measure, but you don't need to panic.
-
It can have a pretty major impact on search rankings. Basically what's happening is you have two identical pages for every intended page on your site. So it creates duplicate content issues.
So for example...
Someone finds something on your site that they like at www.yoursite.com/example/ and links to it from their site or shares it on Twitter, which increases the ranking power for that page.
Another person finds the same content at yoursite.com/example/ and links to it as well.
Instead of consolidating all the benefits of links to your site onto a single page, you're basically reducing your ranking potential by 50%.
-
How big of an issue is this for search engines? I'm indexed in Bing, Google, Yahoo.
I'm curious as to how big (or small) an impact this really has on a website.
thx.
-
Hi Final Frontier,
Most hosting providers will likely add this to your .htaccess file for you if you contact technical support. I know HostGator will happily provide that kind of help. If not, I'd be glad to add the lines if you'll download the file and email it to me.
-
Thanks, but I'm more confused now than ever, and I don't know how to change a .htaccess file, so I don't want to turn this into a DIY project and screw things up even more. I get the gist of what the problem is.
All my internal pages link back to www.mysite.com and to www.mysite.com/pages.htm throughout the site.
However, I noticed that in an img src for a Facebook link on my site, I am mistakenly linking to http://mysite.com/facebook (no www). So I'll at least fix that to include www so there's consistency. Not sure if that's related to the problem; I haven't seen any other pages that link to http://mysite.com instead of www.mysite.com.
I've learned a lot here, but this is one technical thing I don't want to do myself and make things worse.
-
From: http://www.seomoz.org/blog/complete-guide-to-rel-canonical-how-to-and-why-not
There is usually a better solution
The canonical tag is not a replacement for a solid site architecture that doesn’t create duplicate content in the first place. There is almost always a superior solution to the canonical tag from a pure SEO best practice perspective.
Let's go through some of the URL examples I provided above; this time, we'll talk about how to fix them without the canonical tag.
Example 1: http://www.example.com/quality-wrenches.htm
This is a duplicate version because our example website resolves with both the www version and the non-www version. If the canonical tag was used to pull the www version out of the index (keeping the non-www version as the canonical one) both versions would still resolve in the browser. With both versions still resolving, both versions can still continue to generate links.
A canonical tag, as with a 301 redirect, does not pass all of the link value from one page to another. It passes most of it, but not all. We estimate that the link value loss with either of these solutions is 1-10%. In this way, a 301 redirect and a canonical tag are the same.
I'd recommend a 301 redirect instead of a canonical tag.
Why, you ask? A 301 redirect takes the link value loss hit once. Once a 301 is in place, a user never lands on the duplicate URL version. They are redirected to the canonical version. If they decide to link to the page, they are going to provide that link to the canonical version. No link love lost. Compare that to the canonical tag solution which keeps both URLs resolving and perpetuates the link value loss.
From Rand's Article: http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
- Whereas a 301 redirect re-points all traffic (bots and human visitors), the Canonical URL tag is just for engines, meaning you can still separately track visitors to the unique URL versions.
- A 301 is a much stronger signal that multiple pages have a single, canonical source. While the engines are certainly planning to support this new tag and trust the intent of site owners, there will be limitations. Content analysis and other algorithmic metrics will be applied to ensure that a site owner hasn't mistakenly or manipulatively applied the tag, and we certainly expect to see mistaken use of the tag, resulting in the engines maintaining those separate URLs in their indices (meaning site owners would experience the same problems noted below).
- 301s carry cross-domain functionality, meaning you can redirect a page at domain1.com to domain2.com and carry over those search engine metrics. This is NOT THE CASE with the Canonical URL tag, which operates exclusively on a single root domain (it will carry over across subfolders and subdomains).
Rel Canonical is a great tool, but I have to disagree here. www.mysite.com is a sub-domain of mysite.com. Adding rel canonical tags to every page on the site would only send a signal to search engines specifying the preferred content, but adding a 301 redirect to the root domain one time will send all traffic, robots, and link juice to the preferred domain on a permanent basis.
-
Hi!
An easier way to fix the problem is with canonical tags (if you're not familiar with .htaccess or server-side scripts).
You can find Rand Fishkin's article about it here:
http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
Good luck!
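For reference, the canonical tag is a single line in each page's `<head>`. A sketch, assuming the "www" version is the one you want to keep (the URL is a placeholder):

```html
<!-- Placed in the <head> of every duplicate version of the page -->
<link rel="canonical" href="http://www.mysite.com/" />
```

Both URL versions keep resolving in the browser; the tag only tells search engines which one to index.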
-
Hi FinalFrontier,
To fix this, you'll just need to choose which version of the domain you'd like to use and then implement a 301 redirect from the domain you don't want displayed to the preferred domain.
My personal choice is the "naked domain" (no "www"). Technically speaking, www.mysite.com is a subdomain of mysite.com and you'll notice that almost every major brand advertises their site without the "www".
When's the last time you saw an Apple commercial trying to convince you to go to www.apple.com? Seen www.eharmony.com anywhere lately?
The choice, however, is up to you... the key thing is to make a decision and, whenever you link to your site from another location, stick with one version or the other.
To implement the 301 redirect, the most common method is to edit the .htaccess file in the root directory of your site. Also, many hosting control panels (like cPanel) have this functionality built in where it can simply be activated by choosing the appropriate option in your server's configuration.
For www to non-www, simply add this to your .htaccess file (replace mysite.com with your own domain):
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.mysite\.com$ [NC]
RewriteRule ^(.*)$ http://mysite.com/$1 [L,R=301]
For the opposite (non-www to www) add this:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [L,R=301]
Hope this helps!
Anthony