Should I 301 my non-www URLs to their www versions?
-
We have external links pointing to both mydomain.com and www.mydomain.com. I read this: http://www.stepforth.com/resources/web-marketing-knowledgebase/non-www-redirect/ and wondered if I should add the following to my .htaccess file so that the link juice all flows to the www version of the site:
RewriteCond %{HTTP_HOST} ^mydomain.com
RewriteRule (.*) http://www.mydomain.com/$1 [R=301,L]
Any reason not to do it?
-
Mike,
This is best practice on all sites. Forgetting about the links for a minute: you have two versions of the same page (in search engines' eyes), so the first step is to do exactly what you're asking and consolidate them into one.
The next step is to ensure your URLs consistently end with a trailing slash, because if www.url.com/page and www.url.com/page/ both resolve, you again have two different versions of the same content.
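For the trailing-slash side, a minimal sketch of the kind of .htaccess rule involved (assuming Apache with mod_rewrite, and that requests for real files such as images or PDFs should be left alone):
RewriteEngine On
# Skip real files, then append a trailing slash to any URL that lacks one
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !/$
RewriteRule (.*) http://www.mydomain.com/$1/ [R=301,L]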
DD
Related Questions
-
Weird Google indexing issues with www being forced
I'm working on a site which is really not indexing as it should. I created a sitemap.xml, which I thought would fix the issue, but it hasn't. What seems to be happening is that Google is picking the www pages as canonical for some of the site and the non-www pages for the rest, when the whole site should be without www. See the attached images for a visual explanation.
When adding pages in Google Search Console without www, some pages cannot be indexed because Google thinks the www version is canonical, and I have no idea why; there is no canonical set up at all. What I would do, if I could, is add canonical tags to each page pointing to the non-www version, but the CMS does not allow for canonicals. I'm not quite sure how to proceed or how to tell Google that the non-www version is in fact correct, and I have no idea why it's assuming www is canonical either.
Technical SEO | Donsimong -
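A minimal sketch of the server-level fix being described (assuming Apache with mod_rewrite; example.com is a placeholder for the real domain). A sitewide 301 from the www host to the bare host tells Google which version to treat as canonical even when the CMS cannot output canonical tags:
RewriteEngine On
# Send every request arriving on the www hostname to the non-www hostname
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule (.*) https://example.com/$1 [R=301,L]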
Www to non-www redirect without loop still necessary?
Hi guys, my question is: is it still necessary to redirect the www version of your website to the non-www version, or the other way around? I ask because I feel that Google should be able to treat these as the same by now. Thank you in advance.
Technical SEO | gaben
Gabe -
Google webmaster and analytics access to others
Hi, is it OK and secure to give others restricted or user-level access to Google Webmaster Tools and Analytics? With this access, can anyone tamper with anything? Thanks.
Technical SEO | mtthompsons -
Link building to ROOT domain OR to WWW.?
Hello, here I come with one more 'sensitive' question, hoping that you SEO gurus could give some input. My title pretty much explains what I'm wondering about, but let me give you some short background. I have set in my .htaccess file that all traffic goes to WWW.mydomain.com. I know that it is 'better' for search engines not to have duplicate destinations, as that can mean decreased rankings because of 'double content'. For search engines, http://domain.com and http://www.domain.com are totally different domains. Now I'm wondering one thing: if I build several thousand backlinks at various sources (blogs, directories, web sites, etc.), shall I link to the domain ROOT or shall I include the WWW prefix? When looking at Moz Keyword Analysis for my domains, I can see a block about 'Linking Root Domains' and 'Page Linking Root Domains', but no 'www' variable (sub-domain) there. As I have already set up the canonical part so everything shows with WWW on my website, what logic shall I use when building backlinks? How will search engines treat the link juice, given what I wrote above? Thanks in advance, great forum!
Technical SEO | SEOisSEO -
How long for a sitewide 301 to reindex?
Hey gang, I finally joined the big boys here and am excited to see what we all can do together. Here is my situation. I have been struggling since Panda 1.0 on a particular site, www.burnworld.com. Over 2011 we figured out what the issues were with the content and went on a major cleanup. This seemed to help towards the end of 2011. However, further Panda updates this year, mainly in April, have struck again. This was after adding a WordPress blog to the site in late 2011, so it was a mix of a traditional HTML site and a WordPress blog. Thinking that this could be an issue, in May this year we transferred all the content over to WordPress only. We did keep the same linking structure, using a permalink plugin to set specific URLs. Fast forward to Panda 20: this wiped out all rankings, and then we could not even rank for our own content. One site that syndicates our content is now ranking for it instead of us, and many 'feed' sites that scrape our feeds also rank instead of us. Okay, now to my original question. Two weeks ago we pulled the plug and decided it might be best to start over on www.burnworld.net, since the .net had in the past been a WordPress blog (which was shut down earlier in 2012) and sat with about 5 pages of content until we did the 301. Today none of the pages are in the main index, and I am wondering if doing the 301 might have been a mistake, since it points to an existing site that never really ranked. Would it have been better to start on a new domain? How long have others seen it take before Google puts the pages back in the main index? I would like to figure out the best action to take to get back into Google's good graces. I'll keep this page updated so others with this issue can hopefully have a resource to turn to. BTW, nothing has changed with Bing/Yahoo; rankings are all the same and they have handled the domain change properly.
Technical SEO | robdawg -
301 redirects tanked our site on google - what now?
We had several hundred old pages on the site with duplicate content, and new pages with fresh info on the same topics, so I redirected the old pages to the new pages. The next day, plop, we're dumped off Google for almost every keyword. Dang, I thought they didn't want duplicate content and old funky pages. What did I do wrong, and what can I do to fix it? Thanks so much to anyone who can share their expertise. Jean
Technical SEO | JeanYates -
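For reference, a minimal sketch of the kind of page-level 301 described above (assuming Apache mod_alias; the paths and domain are placeholders):
# .htaccess: map each retired duplicate page to its fresh replacement
Redirect 301 /old-widgets-page.html https://www.example.com/widgets/
Redirect 301 /old-gadgets-page.html https://www.example.com/gadgets/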
Robots.txt and 301
Hi Mozzers, can you answer something for me please? I have a client who has 301-redirected the homepage '/' to '/home.aspx', so all or most of the link juice is being passed, which is great. They have also marked '/' as nofollow/noindex in the robots.txt file so it's not being crawled. My question is: if robots are being denied access to '/', is it still passing on the authority from the links that point to this page? It is a 301 and not a 302, so it would work under normal circumstances, but as the page is not being crawled, do I need to change the robots.txt so that '/' can be crawled? Thanks, Bush
Technical SEO | Bush_JSM -
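A minimal sketch of the robots.txt setup being described (assuming Google's support for the '$' end-of-URL anchor, which not every crawler honors). The crux of the question is that a bot blocked from fetching '/' never sees the 301 placed on it:
# robots.txt: blocks only the root URL '/' itself, not the rest of the site
User-agent: *
Disallow: /$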
How do I 301 URLs with numbers in them?
I have a number of 404 error pages showing in Webmaster Tools, and some of the URLs have numbers and % symbols, and some are PDFs. My usual 301 redirect in my .htaccess file does NOT redirect these pages whose URLs have special characters. What am I doing wrong?
Technical SEO | BradBorst
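A minimal sketch of one way such a redirect is often written (assuming Apache with mod_rewrite; the filenames are hypothetical). The usual gotcha is that in .htaccess the rewrite pattern is matched against the decoded path, so an encoded %20 has to be matched as a space rather than as the literal characters '%20':
RewriteEngine On
# Hypothetical 404: /downloads/price%20list%202012.pdf (the %20s arrive as spaces)
RewriteRule ^downloads/price\slist\s2012\.pdf$ https://www.example.com/downloads/price-list.pdf [R=301,L]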