Redirecting https pages
-
If I add 301 redirects from the https versions of the root domain to http://www.website.com (the canonical version), will this prevent users from being able to access the https version? I believe the https version is only accessed once users log into the site.
Thank you!
-
To directly answer your question, Erin: yes, a 301 redirect will prevent users from accessing the https versions of your pages, and it is not the recommended approach.
Is there a reason you want to prevent users/search engines from accessing the https versions of your pages?
Simply ensuring that all links within the site navigation point to http versions, and setting a rel=canonical on all https versions back to the http versions, should do the trick.
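For reference, a canonical tag on an https page might look like this (www.website.com stands in for your actual domain):

```html
<!-- In the <head> of https://www.website.com/some-page -->
<link rel="canonical" href="http://www.website.com/some-page" />
```

This tells search engines to treat the http version as the one to index, without blocking users from reaching the https version.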
Best,
Mike
-
What you want to do is set up the redirect for all pages except those pages where you want to require a person to use https.
As an example, on a site I work on we have two areas, /cart/ and /account/, that represent when someone is checking out or when they are logged into their account and want to update payment options, respectively. You would exclude these folders from the https-to-http 301 redirect so that users could use those parts of the site in secure mode. The rest of the site you want to 301 from https to http. The reason you go through all this is that the http and https versions of the site, if spidered, would be considered duplicate content, and you want to prevent that.
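As a rough sketch of what that exclusion could look like in an Apache .htaccess file (assuming /cart/ and /account/ are the secure areas; substitute your own paths):

```apache
RewriteEngine On

# Send any https request outside /cart/ and /account/ back to http
RewriteCond %{HTTPS} on
RewriteCond %{REQUEST_URI} !^/cart/ [NC]
RewriteCond %{REQUEST_URI} !^/account/ [NC]
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```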
The other part of this is that you usually do not want the search engines to spider the shopping cart and user login sections of a site. Put nofollow on links that lead to those pages, noindex the pages themselves, and also disallow those folders in robots.txt; that will keep the bots out of there.
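A minimal robots.txt along those lines (again assuming /cart/ and /account/ are the secure folders) might be:

```text
User-agent: *
Disallow: /cart/
Disallow: /account/
```

You would pair this with a `<meta name="robots" content="noindex, nofollow">` tag on the pages themselves.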
One other thing: make sure that your templates and content within the https sections of the site link out to the non-https URLs. The 301 will help with this, but why link to the wrong URL anyway?
All of that said, if your site is one that deals with highly sensitive information (medical and financial come to mind), then you may simply want to run the whole site over https. You would need to bulk up your server resources to handle this, as https can slow things down a little, but it can be done.
-
It is going to be a little more difficult than that. Unless you have a wildcard certificate, your SSL certificate is not valid for subdomains, and www.site.com is considered a subdomain of site.com. That being said, I would use URL rewriting to handle the problem. Here is a small example of what you could use.
Rewrite rules for site.com:

RewriteEngine On
RewriteBase /

# Redirect from site.com to www.site.com
RewriteCond %{HTTP_HOST} ^site\.com [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]

# Turn SSL on for payments
RewriteCond %{HTTPS} off
RewriteCond %{SCRIPT_FILENAME} /payments\.php [NC]
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]

# Turn SSL off for everything but payments
RewriteCond %{HTTPS} on
RewriteCond %{SCRIPT_FILENAME} !/payments\.php [NC]
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]

You will have to add the other pages that you want to be SSL too, but this should get you started.
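For example, to require SSL on several pages at once, the payment conditions could be chained with [OR], with each page also excluded from the SSL-off rule (checkout.php and account.php here are hypothetical names; use your own):

```apache
# Turn SSL on for any of the sensitive pages
RewriteCond %{HTTPS} off
RewriteCond %{SCRIPT_FILENAME} /payments\.php [NC,OR]
RewriteCond %{SCRIPT_FILENAME} /checkout\.php [NC,OR]
RewriteCond %{SCRIPT_FILENAME} /account\.php [NC]
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]

# Turn SSL off everywhere else
RewriteCond %{HTTPS} on
RewriteCond %{SCRIPT_FILENAME} !/payments\.php [NC]
RewriteCond %{SCRIPT_FILENAME} !/checkout\.php [NC]
RewriteCond %{SCRIPT_FILENAME} !/account\.php [NC]
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]
```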