HTTPS and HTTP both exist! How to handle?
-
I was asked to do some SEO work for a website and learned that just 6 weeks ago, their webmaster added an HTTPS instance of the site. Their backlinks all point to HTTP and the 6 pages that are already ranking are all on the HTTP site. I'm afraid to rock the boat by redirecting the site from HTTP to HTTPS as we may lose rank.
What are some suggestions? If I just pull down the HTTPS site, will that hurt us? Would you just go ahead and redirect it? If so, would you do each page individually or the site as a whole?
-
Hey there,
The sooner you redirect HTTP to HTTPS, the better. You're going to accumulate more backlinks and rankings over time, so the migration will only get harder the longer you wait.
Also, try to convince the sites that already link to your HTTP pages to change their links to HTTPS.
And then, of course, redirect each page individually, as you said.
Hope it helps, Martin
-
Hi,
There should be no effect on rankings as long as your HTTP pages 301-redirect to their HTTPS equivalents. For the second part of your question, see the thread below.
https://mza.seotoolninja.com/community/q/proper-301-redirect-code-for-http-to-https
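For reference, here's a minimal sketch of a site-wide version, assuming an Apache server with mod_rewrite enabled (nginx or IIS need their own equivalent directives):

```
# .htaccess: 301 every HTTP request to the same path on HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

One pattern rule like this covers every URL on the site, so there's no need to write out redirects page by page.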
Thanks
-
Related Questions
-
Combining products - edit existing product page or 301 redirect to new page?
We want to combine existing products - e.g. 'hand lotion' and 'body lotion' will become 'hand & body lotion'. As such, we'll need to combine the two product pages into one. What would be the best route to take in terms of SEO? My initial reaction is to create a new product page and then 301 or 302 redirect the old products to the new product page, depending on whether the change is permanent or temporary. Would you agree? Or am I missing something?
On-Page Optimization | SwankyApple
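If the change is permanent, the 301s themselves are one line per retired page. A hypothetical Apache sketch (the paths are invented for illustration):

```
# Point both retired product pages permanently at the combined page
Redirect 301 /products/hand-lotion /products/hand-and-body-lotion
Redirect 301 /products/body-lotion /products/hand-and-body-lotion
```

A common alternative is to edit one of the existing pages into the combined page and redirect only the other, which keeps one URL's history intact; either route works if the redirects are permanent.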
How to handle "app" pages.
Hey guys, We've got an app - a drag & drop email builder - and we're looking to improve our SEO efforts. That said, we're not sure how to treat pages of the app that would tell Google basically nothing (loads of duplicate content, lorem ipsum, etc.). They're the pages our clients use to build their own templates - builder pages. They're extremely useful for our clients, but Google probably wouldn't make much sense of them. Rather randomly, before we got around to noindex/nofollowing them, some of them started ranking (probably thanks to the really strong engagement data we have on them: loads of clients, loads of time spent on page, etc.). Can we harness them in a better way, or should we just noindex/nofollow them? I don't really see how they can be canonicalised, since they don't provide any quality content for Google - much like Moz's Keyword Explorer tool, for example: hugely valuable to users, but not a Google favourite content-wise. Thanks for your help 🙂
On-Page Optimization | andy.bigbangthemes
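If you do decide to keep the builder pages out of the index, the usual mechanism is a robots meta tag in the builder-page template; a minimal sketch (noindex does the real work - whether you also want nofollow depends on whether those pages link anywhere worth crawling):

```
<!-- In the <head> of every builder page: keep the page out of the index -->
<meta name="robots" content="noindex, nofollow">
```

An `X-Robots-Tag: noindex` HTTP header does the same job if editing the template is awkward.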
Moving Site from HTTP to HTTPS
Hi, So the news is that Google has started giving more importance to sites with HTTPS, i.e. it is now a ranking signal. Google says that for now it affects fewer than 1% of global queries and carries less weight than other signals such as high-quality content, but it may decide to strengthen it over time, as it wants to encourage all website owners to switch from HTTP to HTTPS and keep everyone safe on the web. In that case, what should we do? Switching from http:// to https:// means the URLs change, with a possible drop in traffic. How do we cope with that? Do we have to implement 'n' number of redirects? Regards,
On-Page Optimization | IM_Learner
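On the 'n redirects' worry: you don't need one rule per URL. A single pattern-matching rule (like the .htaccess sketch earlier on this page) maps every HTTP URL to its HTTPS twin, and once it's live you can spot-check any page with curl (example.com here is a placeholder; this is the response you'd expect to see):

```
$ curl -I http://www.example.com/some-page
HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/some-page
```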
404 errors in WordPress... the pages have never existed, so why is Google trying to crawl them?
I've just logged into Webmaster Tools and have over 100 404 errors. I'm running WordPress, and I recently added child pages to 2 of my categories, like so: www.mydomain.com/category1/lincolnshire, www.mydomain.com/category1/cambridgeshire, etc. The 404 errors, though, are for pages and categories I've never created. I have over 20 root categories but decided to test adding child pages to only two of them. The 404 errors are for www.mydomain.com/category5/cambridgeshire ... It seems that Google has tried to crawl pages that don't exist. Can anyone explain what's going on? When I click 'linked from' in Webmaster Tools, it shows links from pages on my site that don't exist either.
On-Page Optimization | SamCUK
What's the best way to handle crawling of photo gallery?
When you have a photo gallery with many search filters and loads and loads of pages, is it best to block all the filters and use Google's pagination markup? E.g.: http://photo.net/gallery/photocritique/filter This site has pages for many different queries. While the page titles are unique, the pages show duplicate content.
On-Page Optimization | cakelady
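A sketch of both pieces, assuming the filtered views all live under the /filter path from your example and that pagination uses a page parameter (both are assumptions - adjust to your actual URL scheme):

```
# robots.txt -- keep crawlers out of the filtered views
User-agent: *
Disallow: /gallery/photocritique/filter

<!-- in the <head> of, say, page 3 of the gallery: Google's pagination markup -->
<link rel="prev" href="http://photo.net/gallery/photocritique?page=2">
<link rel="next" href="http://photo.net/gallery/photocritique?page=4">
```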
Will Google handle "this not that" pages differently?
If you create pages about "try keyword1, not keyword2", will there be any barriers to getting the pages ranked for keyword2? Example: you have furnished rental units in a small town, and you offer nightly/weekly rentals. You want to rank for "town hotel" since you offer the same service as a hotel. Since you're not really a hotel, you create a page called "Better than a hotel: Town nightly rental units". Does anyone know whether Google has an algorithm to detect this (it would have to infer the meaning of the words you're using and recognise that you're promoting something other than a hotel), determine you're not really relevant to "town hotel", and so not rank you well? I suspect it does not, as I've seen things like "Google AdSense alternatives" articles ranking well for the term Google AdSense, and "boycott GoDaddy" sites ranking well for the term GoDaddy. But I would like to hear any evidence or facts others know of.
On-Page Optimization | AdamThompson
How would you handle network header links?
Some companies have a lot of sites covering various topics, for example http://ninemsn.com.au/. Each category also has dropdown menus with more links, taking their pages to well over 100 links. Should these headers be implemented in JavaScript? Is there a list of best practices somewhere for dealing with a lot of network sites? I'd prefer to reduce the number of links, but sometimes company policy doesn't allow this. Any suggestions or tips would be helpful.
On-Page Optimization | bigpond
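If policy won't let you trim the links, one mitigation sometimes used is to inject the shared network header with JavaScript after the page loads, so the links aren't in the HTML that gets crawled. A rough sketch (the element ID and nav URL are invented, and bear in mind search engines keep getting better at executing JavaScript, so treat this as mitigation rather than a guarantee):

```
<div id="network-nav"></div>
<script>
// Fetch the shared network header and inject it after the page renders
var xhr = new XMLHttpRequest();
xhr.open('GET', '/network-nav.html');
xhr.onload = function () {
  document.getElementById('network-nav').innerHTML = xhr.responseText;
};
xhr.send();
</script>
```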
How do we handle sitemaps in robots.txt when multiple domains point to same physical location?
we have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we swap the text returned to the client depending on which domain was requested. My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt is used by multiple domains? If I, for instance, put the rows
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
On-Page Optimization | nordicnetproducts
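One way to sidestep the shared-file problem entirely is to serve a different robots.txt per host. A hypothetical Apache mod_rewrite sketch for the site's .htaccess (the robots-*.txt file names are made up):

```
# Serve a host-specific robots file so each domain lists only its own sitemap
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.mysite\.se$ [NC]
RewriteRule ^robots\.txt$ /robots-se.txt [L]
RewriteCond %{HTTP_HOST} ^www\.mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ /robots-fi.txt [L]
# Any other host falls through to the default robots.txt (the .net version)
```

For what it's worth, the sitemaps.org protocol treats a robots.txt Sitemap line pointing at another host as its sanctioned cross-submission mechanism, so the shared file shouldn't produce an outright error; per-host files just keep each domain's picture unambiguous.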