HTTP & HTTPS canonicalization issues
-
Howdyho
I'm doing SEO for a daily deals site that runs mostly on https (only the home page is on http), and I'm wondering what to do about canonicalization. IMO it would be easiest to run all pages on https.
But the few resources I can find aren't very clear. For instance, this YouMoz blog post claims that https is only for humans, not for bots! That doesn't really apply anymore, right?
-
Cool, thanks for your backup on this. I figured the bot redirect described there would be a bit over the top. And it's great to know that having an http home page on an otherwise https domain is a non-issue. Much appreciated!
-
I wouldn't redirect just bots to http instead of https. That's a lot of work and honestly not worth the effort. I've worked on plenty of sites that require https across the entire site, and Google crawls them with no issue.
As for your website, I wouldn't worry too much about http vs. https for the canonical. However, if you can, I would keep it consistent at the page level. It would be fine to say that the canonical for the about page is https://www.domain.com/about but the canonical for the home page is http://www.domain.com. I've had to do this on a number of ecommerce sites where the product pages are https but the rest of the site is http.
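A quick illustration of that page-level setup (www.domain.com is just the placeholder used above):

    <!-- On the about page, served over https -->
    <link rel="canonical" href="https://www.domain.com/about" />

    <!-- On the home page, served over http -->
    <link rel="canonical" href="http://www.domain.com/" />

Each page simply points at the protocol it is actually served on, so the canonicals stay consistent page by page.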
I hope that helps.
Related Questions
-
Duplicate Footer Content Issue
Please check the screenshot URL below. As per the screenshot, we are using the highlighted content throughout the footer section of our website (https://www.mastersindia.co/). Please tell us how Google will treat this content. Will Google count it as duplicate content or not? And what is the solution in case Google does treat it as duplicate content? Screenshot URL: https://prnt.sc/pmvumv
Technical SEO | AnilTanwarMI -
Issues with getting a web page indexed
Hello friends, I am finding it difficult to get the following page indexed in search: http://www.niyati.sg/mobile-app-cost.htm It was uploaded over two weeks back. For indexing and troubleshooting, we have already done the following: the page is hyperlinked from the site's inner pages, a few external websites and Google+; submitted the URL to Google (through the Submit URL option); used the 'Fetch and Render' and 'Submit to index' options in Search Console (WMT); added the URL to both the HTML and XML sitemaps; checked for any crawl errors or Google penalties (page and site level) in Search Console; checked the meta tags, robots.txt and .htaccess files for any blocking. Any idea what may have gone wrong? Thanks in advance!
Technical SEO | RameshNair
Ramesh Nair -
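For the indexing question above, the two blockers that checklist covers look roughly like this when they are present (hypothetical examples, not taken from the actual page):

    # In robots.txt: a rule like this would stop Googlebot fetching the page
    User-agent: *
    Disallow: /mobile-app-cost.htm

    <!-- In the page's <head>: a tag like this would keep it out of the index -->
    <meta name="robots" content="noindex">

If neither is present and the page returns a normal 200 response, the remaining factors are usually crawl priority and time.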
Migration to https
Hi there, For several reasons we are considering switching from http to https. My questions about this: Does this change impact organic search results, since the URLs change? Is a simple 301 at the highest level enough to keep all of our positions for every page? Are there any other possible issues we should think about before deciding? I'm talking about a webshop with over 50k indexed pages and lots of marketing channels all set up based on the http URL structure. Thanks in advance.
Technical SEO | MarcelMoz
Marcel -
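For the migration question above, a minimal sketch of the kind of site-wide 301 it describes, assuming Apache with mod_rewrite (the exact placement depends on the actual server setup):

    # Redirect every http request to its https equivalent, preserving the path
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]

Because the host and path carry over one to one, each old http URL maps to exactly one https URL, which is what lets the redirects pass signals page by page rather than only at the domain level.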
How to do IP canonicalization?
Hi, my website also opens via its IP address. I think that's duplicate content for Google. Only the home page opens via the IP, no other pages. How can I fix it? I might be able to do it using .htaccess, but I don't know the proper code for this. The website is on the WordPress platform. Thanks Ramesh
Technical SEO | unibiz -
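A minimal sketch of the .htaccess approach the IP canonicalization question above is asking about, assuming Apache with mod_rewrite (the IP and hostname below are placeholders):

    # Redirect requests that arrive on the raw IP to the canonical hostname
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^203\.0\.113\.10$
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

The condition only fires when the Host header is the bare IP, so normal requests to the domain pass through untouched.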
Development Website Duplicate Content Issue
Hi, We launched a client's website around 7th January 2013 (http://rollerbannerscheap.co.uk). We originally built the website on a development domain (http://dev.rollerbannerscheap.co.uk), which was active for around 6-8 months (the dev site was unblocked from search engines for the first 3-4 months, but then blocked again) before we migrated dev --> live. In late January 2013 we changed the robots.txt file to allow search engines to index the website. A week later I accidentally logged into the DEV website and also changed its robots.txt file to allow search engines to index it. This obviously caused a duplicate content issue, as both sites were identical. I realised what I had done a couple of days later and blocked the dev site from the search engines with the robots.txt file. Most of the pages from the dev site had been de-indexed from Google apart from three: the home page (dev.rollerbannerscheap.co.uk) and two blog pages. The live site has 184 pages indexed in Google, so I thought the last three dev pages would disappear after a few weeks. I checked back in late February and the three dev site pages were still indexed in Google. I decided to 301 redirect the dev site to the live site to tell Google to rank the live site and to ignore the dev site content. I also checked the robots.txt file on the dev site and this was blocking search engines too. But the dev site is still being found in Google wherever the live site should be found. When I do find the dev site in Google it displays this: Roller Banners Cheap » admin dev.rollerbannerscheap.co.uk/ A description for this result is not available because of this site's robots.txt – learn more. This is really affecting our client's SEO plan and we can't seem to remove the dev site or rank the live site in Google. In GWT I have tried to remove the subdomain. When I visit Remove URLs, I enter dev.rollerbannerscheap.co.uk but it then displays the URL as http://www.rollerbannerscheap.co.uk/dev.rollerbannerscheap.co.uk. I want to remove a subdomain, not a page. Can anyone help please?
Technical SEO | SO_UK -
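One detail in the situation above is worth spelling out: while dev.rollerbannerscheap.co.uk is blocked by robots.txt, Googlebot cannot fetch those URLs at all, so it never sees the 301s that are meant to clean them up (which is exactly what the "blocked by robots.txt" snippet in the search result is saying). A minimal sketch, assuming Apache with mod_rewrite on the dev host and using the hostnames from the question:

    # On the dev subdomain: 301 everything to the equivalent live URL.
    # The dev robots.txt must NOT disallow crawling, or Googlebot will
    # never discover these redirects.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^dev\.rollerbannerscheap\.co\.uk$ [NC]
    RewriteRule ^(.*)$ http://rollerbannerscheap.co.uk/$1 [R=301,L]

Once the redirects have been crawled and the dev URLs have dropped out, the robots.txt block can be reinstated if needed.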
CSS Issue or not?
Hi Mozzers, I am doing an audit for one of my clients and would like to know whether the website I am dealing with actually has any issues when CSS is disabled. So I installed the Web Developer Google Chrome extension, which is great for disabling cookies, CSS and so on. When executing "Disable CSS", I can see most of the content of the page, but what is weird is that in certain sections images appear in the middle of a sentence. Another image appears in the background in one of the internal link sections (attached pic). Since I am not an expert in CSS, I am wondering if this represents a CSS issue and therefore a potential SEO issue? If yes, why can it be an SEO issue? Can you tell me what sort of CSS issues I should expect when disabling it? What should I look at - whether the content and nav bar are present, or something else? Thank you. (Attachment: dBCvk.png)
Technical SEO | Ideas-Money-Art -
Backlink density & disavow tool
I am cleaning up my backlink profile for www.devoted2vintage.co.uk, but before I start removing links I wanted some advice on the following: I currently have over 2000 backlinks from about 200 domains. Is this a healthy ratio or should I prune it? Is there a recommended maximum number of backlinks per domain? Should I delete links to all or some of the spun PR articles (some of the article web pages have over 40 articles with links back to us)?
Technical SEO | devoted2vintage -
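For the disavow side of the question above, a minimal sketch of the plain-text file format Google's disavow tool accepts (the domain and URL are placeholders, not a recommendation of what to disavow):

    # Lines starting with # are comments and are ignored
    # Disavow every link from an entire domain
    domain:spun-article-directory.example
    # Disavow a single page
    http://spun-article-directory.example/page-with-links.html

Domain-level lines cover every page on that domain, including link pages that have not been found yet.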
HTTPS attaching to home page
Hi!! Okay - weird tech question. The domain is http://hiphound.com. I have SSL applied to the checkout and my-account pages; tested, and it works well. The issue: I am able to reach the home page at both https://hiphound.com AND http://hiphound.com. If I access the home page via HTTPS and click on a link (any link), the site is redirected to HTTP again, which is good. My concern is the home page displaying via both HTTPS and HTTP. Is this an issue that can be resolved, or is it expected behavior I have to live with? I am being told by dev that there is nothing they can do about it, but I want to understand why and whether they are correct. Thoughts? Thank you!! Lynn
Technical SEO | hiphound
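For the situation above, a minimal sketch of one way to collapse the two home page versions, assuming Apache with mod_rewrite and assuming http is meant to remain the preferred version of the home page (hostname taken from the question):

    # Send requests for the https home page back to the http home page
    RewriteEngine On
    RewriteCond %{HTTPS} on
    RewriteRule ^$ http://hiphound.com/ [R=301,L]

A rel=canonical on the home page pointing at the preferred version would address the duplicate-content side as well; either way, the two versions stop competing with each other.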