How to Resolve Duplication of HTTPS & HTTP URLs?
-
Right now I am working on an eCommerce website, Lamps Lighting and More.
Both versions of the URL are reachable on the site, as follows:
HTTP Version:
http://www.lampslightingandmore.com/
HTTPS Version:
https://www.lampslightingandmore.com/
I checked one of my competitors, who has implemented a canonical tag on both versions of their pages. Please view the source code of both URLs to see it.
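(For anyone reading along: I can't paste my competitor's exact markup here, but this kind of canonical is normally a single line in the <head> of both the HTTP and the HTTPS copy of a page, something like the sketch below, using the site's own URL.)

<link rel="canonical" href="http://www.lampslightingandmore.com/" />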
Then I checked how the SEOmoz website itself handles this.
Why not check SEOmoz? They provide some of the best SEO information around, so they are probably following best practice for dealing with HTTPS & HTTP. LOL
I tried to load the following URL, and it redirects to the home page:
https://www.seomoz.org is redirecting to http://www.seomoz.org
But the following URL neither redirects anywhere nor sets a canonical tag:
https://www.seomoz.org/users/settings
I can see the following rules in http://www.seomoz.org/robots.txt:
User-agent: *
Disallow: /api/user?*
So I am quite confused about how to solve this issue. Which is best: a 301 redirect or a canonical tag? A live example would be great and would make me more confident.
-
I have set up robots.txt files for both the HTTP and HTTPS versions. You can find both files above your response. Thanks for your answer.
-
Our solution to this was to make sure we had a canonical tag on each and every page pointing to the http:// version.
Secondly, https:// was only made available after logging in.
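Roughly, the second part can be handled with a rewrite rule. This is only a sketch assuming Apache with mod_rewrite, and the cookie name is a placeholder for whatever your platform actually sets at login:

# Send https requests back to http unless the visitor has a login cookie
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteCond %{HTTP_COOKIE} !logged_in [NC]
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]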
-
Yep
-
Now it looks fine... right?
-
You are right. I was left genuinely confused after reading an article about duplication; I checked my website, found both HTTPS and HTTP pages, and started asking questions in that direction.
-
So, what about the canonical tag? I am still confused by it. What is the final conclusion? I have already made it live on one website after getting the suggestion here.
Any eCommerce experience would help me understand this better. What is the best solution in my case? My goal is to remove duplication from the website and improve the crawl rate.
-
I believe you're mixing things up, honestly. 1st > choose a canonical version for your site (www or non-www). Sometimes absolute URLs can cause problems for the HTTPS version of a site. 2nd > consider whether you really want the HTTPS version indexed... If not, add a noindex tag or block it via robots.txt. If yes, use the HTTP URL of the page as the canonical on the HTTPS version.
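To make the second point concrete, here is a minimal sketch of the two options as they would appear on an https page (the URLs are illustrative). You would pick one or the other: a noindexed page should not also carry a canonical pointing elsewhere, since the signals conflict.

<!-- Option A: keep the https copy out of the index -->
<meta name="robots" content="noindex, follow">

<!-- Option B: point the https copy at its http twin -->
<link rel="canonical" href="http://www.example.com/some-page.html">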
-
I would use noindex for the HTTPS version of the site, or block it via robots.txt, if I didn't want it to be indexed.
-
I want to add a similar thought to this question.
http://www.lampslightingandmore.com/
https://www.lampslightingandmore.com/
I made the canonical tag live after the discussion here, but I am confused about relative vs. absolute URLs.
I am using absolute URLs in the canonical tag, but the website itself uses relative URLs.
So, does this create any issue or reduce the benefit of the canonical tag?
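To show exactly what I mean (the product path below is just an illustration, not a real page on my site):

<!-- my canonical tags use absolute URLs -->
<link rel="canonical" href="http://www.lampslightingandmore.com/table-lamps.html">

<!-- but the site's ordinary links are relative -->
<a href="/table-lamps.html">Table Lamps</a>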
-
Yes, I don't want my HTTPS pages crawled, and I don't want HTTPS and HTTP pages creating duplication.
-
My question is along the same lines. So why has Wayfair set a canonical tag on their website?
-
But you don't want your HTTPS pages crawled if the same version is available over HTTP. This is mostly a technical issue, but crawling an HTTPS site is considerably more expensive for both the bot and the server.
-
How to Resolve Duplication of HTTPS & HTTP URLs?
Neither a redirect nor a canonical tag is necessary.
HTTP, HTTPS, FTP, etc. are different protocols used to access the information held on your web server. The data itself is only stored once; you can simply reach it through these various protocols. It is not a duplication of data and will not cause any SEO issues.
-
A 301 redirect doesn't exclude a canonical tag. If you just want to use one solution, use the 301. There was a YouMoz post about exactly this topic a while ago; have a look at it.
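For reference, a minimal sketch of such a 301 (assuming Apache with mod_rewrite; other servers have their own equivalents):

# Send every request that arrives over https to the same path on http
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]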