HTTP vs HTTPS in og:url
-
Hi,
Recently we migrated our website from http:// to https://. Every URL is now https://, and we have used 301 permanent redirects to send the old URLs to the new ones.
We have planned to include the http:// link in og:url instead of https:// due to some social share issues we are facing. My concern is: if Google finds the self-referencing http:// URL on every page of my blog, will Google get confused between http:// and https://, since we are providing the old URL to Google for crawling?
Please advise.
Thanks
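To illustrate the setup being asked about (example.com and the post slug are placeholders), the head of each page would end up looking roughly like this, with og:url deliberately left on http:// while the canonical and live URL are https://:

```html
<!-- Live page: https://example.com/sample-post/ -->
<link rel="canonical" href="https://example.com/sample-post/">
<!-- og:url kept on http:// to preserve existing share counts -->
<meta property="og:url" content="http://example.com/sample-post/">
```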
-
Shareaholic and Social Warfare (paid) plugins are the ones I'm familiar with, Sameer. There is also a paid add-on for Sassy Social Share plugin that will accomplish this too. Others may have also added this capability.
P.
-
Thanks for sharing the informative answers. @ThompsonPaul, which WordPress plugin can we use to preserve the Facebook share count?
-
Yea, this is a stupid screwup by Facebook because they won't fix their system to simply recognise that the HTTP and HTTPS URLs are the same. (Others like Google+ got this figured out ages ago.)
Mentioning the HTTP URL in your OG data won't do any harm. It's the same as all the other websites out there with previous links still pointing to your old HTTP address. The 301-redirect and the new HTTPS sitemap will give crawlers an overwhelming directive about the correct URLs to index. (And in fact, the OG URL isn't even technically a link, so crawlers likely aren't even following it.)
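(For reference, the site-wide 301 redirect described above is typically a one-rule rewrite. A minimal Apache .htaccess sketch, assuming mod_rewrite is enabled — exact syntax varies by server setup:)

```apache
# Send every HTTP request to its HTTPS equivalent with a permanent 301
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```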
As Martijn says though - you're kind of in limbo. The pages will show the old counts, but won't be aggregating the new counts. There are some plugins for WordPress that purport to be able to combine both, but I haven't used one yet.
Paul
-
What I'm guessing is that Sameer has a significant number of URLs with a ton of shares, so some social share counts might be off because of the migration. Anyway, what you could evaluate is whether you can change the OG URLs for new articles only. In the end HTTPS is the new reality for you, so you might be best to just suck it up and change it to HTTPS. Likely the impact will be low.
-
I think you probably know what I am going to say!
If you put in the HTTP version of the URL, you are sending conflicting signals to Google by telling them that both pages exist. By sharing the HTTP links on socials you are also creating needless redirects. There is still conjecture as to whether a 301 causes any loss of link juice, but personally I'm with Rand Fishkin that a small amount of link juice is lost by having a redirect in place (others - this is not the place to argue about this!), so I would make it as clear as possible, without any HTTP mentions on the page whatsoever.
I'm not sure what you mean about 'social share issues', but maybe I can help? What are they?
Any SEO audit software will tell you the same - if you move to HTTPS then use the correct URL, otherwise the software will scream with warnings.
Regards
Nigel
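(Editorially, the audit check Nigel describes is easy to approximate yourself. A minimal sketch, standard library only — the function name and the idea of comparing og:url against rel=canonical are illustrative, not from any particular audit tool:)

```python
from html.parser import HTMLParser


class HeadURLParser(HTMLParser):
    """Collects the og:url meta content and the rel=canonical href from a page."""

    def __init__(self):
        super().__init__()
        self.og_url = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("property") == "og:url":
            self.og_url = a.get("content")
        elif tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")


def og_canonical_mismatch(html):
    """Return (og_url, canonical) when the two disagree, or None when they match
    or either tag is missing."""
    parser = HeadURLParser()
    parser.feed(html)
    if parser.og_url and parser.canonical and parser.og_url != parser.canonical:
        return (parser.og_url, parser.canonical)
    return None
```

Run against each page's HTML, it flags exactly the HTTP-in-og:url situation discussed in this thread.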