Moving from http to https: duplicate image issue?
-
Hello everyone,
We have recently moved our entire website virtualsheetmusic.com from http:// to https:// and now we are facing a question about images.
Here is the deal: all webpage URLs are properly redirected to their corresponding https versions when they are requested via former http links. However, due to compatibility issues, all image URLs can be called via either http or https, so both of the following URLs work without any redirect:
http://www.virtualsheetmusic.com/images/icons/ResponsiveLogo.png
https://www.virtualsheetmusic.com/images/icons/ResponsiveLogo.png
Please note though that all internal links are relative and not absolute.
So, my question is: can that be a problem from the SEO standpoint? In particular: we have thousands of images indexed on Google, mostly preview images of our digital sheet music files, and many of them are ranking pretty well in the image pack search results. Could this change be detrimental in some way, or does it make no difference in the eyes of Google? As I wrote above, all internal links are relative, so an image tag like this one:
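For example, using the logo file mentioned above, the markup is something along these lines (illustrative, not copied verbatim from our page):
<!-- illustrative relative-path image tag; no protocol or host is hard-coded -->
<img src="/images/icons/ResponsiveLogo.png" alt="Virtual Sheet Music logo">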
That tag hasn't changed at all; it is just loaded in an https context.
I'll wait for your thoughts on this. Thank you in advance!
-
No problem
-
Great! Glad to know that. Thank you Dimitrii, I appreciate your help very much!
-
Oh, I see. Yeah, there shouldn't be any problems if someone else links to your images with http. And yes, your assumption is correct.
-
Thank you Dimitrii for clarifying. Actually, all our webpages now load images only via https://, but since many external websites are hard-linking to many of our images via the regular http:// protocol, I was thinking of allowing them to be linked the "insecure" way when requested. Do you see my point? So, to better clarify my initial question, let's say Google is spidering one of those external affiliates and finds an image tag like this:
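Something along these lines (illustrative markup, not the affiliate's actual code):
<!-- illustrative absolute link from an external site, using the plain http protocol -->
<img src="http://www.virtualsheetmusic.com/image.jpg" alt="sheet music preview">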
Will Google consider the image found at:
http://www.virtualsheetmusic.com/image.jpg
a duplicate of:
https://www.virtualsheetmusic.com/image.jpg
?? This was my original question...
In any case, I did some testing today, and I was able to permanently redirect (301) all images to https:// via .htaccess. It looks like even when an image is requested over http:// from the browser, it shows up correctly, because the web browser handles redirects for images the same way it handles them for the web page itself.
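A rule along these lines does the job (this is a simplified sketch, not necessarily the exact rule in our .htaccess):
# Sketch: send image requests made over plain http to their https equivalents (assumes mod_rewrite)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule \.(png|jpe?g|gif)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]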
So... my concern should be solved this way. But in case, for any reason, I need to be able to serve the same image over both protocols (http or https), it is my understanding that that shouldn't be an issue anyway. Is my assumption correct?
Thanks again.
-
I did a quick search, and there are lots of good articles about why images are not duplicate content: http://bfy.tw/9Qy4
-
So, the reason I recommend loading images through only one protocol is the "insecurity" of an https connection when any resources on the page are not loaded over https. You might have seen that sometimes, instead of a green lock in the browser bar, it shows a yellow exclamation mark - that's one of the reasons. And it's also just cleaner if everything is loaded the same way.
Here is a link to a resource about mixed content: https://developers.google.com/web/fundamentals/security/prevent-mixed-content/fixing-mixed-content
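In short, on an https page a tag like the first one below triggers the mixed content warning, while the second one doesn't (illustrative markup):
<!-- an http resource embedded in an https page is flagged as mixed content -->
<img src="http://www.example.com/image.jpg" alt="insecure image">
<!-- loading the same resource over https (or via a relative path) avoids the warning -->
<img src="https://www.example.com/image.jpg" alt="secure image">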
-
Thank you Dimitrii for your reply.
Well, your two statements above contradict each other, in my opinion. You see, what really concerns me is your last suggestion:
"it's better to make sure that images (and all the other resources) are available only through one protocol - http or https."
And hence my original concern. Why should we make sure that images are available only through one protocol if you first say that there is no such thing as duplicate content for images? Why should we be concerned about that then?
Sorry for the further request for clarification. I really appreciate your help!
-
Howdy.
As far as I understand, there is no such thing as duplicate content just for images. Duplicate content applies more to the page as a whole. Especially since you guys redirected all the links, you shouldn't have any problems, since Google will simply "realize" the change.
Now, it's better to make sure that images (and all the other resources) are available only through one protocol - http or https.
Hope this helps