Proper CDN Implementation
-
I am working with a site that uses the Cloudflare CDN to deliver its images. The problem is that the image URLs are all Cloudflare URLs (cloudflare.com/example-filename.png). We want to use a CNAME to move the image URLs onto a subdomain of the client's site to take advantage of the SEO strength.
Currently, they get a high percentage of traffic from image search and Google is directing users to their site when returning images hosted on Cloudflare, so I am apprehensive about making changes. I'm curious if anyone has any experience with this or has any insight on how Google will handle it.
Will the longer-term benefits outweigh any short-term drop in traffic from image search? Will there be a drop in image search at all?
Any thoughts or experiences are appreciated.
Thanks in advance.
-
Hi Lisa
This is interesting.
As far as I can tell, everything is configured 100% correctly.
A few questions:
1. Do you keep all images in /sites/default/files/? Were all (or most) of your errors related to items using that path?
2. This could be a downtime issue. Did you experience any availability problems?
3. Do you see any changes in the error report (different results for the same URL)? If so, that makes my downtime theory more plausible.
Also, the site loaded VERY slowly for me.
I ran Google PageSpeed and saw that you have an issue with your cache headers (a short expiration time that makes it harder to leverage local and CDN caching capabilities).
This should be looked at. Beyond the immediate UX and SEO implications, it can contribute to server load and have a negative effect on the website's availability (i.e., the downtime mentioned above).
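For reference, a far-future expiry policy might look like this on the origin server. I'm assuming an Apache origin with mod_expires enabled here; the thread doesn't say what server software the site runs, so treat this as a sketch:

```apache
# Sketch: long-lived expiry for static images so browsers and the
# CDN can serve cached copies instead of re-requesting from origin.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png  "access plus 1 month"
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresDefault "access plus 1 hour"
</IfModule>
```

The exact lifetimes are placeholders; the point is that a longer expiry lets the CDN absorb repeat requests instead of your server.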
Best
Igal
-
Igal-
Related question. We just moved to a CDN for our images. Our site is www.mnn.com and our CDN is http://images.mnn.com/. The only thing we're allowing Google to crawl is the images, so we shouldn't have any duplicate content issues.
However, what I AM seeing is several thousand warnings in Google Webmaster Tools for our video sitemap. The sitemap contains about 618 videos, but we have 2,472 errors citing "Sitemap contains urls which are blocked by robots.txt", which I believe is related to the video icons in search.
When I look for one of our videos in Google, I still see our icons next to the results, so I can't figure this one out.
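To illustrate how the warnings could arise: if the robots.txt on the CDN subdomain allows only the image paths, any video thumbnail URL outside those paths gets reported as blocked. This sketch uses Python's stdlib robot parser with hypothetical paths, not our actual robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for a CDN subdomain that allows image
# crawling but disallows everything else. (These paths are made
# up for illustration; rule order matters to Python's parser.)
robots_lines = [
    "User-agent: *",
    "Allow: /images/",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(robots_lines)

# An image URL under the Allow rule is crawlable...
print(parser.can_fetch("*", "http://images.mnn.com/images/photo.jpg"))    # True
# ...but a video thumbnail outside it is blocked, which is what
# Webmaster Tools reports as "urls which are blocked by robots.txt".
print(parser.can_fetch("*", "http://images.mnn.com/video-thumbs/a.jpg"))  # False
```

Running the sitemap URLs through a check like this would show which entries trip the Disallow rule.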
Any thoughts?
Lisa
-
I've recently covered this in a blog post about SEO & CDN myth busting. The short answer is "No", as Google is well aware of CDN structures.
I think the image traffic drop is unrelated; it may have something to do with bot-blocking features that prevent access from the Google Images bot (which, as we recently discovered, can use non-US IPs and thus may also be treated as "fake"). Having said that, Google's official statement says that speed factors only affect 1% of SERP results. If your image traffic is high and/or valuable, I would re-think this.
-
The typical way of doing this is to set up a subdomain specifically for the content that will be served from the CDN, and then use a CNAME to map the ugly CDN URL to the clean subdomain you set up. I've only done this with Rackspace and AWS. If you want a good description of how this is done on AWS, see this link:
http://www.labnol.org/internet/setup-content-delivery-network-with-amazon-s3-cloudfront/5446/
Apparently, Cloudflare has a way to map your website URL to the CDN, as long as you're hosting with a cloud server system they support. I haven't tried this before, but it might actually be a cleaner way of doing it, if your hosting setup supports it.
http://blog.cloudflare.com/zone-apex-naked-domain-root-domain-cname-supp
It sounds like this would be a big win for your client, since they get a large percentage of their traffic from image search. Using a CDN service will improve image load times and reduce their bandwidth bill, and mapping the CNAME should give their domain the credit for being the source of the images.
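For illustration, the CNAME mapping described above is a single DNS record. The hostnames here are hypothetical placeholders, not your actual records; your CDN provider gives you the target hostname:

```
; Hypothetical zone-file entry: the branded subdomain resolves to
; the CDN hostname, so image URLs stay on your own domain.
images.example.com.   3600   IN   CNAME   d1234.cdnprovider.net.
```

Once the record propagates, requests for images.example.com are answered by the CDN, while the URLs visible to Google stay on your domain.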