I went CDN and now Roger is ticked off?
-
I have optimized my blogs for speed, and it was a huge success.
One site that was loading around the 7-8 second mark now actually has more content, including a walk-on video, and my best load time is 0.42 seconds, with both speed tests giving me scores in the low to mid 90s.
Roger has been pretty quiet lately, but now that the CDN requires 302 redirects to function with dynamic content, I have hundreds and hundreds of 302 warnings.
I did not realize that site speed was weighted so heavily, but 3 of my blogs have gone from PR1 to PR3 overnight without any other changes at all.
I know Google will penalize a site with too many 302s, but I am pretty sure that until you can afford a memcache system this is standard practice. I just want to make sure that I do not have another HARD lesson ahead.
Thanks
-
Hi David,
It's odd that Roger would pick up on so many 302s. When I ran a crawl of your site with Screaming Frog, the only 302s were for static assets, which I don't think Roger would be crawling anyway. What sort of URLs are you getting the 302s for? If you think these are errors, feel free to contact the help team ([email protected]) and see if they can help you sort this out.
You could most likely modify whichever WordPress plugin is issuing the redirects to return a 301 instead of a 302 (or have a WordPress PHP developer do it for you; it should be a pretty simple job), but overall it probably won't make a huge difference in your SEO.
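For what it's worth, the only difference between the two redirect types is the status code the server sends back (in WordPress, `wp_redirect()` defaults to a 302 unless you pass 301 as the second argument). Here's a minimal, framework-agnostic sketch of that difference using only Python's standard library; the paths are made up for illustration:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import http.client
import threading

class RedirectHandler(BaseHTTPRequestHandler):
    """Answers every GET with a redirect to /new-location."""

    def do_GET(self):
        # 301 = permanent: search engines pass ranking signals to the target.
        # 302 = temporary: search engines keep the original URL indexed.
        status = 301 if self.path == "/moved-for-good" else 302
        self.send_response(status)
        self.send_header("Location", "/new-location")
        self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging for the demo

def fetch_status(port, path):
    """Return the raw status code; http.client never follows redirects."""
    conn = http.client.HTTPConnection("127.0.0.1", port)
    conn.request("GET", path)
    status = conn.getresponse().status
    conn.close()
    return status

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

permanent = fetch_status(port, "/moved-for-good")
temporary = fetch_status(port, "/anything-else")
print(permanent, temporary)  # 301 302
server.shutdown()
```

The body of both responses is the same; only that status code tells crawlers whether to transfer the old URL's equity (301) or keep waiting for it to come back (302).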
But in general, you want to minimize your redirects when speeding up your site. For example, SEOmoz uses a CDN, but each static resource is linked directly to the CDN server, without the redirect.
For my own WordPress sites, I use WP Super Cache and CloudFlare. Both are free and have sped up my sites amazingly well. In particular, I'm a huge fan of CloudFlare, given the extra security measures. Well worth checking out.
-
Thank You,
The site that is springing the most 302s is http://86theprint.com, which I had running on a full CDN (W3 Total Cache), but it broke the site, so I have tried a plugin that uses S3 buckets...
So as not to waste any of your time (although it is awesome that you are willing to look), all I really need to know is whether the 302s must go. If that is true, then it is what it is, and I will get it back to localhost files and start over. Thank you very much.
Yeager
I also have several 302s from QuickResponseQR.com. Everything on this site is on the CDN, all CNAMEd through the subdomains cdn.quickresponseqr.com & cdn2.quickresponseqr.com. The CDN is CloudFront.
-
Hey David
Can you post a link to your page and highlight some CDN content?
Lots of sites use a CDN for assets and don't have these kinds of issues, and they certainly don't use a 302 system to generate the URL.
If you look at Unbounce and their noob's guide, the main site runs on www.unbounce.com, but the assets are on what I assume to be their CDN, which runs on assets.unbounce.com.
If you were to look at one of the common cloud vendors, say Rackspace (rackspace.co.uk), then you would just get a web-ready URL that returns your content via a standard HTTP 200 OK response from the cache closest to the visitor.
I worked on a site recently that used Rackspace, and it seemed like a pretty solid solution; there were certainly no 302 redirects.
Post a link, though, and I'm happy to take a quick look and see if I can't feed back a little more; otherwise, maybe consider using a different CDN!
Cheers
Marcus