I went CDN and now Roger is ticked off?
-
I have optimized my blogs (for speed) and it was a huge success.
One site that was loading around the 7-8 second mark now actually has more content, including a walk-on video, and my best load time is 0.42 seconds, with both sides giving me scores in the low-to-mid 90s.
Roger has been pretty quiet lately, but now, because the CDN requires 302 redirects to function with dynamic content, I have hundreds and hundreds of 302 warnings.
I did not realize that site speed was weighted so heavily, but three of my blogs have gone from PR1 to PR3 overnight without any other changes at all.
I know Google will penalize a site with too many 302s, but I am pretty sure that until you can afford a memcache system, this is standard practice. I just want to make sure that I do not have another HARD lesson ahead.
Thanks
-
Hi David,
It's odd that Roger would pick up on so many 302s. When I ran a crawl of your site with Screaming Frog, the only 302s were for static assets, which I don't think Roger would be crawling anyway. What sort of URLs are you getting the 302s for? If you think these are errors, feel free to contact the help team ([email protected]) and see if they can help you sort this out.
You could most likely rewrite whatever WordPress plugin is issuing the redirects to return a 301 instead of a 302 (or have a WordPress PHP developer do it for you; it should be a pretty simple job), but overall, it probably won't make a huge difference in your SEO.
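Before touching the plugin, it may help to confirm what status those URLs actually return. This is a minimal stdlib Python sketch, not a Moz tool: the NoRedirect handler stops urllib from following the hop, so the raw 301/302 code is visible.

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so the 3xx status itself is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # urllib then raises an HTTPError carrying the 3xx code

def redirect_status(url):
    """Return the raw HTTP status a URL responds with (200, 301, 302, ...)."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        with opener.open(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code
```

Running redirect_status over a sample of the URLs Roger flags would show whether the plugin issues 302s across the board or only on certain paths.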
But in general, you want to minimize redirects when speeding up your site. For example, SEOmoz uses a CDN, but each static resource links directly to the CDN server, without any redirect.
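To illustrate direct linking: a caching plugin rewrites asset references in the page HTML to point at the CDN hostname, so the browser fetches each file with a single 200 response and no redirect hop. Here is a rough sketch of that rewrite; cdn.example.com is a hypothetical CNAME and the prefixes are WordPress defaults, not a real plugin's code.

```python
import re

# Hypothetical CDN hostname; in practice this is your CNAME,
# e.g. a subdomain pointing at CloudFront or an S3 bucket.
CDN_HOST = "https://cdn.example.com"

# Only static assets belong on the CDN; dynamic pages stay on the origin.
STATIC_PREFIXES = ("/wp-content/", "/wp-includes/")

ASSET_RE = re.compile(
    r'(src|href)="((?:' + "|".join(re.escape(p) for p in STATIC_PREFIXES) + r')[^"]*)"'
)

def rewrite_asset_urls(html):
    """Point static asset references directly at the CDN host."""
    return ASSET_RE.sub(lambda m: '%s="%s%s"' % (m.group(1), CDN_HOST, m.group(2)), html)

page = '<img src="/wp-content/uploads/logo.png"> <a href="/about/">About</a>'
print(rewrite_asset_urls(page))
# The image reference now points straight at the CDN; the page link is untouched.
```

Because the browser gets the CDN URL directly, every asset request is a plain 200 from the nearest edge cache, with no 302 for a crawler to flag.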
For my own WordPress sites, I use WP Super Cache and CloudFlare. Both are free and have sped up my sites amazingly well. In particular, I'm a huge fan of CloudFlare, given the extra security measures. Well worth checking out.
-
Thank You,
The site that is springing the most 302s is http://86theprint.com, which I had running on a full CDN (W3 Total Cache), but that broke the site, so I have tried a plugin that uses S3 buckets...
So as not to waste any of your time, although it is awesome that you are willing to look, all I really need to know is whether the 302s must go. If that is true, then it is what it is, and I will take it back to localhost files and start over. Thank you very much.
Yeager
I also have several 302s from QuickResponseQR.com. Everything on this site is on the CDN, all CNAMEd through the subdomains cdn.quickresponseqr.com and cdn2.quickresponseqr.com. The CDN is CloudFront.
-
Hey David
Can you post a link to your page and highlight some CDN content?
Lots of sites use a CDN for assets and don't have these kinds of issues, and they certainly don't use a 302 system to generate the URL.
If you look at Unbounce and their noob's guide, the main site runs on www.unbounce.com, but the assets are on what I assume to be their CDN, which runs on assets.unbounce.com.
If you were to look at one of the common cloud vendors, say rackspace.co.uk, you would just get a web-ready URL that returns your content via a standard HTTP 200 OK response from the most local cache of the content.
I worked on a site recently that used rackspace and it seemed like a pretty solid solution and certainly there were no 302 redirects.
Post a link, though, and I'm happy to take a quick look and see if I can't feed back a little more; otherwise, maybe consider using a different CDN!
Cheers
Marcus
Related Questions
-
Can we validate a CDN like Max in Webmasters?
Hi, can we validate a CDN like MaxCDN in Webmaster Tools? We have images hosted on the CDN and they don't get indexed in Google Images. It's been a year now and no luck. MaxCDN says they have no issues at their end, and the images have ALT text and are original images with no copyright issues.
Technical SEO | | ArchieChilds0 -
Just saw a competitor jump in rank by double digits, questioning my url structure choice now.
Currently, for our big keyword, I have oursite.com/big-keyword/, and clicking on a material type gives oursite.com/big-keyword/material-type/. Our competition has theirsite.com/big-keyword/, and when you click on their material type, theirsite.com/material-type-big-keyword/. They also have 20-some pages, while we have around 652 as an eCommerce site, so I'm not sure why they jumped so high in the rankings while their backlink profile is still so small and they have a DA half of ours. I'm in the middle of a site redesign and very close to restructuring the URLs the way they have it, since it really seems to have worked well. How do you feel about that?
Technical SEO | | Deacyde0 -
Google: Mobile friendliness is now a ranking factor on mobile
John Mueller from Google has just confirmed that mobile friendliness is now a ranking factor for people searching on smartphones. The date it goes live is 24th April 2015. There is a great presentation on the hangout, which will be available to watch later; once it's available I will update with a link. I would make sure you're checking Webmaster Tools for any alerts/messages if you care about smartphone users. What do people recommend regarding responsive vs. a mobile site vs. dynamic serving? Is your site ready? What are you planning on doing since the update has been announced?
Technical SEO | | Andy-Halliday1 -
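On the responsive vs. dynamic serving question above: the detail that often trips people up with dynamic serving is the `Vary: User-Agent` header, which tells caches and Googlebot that the same URL returns different HTML by device. This is a toy sketch of the idea only; the UA regex and templates are illustrative, not a production device detector.

```python
import re

# Crude illustrative mobile check; real sites use a maintained UA library.
MOBILE_UA = re.compile(r"Mobi|Android|iPhone", re.IGNORECASE)

def render(user_agent):
    """Dynamic serving: one URL, device-specific HTML, plus the Vary
    header so intermediaries don't cache the wrong variant."""
    headers = {"Content-Type": "text/html", "Vary": "User-Agent"}
    if MOBILE_UA.search(user_agent or ""):
        return headers, "<html><!-- mobile template --></html>"
    return headers, "<html><!-- desktop template --></html>"

headers, body = render("Mozilla/5.0 (iPhone; CPU iPhone OS 8_0) Mobile Safari")
print(headers["Vary"], body)
```

Responsive design sidesteps this entirely (one URL, one HTML payload), which is part of why it is the commonly recommended default.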
Our stage site got crawled and we got an unnatural inbound links warning. What now?
live site: www.mybarnwoodframes.com
stage site: www.methodseo.net
We recently finished a redesign of our site to improve our navigation. Our developer insisted on hosting the stage site on her own server with a separate domain while she worked on it. However, somebody left the site turned on one day and Google crawled the entire thing. Now we have 4,320 pages of 100% identical duplicate content with this other site. We were upset but didn't think it would have any serious repercussions, until we got two orders from customers through the stage site one day. It turns out the second site was ranking pretty decently for a duplicate site with 0 links; the worst was yet to come, however. During the 3 months of the redesign, our rankings on our live site dropped and we suffered a 60% drop in organic search traffic. On May 22, 2013, the day of the Penguin 2.0 release, we received an unnatural inbound links warning. Google Webmaster Tools shows 4,320 of our 8,000 links coming from the stage site domain to our live site; we figure that was the cause of the warning. We finished the redesign around May 14th and took down the stage site, but it is still showing up in the search results, and the 4,320 links are still showing in Webmaster Tools.
1. Are we correct to assume that it was the stage site that caused the unnatural links warning?
2. Do you think that it was the stage site that caused the drop in traffic? After doing a link audit, I can't find any large amount of horrendously bad links coming to the site.
3. Now that the stage site has been taken down, how do we get it out of Google's indexes? Will it be taken out over time, or do we need to do something on our end for it to be delisted?
4. Once it's delisted, the links coming from it should go away; in the meantime, however, should we disavow all of the links from the stage site? Do we need to file a reconsideration request, or should we just be patient and let them go away naturally?
5. Do you think that our rankings will ever recover?
Technical SEO | | gallreddy0 -
Page has a 301 redirect; now we want to move it back to its original place
Hi - This is the first time I've asked a question! My site, www.turnkeylandlords.co.uk is going through a bit of a redesign (for the 2nd time since it launched in July 2012...) First redesign meant we needed to move a page (https://www.turnkeylandlords.co.uk/about-turnkey-mortgages/conveyancing/) from the root to the 'about-us' section. We implemented a 301 redirect and everything went fine. I found out yesterday that the plan is to move this page (and another one as well, but it's the same issue so no point in sharing the URL) back to the root. What do I do? A new 301? Wouldn't this create a loop? Or just delete the original 301? Thanks in advance, Amelia
Technical SEO | | CommT0 -
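On the loop worry in the question above: a loop only happens if both redirects are live at once (old-to-new and new-to-old simultaneously), so removing the original 301 and adding one pointing the other way is safe. The reasoning can be checked mechanically by walking a redirect map; the paths below are hypothetical.

```python
def follow(redirects, start, max_hops=10):
    """Walk a map of {from_path: to_path} redirects; return the final
    path and hop count, or raise if the chain loops or runs too long."""
    seen = set()
    url = start
    hops = 0
    while url in redirects:
        if url in seen:
            raise RuntimeError("redirect loop at " + url)
        seen.add(url)
        url = redirects[url]
        hops += 1
        if hops > max_hops:
            raise RuntimeError("too many redirects")
    return url, hops

# Old 301 removed, new one pointing back the other way: no loop.
safe = {"/about-us/conveyancing/": "/conveyancing/"}
print(follow(safe, "/about-us/conveyancing/"))  # ('/conveyancing/', 1)

# Both redirects left live at once: this is the loop case to avoid.
looped = {"/conveyancing/": "/about-us/conveyancing/",
          "/about-us/conveyancing/": "/conveyancing/"}
```

The same walk is what a browser or crawler effectively does, which is why leaving both rules in place produces a "too many redirects" error.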
We just fixed a Meta refresh, unified our link profile and now our rankings are going crazy
Crazy in a bad way! I am hoping that perhaps some of you have experienced this scenario before and can shed some light on what might be happening.
Here is what happened: we recently fixed a meta refresh that was on our site's homepage. It was completely fragmenting our link profile. All of our external links were being counted towards one URL, and our internal links were counting for the other URL. In addition to that, our most authoritative URL, because it was subject to a meta refresh, was not passing any of its authority to our other pages.
Here is what happened to our link profile:
Total External Links: Before - 2,757 / After - 4,311
Total Internal Links: Before - 125 / After - 3,221
Total Links: Before - 2,882 / After - 7,532
Yeah... huge change. Great, right? Well, I have been tracking a set of keywords that were ranking from spots 10-30 in Google. There are about 66 keywords in the set. I started tracking them because at MozCon last July, Fabio Ricotta suggested that targeting keywords showing up on page 2 or 3 of the results might be easier to improve than terms at the bottom of page 1.
So, take a look at this. The first column shows where a particular keyword ranked on 11/8, the second column shows where it is ranking today, and the third column shows the change. For obvious reasons I haven't included the keywords.
11/8 11/14 Change
10 44 -34
10 26 -16
10 28 -18
10 34 -24
10 25 -15
15 29 -14
16 33 -17
16 32 -16
17 24 -7
17 53 -36
17 41 -24
18 27 -9
19 42 -23
19 35 -16
19 - Not in top 200
19 30 -11
19 25 -6
19 43 -24
20 33 -13
20 41 -21
20 34 -14
21 46 -25
21 - Not in top 200
21 33 -12
21 40 -19
21 61 -40
22 46 -24
22 35 -13
22 46 -24
23 51 -28
23 49 -26
24 43 -19
24 47 -23
24 45 -21
24 39 -15
25 45 -20
25 50 -25
26 39 -13
26 118 -92
26 30 -4
26 139 -113
26 57 -31
27 48 -21
27 47 -20
27 47 -20
27 45 -18
27 48 -21
27 59 -32
27 55 -28
27 40 -13
27 48 -21
27 51 -24
27 43 -16
28 66 -38
28 49 -21
28 51 -23
28 58 -30
29 58 -29
29 43 -14
29 41 -12
29 49 -20
29 60 -31
30 42 -12
31 - Not in top 200
31 59 -28
31 68 -37
31 53 -22
Needless to say, this is exactly the opposite of what I expected to see after fixing the meta refresh problem. I wouldn't think anything of normal fluctuation, but every single one of these keywords moved down, almost consistently 20-25 spots. The further down a keyword was to begin with, it seems the further it dropped.
What do you make of this? Could Google be penalizing us because our link profile changed so dramatically in a short period of time? I should say that we have never taken part in spammy link-building schemes, nor have we ever been contacted by Google with any kind of suspicious link warnings. We've been online since 1996 and are an e-commerce site doing #RCS. Thanks all!
Technical SEO | | danatanseo0 -
Ranked on Page 1, now between page 40-50... Please help!
My site (http://goo.gl/h0igI) was ranking on page one for many of our biggest keywords. All of a sudden, we completely fell off; I believe I'm down somewhere between pages 40 and 50. I have no warnings or error messages in Webmaster Tools. Can anyone please help me identify what the problem is? This is completely unexpected and I don't know how to fix it... Thanks in advance
Technical SEO | | Prime850 -
How to disallow Google and Roger?
Hey guys and girls, I have a question. I want to disallow all robots from accessing a certain root link:
# Get rid of bots
User-agent: *
Disallow: /index.php?_a=login&redir=/index.php?_a=tellafriend%26productId=*
Will this make the bots not access any URL that has the prefix you see before the asterisk? And will at least Google and Roger obey it, by reading "User-agent: *"? I know this isn't the standard procedure, but if it works for Google and the SEOmoz bot, we are good.
Technical SEO | | iFix0
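One note on the wildcard in the question above: the original robots.txt convention is prefix matching only, and the mid-pattern `*` is an extension supported by Google (and most major crawlers, though worth verifying for any given bot). A plain prefix rule already covers this case, and it can be sanity-checked locally with Python's stdlib parser, which implements the prefix behavior; the host and URLs below are hypothetical.

```python
import urllib.robotparser

# Prefix rule only; Python's parser does not expand mid-pattern "*",
# so the wildcard version is dropped in this sketch.
rules = [
    "User-agent: *",
    "Disallow: /index.php?_a=login",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

blocked = "http://example.com/index.php?_a=login&redir=/index.php?_a=tellafriend"
allowed = "http://example.com/some-product"
print(rp.can_fetch("rogerbot", blocked))  # False
print(rp.can_fetch("rogerbot", allowed))  # True
```

Because matching is by prefix, this rule blocks every redir variant of the login URL without needing the asterisk at all.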