301s - A Year Late
-
A website I was recently asked to help with was redesigned last year, but no 301s were set up. Looking at the old URLs, 95% of the ones from early 2013 are 404s. Their traffic dropped from 50,000 visits per month to 10,000, and I believe this is one of the reasons.
Now the question is: a year later, will it do any good to set up 301 redirects from those old URLs? My current thought is that the old URLs have probably lost any link juice they had. But it shouldn't hurt anything to set up the 301s anyway.
Any thoughts on whether this is worth my time and effort?
-
Absolutely get those 301s into place as soon as possible, Beth! Not only will you likely see some increased traffic from links that are out there to the old pages, but you'll also likely see a nice rankings boost. Right now, any links to the old pages are essentially "lost" to your site for ranking-influence purposes. Getting the redirects in place will allow that ranking influence to again be credited to the client's new pages.
When you do start adding the redirects, make sure to add an Annotation to the related Google Analytics profile. Depending on the number and quality of the redirected pages, and on whether the site's 404 page currently has Analytics tracking, you're going to see a bit of a shift in engagement metrics. If there's no tracking on the 404 page, you'll see an increase in visits as visitors land on "real" pages instead of the 404. If there was 404 tracking before, you'll see a decrease in Bounce Rate and an increase in pages/visit as far more visitors stick around the real pages instead of just bouncing from the 404 page. You'll want to be able to refer back to the date the redirecting started so you can always put stats changes into context around this process (e.g. a year from now, when the client is trying to figure out why there was a site improvement around this time).
[Hint - make sure you've got solid 404 page tracking in Analytics and keep checking it as you go along. It's an essential addition to just watching for what's showing up in Webmaster Tools, for example.]
Some more suggestions for the process:
- Use Analytics to track improvements in the metrics you expect to benefit from this process. This is how you'll demonstrate the benefit of the work, and get credit (and therefore reputation) for your efforts. You can even set up Goals around the expected improvements to make them easier to track.
- Use Screaming Frog, Xenu Link Sleuth or equivalent tool to run a check of all internal pages to ensure none of your own pages include broken internal links. Screaming Frog (paid version) can also be used to bulk-test your redirects immediately after implementation.
- Watch for any high-value incoming links to old pages that you think you might be able to get corrected at source (i.e. an external site you have any sort of relationship with). Since each redirect wastes a bit of "link juice" you're even better off getting the original link corrected to point to the right page, instead of having to go through the redirect. Only worth it for strong links.
- Watch for opportunities to use regex to combine several redirects into one rule. Fewer rules are better for site speed.
- If you don't have a copy of the original site to extract the URLs from, you can use the Wayback Machine to see a version of the site from before the migration.
- To create a list of the old URLs that are still indexed, use the site:mydomain.com search operator to find the majority of still-indexed URLs. You can then use the SERPSRedux bookmarklet to scrape all the results into a CSV and use Excel filtering to find all the old URLs. (Tip: set your Google Search results to show 100 results per page to make the scraping faster.)
- Set up an ongoing and regular process for checking for and dealing with such 404s. Any site should have this in place, but especially one that has been redeveloped.
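If the list of old-to-new URLs gets long, it's worth generating the redirect rules from your spreadsheet export rather than typing them out one by one. A minimal sketch in Python (the CSV layout and the plain Apache `Redirect 301` format are my assumptions here; adapt to your own server):

```python
import csv
import io

def build_redirect_rules(csv_text):
    """Turn "old-path,new-path" CSV rows into Apache Redirect 301 lines.
    Assumes site-relative paths; emits one rule per row."""
    rules = []
    for row in csv.reader(io.StringIO(csv_text)):
        if len(row) >= 2:
            old, new = row[0].strip(), row[1].strip()
            rules.append(f"Redirect 301 {old} {new}")
    return "\n".join(rules)

# Hypothetical export from the old-URL audit
mapping = "/old-about.html,/about\n/old-contact.html,/contact\n"
print(build_redirect_rules(mapping))
# Redirect 301 /old-about.html /about
# Redirect 301 /old-contact.html /contact
```

Drop the output into .htaccess (or the vhost config), then bulk-test the old URLs, e.g. with Screaming Frog's list mode as mentioned above.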
Lastly, since you know you've got a lot of 404s coming in, make certain you have a really top-notch 404 error page that is designed to capture as many visitors as possible and help move them to real content without losing them. Again, this is important for any site, but well worth extra attention for any site that knows it has a 404 problem. (This is far better than "soft 404ing" to a home page, for example, for a number of technical and usability reasons.)
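On Apache, for instance, a dedicated error page can be wired up in one line while still returning a true 404 status (the file path here is a placeholder):

```apache
# Serve a custom 404 page with a real 404 status code, so search
# engines still drop the missing URL from the index -- unlike
# redirecting every miss to the home page, which creates "soft 404s".
ErrorDocument 404 /errors/not-found.html
```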
So, bottom line on "whether this is worth my time and effort?" You'd better believe it is. It's probably one of the best things you could do for the site at this point. I have direct experience doing this for several sites, and the improvements are significant and quite gratifying for both you and the site owner.
Hope those are useful ideas!
Paul
-
Hiya! They may have lost link juice, but then again there may be a blog giving you praise with a link that's still active. It's never too late to set up a 301; just remember it's best to 301 to the most relevant category or closest page. You can also set up a custom 404 page so that even if you miss one, the user can still navigate to a page like the home page.
Moz has some great tips if you want a read or to refresh your mind.
-
Yes, it's better late than never. You might not get any rank back, but I consider 301s to be good policy beyond the SEO aspect. I hate clicking a link on a page and getting a 404, or being bounced to the front page. Perhaps I have a bookmark; perhaps it's an old link. Whatever the case, do your visitors a courtesy and redirect them to the correct page.
Related Questions
-
Google Search Console 'Change of Address' Just 301s on source domain?
Hi all. New here, so please be gentle. 🙂 I've developed a new site, where my client also wanted to rebrand from .co.nz to .nz. On the source (.co.nz) domain, I've set up a load of 301 redirects to the relevant new page on the new domain (the URL structure is changing as well).
Technical SEO | WebGuyNZ
E.g. on the old domain: https://www.mysite.co.nz/myonlinestore/t-shirt.html
In the .htaccess on the old/source domain, I've set up 301s (using RewriteRule).
So that when https://www.mysite.co.nz/myonlinestore/t-shirt.html is accessed, it does a 301 to:
https://mysite.nz/shop/clothes/t-shirt
All these 301s are working fine. I've checked in dev tools and a 301 is being returned. My question is: is having the 301s just on the source domain enough to start a 'Change of Address' in Google's Search Console? Their wording indicates it's enough, but I'm concerned maybe I also need redirects on the target domain as well. I.e., does the Search Console Change of Address process work this way?
It looks at the source domain URL (that's already in Google's index), sees the 301, then updates the index (and hopefully passes the link juice) to the new URL. Also, I've set up both source and target Search Console properties as Domain properties. Does that mean I no longer need to specify whether the source and target properties are HTTP or HTTPS? I couldn't see that option when I created the properties. Thanks!
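For reference, the kind of rule described above can be sketched like this in the old domain's .htaccess (hostname and paths are from the question; treat the exact pattern as illustrative, not a drop-in, and note that as far as I know the Change of Address check only looks at redirects on the source side):

```apache
# .htaccess on the old co.nz domain -- redirect the old path to its
# counterpart on the new domain with a permanent (301) redirect.
RewriteEngine On
RewriteRule ^myonlinestore/t-shirt\.html$ https://mysite.nz/shop/clothes/t-shirt [R=301,L]
```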
Old site selected as canonical on GSC 3 years after migration?
Recently my company started consulting for a SaaS company. They're clearly the best-known, most trusted company in their area of work, and they have the strongest brand, best product and therefore more users than any of their competitors by a big margin. Still, 99% of their traffic comes from branded searches, despite having 3x more domains, better performance scores and more content. Even using tools such as SimilarWeb to compare user satisfaction metrics, they seem to have lower bounce rates and more visits per session. Still, they rank for almost nothing that is non-branded on Google (they rank extremely well for almost everything on Bing and DuckDuckGo). They don't have any obvious issues with crawling or indexation - we've gone to great depths to tick off any issues that could be affecting this. My conclusion is that it's either a penalty or a bug, but GSC is not flagging any manual actions. These are the things we've identified:
- All the content was moved from domain1.com to domain2.com at the end of 2017. 301s were put in place and the migration was confirmed on GSC. Everything was done with great care and we couldn't identify any issues with it.
- Some subdomains of the site, especially support, rank extremely well for all sorts of keywords, even very competitive ones, but the www subdomain ranks for almost nothing on Google.
- The www subdomain has 1,000s of domains pointing to it while support has only a few 100s.
- Google is performing delayed rendering attempts on old pages, JS and CSS - particularly versions of assets that were live before the migration in 2017, including the old homepage. Again, the redirects have been in place for 3 years.
- Search Console frequently shows old HTML (at least a year old) in cache despite a recent crawl date and a current 301.
- Search Console frequently processes old HTML (at least a year old) when reporting on schema.
- Search Console is sometimes selecting pages from the old domain as the canonical of a URL of an existing page on the current domain, despite a long-standing 301 and the canonicals being well configured for 3 years now.
Has anyone experienced anything similar in the past? We've been doing an analysis of old SEO practices, link profile, disavow... nothing points to black hat practices, and at this point we're wondering if it's just Google doing a terrible job with this particular domain.
Technical SEO | oline123
-
Mass 301s
Hi All, I'm trying to find a way to do a mass list of 301s instead of doing them individually. Does anyone have any ideas or tips on how I can do this?
Technical SEO | Kennelstore
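One common approach (an assumption about your setup - it requires Apache with access to the server/vhost config, since RewriteMap isn't allowed in .htaccess) is to keep all the old-to-new pairs in a plain-text file and let a single rule handle them:

```apache
# Server/vhost config only -- RewriteMap cannot be used in .htaccess.
# redirects.txt holds one "old-path new-url" pair per line, e.g.:
#   /old-page.html /new-page
RewriteEngine On
RewriteMap redirects txt:/etc/apache2/redirects.txt
RewriteCond ${redirects:%{REQUEST_URI}|NOT_FOUND} !NOT_FOUND
RewriteRule ^ ${redirects:%{REQUEST_URI}} [R=301,L]
```

Editing one text file then beats maintaining hundreds of individual RewriteRule lines.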
Photography site still not ranking well after over year
I have a photography business in the Orlando, FL market. I have noticed my site, joeyvalyphotography.com, does not rank even in the top 50 for my strongest keyword: Orlando Wedding Photographer. However, the keyword Orlando Engagement Photography/Photographer ranks very well, and has been rising in rank very rapidly over the last two weeks. I'm not sure why my main keyword is not going anywhere, or what my next move should be. I'm considering spending a few thousand dollars on an SEO company; however, I thought I could learn how to improve my keyword ranking on my own, to no avail. Any and all tips, suggestions and advice will be greatly appreciated.
Technical SEO | gaji
-
Javascript late-loaded content not read by Googlebot
Hi, We have a page with some good "keyword" content (a user-supplied comment widget), but a design choice was made previously to late-load it via JavaScript. This was to improve performance, and the overall functionality relies on JavaScript. Unfortunately, since it is loaded via JS, it isn't read by Googlebot, so we get no SEO value. I've read Google doesn't weigh noscript content as much as regular content - is this true? One option is just to load some of the content via noscript tags; I just want to make sure Google still reads this content. Another option is to load some of the content via simple HTML when loading the page. If JavaScript is enabled, we'd hide this "read only" version via CSS and display the more dynamic, user-friendly version. Would changing the display based on JS being enabled be deemed cloaking? Since non-JS users would see the same thing (and this provides a way for them to see some of the functionality in the widget), it is an overall net gain for those users too. In the end, I want Google to read the content, but I'm trying to figure out the best way to do so. Thanks, Nic
Technical SEO | NicB1
-
Pagerank and 301s
Hi all, For various reasons some of our pages were renamed from: http://www.meresverige.dk/rejser/malmoe to: http://www.meresverige.dk/rejser/malmo We have made proper 301 redirects and also updated sitemap.xml accordingly. The change was made around the 5th of September. The content on the pages remains identical. This page, and all pages below it in the structure, now get very low or no PageRank at all - much lower than before the name change. Any ideas as to how long, if ever, it will take for Google to transfer the PageRank from the old page? Any suggestions as to what we can do to make the process faster?
Technical SEO | Resultify
-
Too many 301s?
Hi there, If a website has accidentally generated, say, 1,000 pages of duplicate content, would SEO be hurt if all those pages were redirected to the original source of the content? There are no plans to rewrite the 1,000 duplicate pages; they are already cached and indexed by Google. I thought about canonical tags, but as the pages have some traffic and a little SEO value, I thought a 301 redirect to the relevant pages would be more appropriate. Am I also right in thinking I could remove the 301s in the .htaccess file once the index has updated? And once a 301 is removed, could I use those URLs again later from scratch if I wanted? Any info much appreciated.
Technical SEO | pauledwards
-
Very, very confusing behaviour with 301s. Help needed!
Hi SEOmoz gang! I've been a long-time reader and hanger-outer here, but now I need to pick your brains. I've been working on two websites in the last few days which are showing very strange behaviour with 301 redirects.
Site A
This site is an ecommerce site stocking over 900 products and 000's of motor parts. The old site was turned off in Feb 2011 when we built them a new one. The old site had terrible problems with canonical URLs, where every search could/would generate a unique ID, e.g. domain.com/results.aspx?product=1234. When you have 000's of products and Google can find them, it is a big problem. Or was. We launched the new site and 301'd all of the old results pages over to the new product pages and deleted the old results.aspx. The results.aspx page didn't index or get shown for months. Then, about two months ago, we found certain conditions under which we couldn't get the right 301 working, so we had to put the results.aspx page back in place. If it found the product, it 301'd; if it didn't, it redirected to the sitemap.aspx page. We found recently that some bizarre scenario actually caused the results.aspx page to return a 200 rather than a 301 or 404. Problem. We found this last week after our 404 count in GWMT went up to nearly 90k. This was still odd, as the results.aspx format was that of the OLD site rather than the new one. The old URLs should have been forgotten about after several months but started appearing again! When we saw the 404 count get so high last week, we decided to take severe action and 301 everything which hit the results.aspx page to the home page. No problem, we thought. When we got into the office on Monday, most of our product pages had been dropped from the top-20 placings they had (nearly 400 rankings were lost) and on some phrases the old results.aspx pages started to show up in their place!! Can anyone think why old pages, some of which have been 301'd over to new pages for nearly 6 months, would start to rank?
Even when the page didn't exist for several months? Surely if they are 301s then after a while they should start to get lost in the index?
Site B
This site moved domain a few weeks ago. Traffic has been lost on some phrases, but this was mainly due to old blog articles not being carried forward (what I'll call noisy traffic, which was picked up by accident and had bad on-page stats). No major loss in traffic on this one, but again bizarre errors in GWMT. This time, pages which haven't been in existence for several YEARS are showing up as 404s in GWMT. The only place they are still noted anywhere is in the redirect table on our old site. The new site went live and all of the pages which were in Google's index and in Open Site Explorer were handled in a new 301 table. The old 301s we thought we didn't need to worry about, as they had been going from old page to new page for several years and we assumed the old pages had delisted. We couldn't see them anywhere in any index. So... my question here is why would some old pages which have been 301'ing for years now show up as 404s on my new domain? I've been doing SEO on and off for seven years, so I think I know most things about how Google works, but this is baffling. It seems that two different sites have failed to prevent old pages from cropping up which were 301'd for either months or years. Does anyone have any thoughts as to why this might be the case? Thanks in advance. Andy
Technical SEO | Adido-105399
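A small offline check can flag the failure modes described in these threads - old URLs returning 200 instead of redirecting, or redirecting to the wrong target. This sketch assumes you've already crawled the old URLs with some tool and recorded each one's status code and Location header; all names here are illustrative:

```python
def audit_redirects(expected, observed):
    """Flag old URLs that don't behave as the redirect map says they should.

    expected: {old_url: intended_new_url}
    observed: {old_url: (status_code, location_header_or_None)} from a crawl
    Returns a list of (old_url, problem) tuples; an empty list means all good.
    """
    problems = []
    for old, new in expected.items():
        status, location = observed.get(old, (None, None))
        if status is None:
            problems.append((old, "not crawled"))
        elif status == 200:
            problems.append((old, "returns 200 instead of redirecting"))
        elif status not in (301, 308):
            problems.append((old, f"unexpected status {status}"))
        elif location != new:
            problems.append((old, f"redirects to {location}, expected {new}"))
    return problems

# Hypothetical crawl data reproducing Site A's bug above
expected = {"/results.aspx?product=1234": "/products/1234"}
observed = {"/results.aspx?product=1234": (200, None)}
print(audit_redirects(expected, observed))
# [('/results.aspx?product=1234', 'returns 200 instead of redirecting')]
```

Run something like this on a schedule and the "old URLs quietly returning 200" problem gets caught in days rather than after a 90k spike in GWMT.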