Are 301s advisable for low-traffic URLs?
-
We are using some branded terms in URLs that we have recently been told we must stop using. The pages in question get little traffic, so we aren't concerned about losing traffic from broken URLs - but should we still do 301 redirects for those pages after they are renamed?
In other words, besides any loss of traffic from direct clicks on those broken URLs, are there other serious considerations?
This comes up because we don't have anyone in-house who can do the redirects, so we would need to pay our outside web development company. Is it worth it?
-
If those pages are indexed by Google and Google returns them in SERPs, then yes, they will 404. That is why you need to test each page first and set up a 301 header redirect to either the category page or the home page.
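For reference, a single renamed URL takes one line in an Apache .htaccess file. This is only a minimal sketch - the old path, new path, and domain below are hypothetical placeholders:

```apache
# Hypothetical example: permanently (301) redirect one renamed page.
Redirect 301 /branded-term/widget https://www.example.com/new-term/widget
```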
Hope that answered your question : )
-
Great feedback! I still have one remaining question, though, which I've posted below Richard's comments. Thanks!
-
The trademark issue is with the names of the subfolders, not the domain name.
-
So can you just change the links to point to the new URLs? It's still best to redirect them, though.
I'm curious why you have to change them now, though - I had assumed you were using a competitor's trademark in a domain name.
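If it's really just the subfolder names that change (as clarified above), a single pattern rule can redirect every page under the old folder at once. A sketch, with hypothetical folder names standing in for the branded and renamed terms:

```apache
# Hypothetical example: send everything under the old branded folder
# to the same path under the renamed folder, preserving the rest of the URL.
RedirectMatch 301 ^/branded-term/(.*)$ https://www.example.com/new-term/$1
```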
-
Thanks for that tool! I was not familiar with it.
-
This almost fully answers my question. Those pages don't have inbound links from other sites, and with over 10,000 pages on the site we can't expect links to them all. So they aren't worth keeping for traffic or links.
But you say, "I would hope that you capture your 404 errors and 301 redirect all the time anyway." So, my last remaining question is: Am I necessarily creating 404 errors by not redirecting?
Thanks, everyone!
-
Yes, these are just pages on our main site. They will be renamed, and we will be keeping the content on the site.
-
If I'm reading this right, though, it is only the URLs they have to stop using, not the content. Therefore a 404 page that provides alternate content suggestions isn't necessary in this case; I agree that a 301 redirect is the best solution - it passes the human traffic and the link juice to the correct location.
As to whether it is worth the cost, the answer is of course the famous "it depends". However, I'd imagine the cost of the redirects should be pretty minimal, and if the old URLs drive even a couple of conversions (whatever that may mean for you), it will have been worthwhile - even ignoring the link juice.
-
As Ryan said: if those pages have inbound links, test those links for strength, and if they are worth keeping, then 301 them.
Either way, I would hope that you capture your 404 errors and 301 redirect them as a matter of course anyway.
-
Sites put up and take down pages all the time. Broken links are of no consequence to the overall site quality.
This is a different discussion altogether, but broken URL situations actually present an opportunity: a custom 404 page that offers users alternate content.
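Serving such a page is a one-line server setting. A minimal Apache sketch - /not-found.html is a hypothetical path to a page you would build with links to search, the sitemap, or popular categories:

```apache
# Hypothetical custom 404 page that suggests alternate content to users.
ErrorDocument 404 /not-found.html
```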
-
Are you linking out to these sites you have to get rid of?
In fact, are they even sites, or just other pages on your main site? I may have misunderstood.
EDIT - I'll go ahead and assume I've got the wrong end of the stick and it's pages on your own site that you need to get rid of.
In that case, if you can't redirect them, can you change the links to point to different pages, or even just remove them?
-
Thanks for this reply, and for the others!
OK, so the fact that your site has broken URLs doesn't bring the site down in the search engine rankings generally? Broken URLs aren't necessarily an indicator of a poor-quality site that would result in some sort of penalty?
-
Redirecting them won't help the main domain rank for these brand terms, but it will capture the type-in traffic and pass most of the link juice coming into these other sites.
Ultimately it shouldn't take your web development company long (unless you have hundreds), and you could maybe even do it at the registrar easily (if not efficiently), so don't pay through the nose for it.
On the other hand, unless you rely on links from those other sites, letting them die won't harm your main site in any way.
-
There are two things I would look closely at in such a situation...
Traffic: First, you want to know if these pages are generating any traffic. If they are, you should keep them. If they aren't (which it sounds like they aren't), move on to checking links...
Links: Before you scrap pages generating little inbound traffic, you should check whether said pages have any inbound links. If they do, evaluate the quality of those links and determine whether their value is greater or less than the cost of keeping the pages and setting up redirects. If you determine these pages have valuable links, definitely 301 redirect them to a good substitute page.
When I speak of the cost associated with setting up the redirects, I'm talking about the time taken to set them up (likely your time or your IT team's time).
We use Open Site Explorer to help us audit inbound links to pages.
-
The link doesn't need to be broken. 301 redirect the existing URL to the new one, and anyone linking to, typing in, or clicking the old URL will be forwarded to the new one without ever knowing it. Make sense? Yes, do it!