Thousands of 301 redirections - .htaccess alternatives?
-
Hi guys,
I just want to ask whether there are other possible issues or problems (besides server load) if we implement 301 redirects for 10,000+ URLs using .htaccess. Are there any alternatives?
-
Thank you for your answer! I will share it with our IT team.
-
Why don't you just set up a VPS running NGINX as a reverse proxy in front of your IIS web server?
- https://www.digitalocean.com/community/tutorials/how-to-set-up-nginx-load-balancing
- http://www.iborgelt.com/windows-home-server-behind-nginx-reverse-proxy/
You're just using the VPS as an interface to handle your redirects, and at $5 a month you can't beat it. I'm sure that if your IT department googles "nginx reverse proxy iis" they will get the idea.
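For illustration, here is a minimal sketch of that setup, assuming a plain map file of old-to-new paths; the domain, backend address, and file paths below are placeholders, not values from this thread:

```nginx
# nginx config (http context) -- minimal sketch; domain, backend
# address, and file paths are placeholders.

# Map the request path to its new location; the included file holds
# lines like:  /old-page/  /new-page/;
map $request_uri $new_uri {
    default "";
    include /etc/nginx/redirect-map.conf;
}

server {
    listen 80;
    server_name www.example.com;

    # Old URL found in the map? Answer with a 301 straight from nginx.
    if ($new_uri != "") {
        return 301 $new_uri;
    }

    # Everything else is proxied through to the IIS machine.
    location / {
        proxy_pass http://10.0.0.2;                # internal IIS address
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Since nginx builds the map as a hash table, 10,000 entries cost about the same per request as ten.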
-
Hi guys, I have a similar problem, but on IIS7. Our IT department says our 301 redirects file has hit its maximum size in the web.config. They could increase the limit, but they say it will negatively impact page load speed. What's the impact on page speed of having 5,000 to 10,000 URLs in the rewrite map?
They're also looking at a solution that would consult the redirects only when the site returns a 404, so a request would hit a 404, then a 301, then a 200. I'm a little scared of this SEO-wise. Would it be a problem?
Thanks!
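For reference, the kind of rewrite map under discussion lives in web.config and looks roughly like this. This is a minimal sketch with placeholder paths; a rewrite map is a keyed lookup, which should be far cheaper per request than the same number of individual rewrite rules:

```xml
<!-- web.config -- minimal sketch of an IIS URL Rewrite map;
     the key/value paths are illustrative placeholders. -->
<system.webServer>
  <rewrite>
    <rewriteMaps>
      <rewriteMap name="Redirects">
        <add key="/old-page" value="/new-page" />
        <add key="/old-category/widget" value="/new-category/widget" />
        <!-- ...one entry per redirected URL... -->
      </rewriteMap>
    </rewriteMaps>
    <rules>
      <rule name="Map redirects" stopProcessing="true">
        <match url=".*" />
        <conditions>
          <!-- Look the request up in the map; match only if a value comes back. -->
          <add input="{Redirects:{REQUEST_URI}}" pattern="(.+)" />
        </conditions>
        <action type="Redirect" url="{C:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```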
-
Putting aside server load and config issues, and speaking from a pure SEO point of view:
No, you shouldn't have any major issues with that many 301s. However, you might find that, depending on the size of your site and the frequency of Googlebot's visits, some of these pages take a long time (months) to drop out of the index and be replaced by their newer alternatives. This normally isn't cause for alarm.
In some instances you might end up with pages that now have no links to them (because their parent categories were all redirected too), and so they seem to get stuck and never get recrawled by Google. In a couple of instances I have had success using XML sitemap files that include just these 'stuck' pages (the old URLs still in the index) to prompt Google to recrawl them.
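Such a sitemap is just a flat list of the old URLs; a minimal sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap listing only the old, already-redirected URLs that are
     stuck in the index, to nudge Googlebot into recrawling them.
     The URLs are placeholders. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.example.com/old-category/old-page-1/</loc></url>
  <url><loc>http://www.example.com/old-category/old-page-2/</loc></url>
</urlset>
```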
There is also a Google Webmaster Tools feature to 'fetch as Googlebot', which then prompts you to 'submit to index'; you can use it to trigger recrawls on a per-page basis (but you have a limited number of credits here, so save it for the more important pages).
Best of luck!
-
The main benefit of this would be in reducing server load and response time, and potentially in the maintainability of the server config.
The most important factor on this side of things is how many separate rules you have in your .htaccess file for those 10,000 redirects.
-
Hi Kevin,
What's the difference between this method and standard 301 redirection using .htaccess?
-
Do you guys have a step-by-step guide to implementing 301 redirects using the httpd main server config file?
-
Well, if you're on a VPS or dedicated machine, I would take a look at http://httpd.apache.org/docs/current/rewrite/rewritemap.html
RewriteMap has virtually no effect on load time, unlike having the same rules in .htaccess, where Apache has to re-read and evaluate them on every request. Remember that 301s are cached by the browser, so while you're testing make them all 302s; once you're happy, switch to 301s and watch your rewrite log when you launch. If you need help, let us know.
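A minimal sketch of how that can look in the main server config, assuming a dbm-backed map; the file paths are placeholders:

```apache
# httpd.conf or <VirtualHost> block -- sketch; file paths are placeholders.
RewriteEngine On

# Build the dbm map from a plain-text file of "old-path new-path" pairs
# (keys include the leading slash), e.g.:
#   httxt2dbm -i /etc/apache2/redirects.txt -o /etc/apache2/redirects.map
# A dbm map is hashed, so lookups stay fast even with 10,000+ entries.
RewriteMap redirects "dbm:/etc/apache2/redirects.map"

# If the requested path is in the map, redirect to the mapped value.
# Use R=302 while testing, R=301 once you're happy with the results.
RewriteCond ${redirects:%{REQUEST_URI}} !=""
RewriteRule .* ${redirects:%{REQUEST_URI}} [R=301,L]
```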
This does take some know-how and learning (testing, reading documentation), but you should be able to get it done in a few days.
-
Do you have access to the httpd main server config file? If so, please read the Apache HTTP Server Tutorial: .htaccess files.
".htaccess files should be used in a case where the content providers need to make configuration changes to the server on a per-directory basis, but do not have root access on the server system. In the event that the server administrator is not willing to make frequent configuration changes, it might be desirable to permit individual users to make these changes in .htaccess files for themselves. This is particularly true, for example, in cases where ISPs are hosting multiple user sites on a single machine, and want their users to be able to alter their configuration.
However, in general, use of .htaccess files should be avoided when possible. Any configuration that you would consider putting in a .htaccess file, can just as effectively be made in a <Directory> section in your main server configuration file."
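As a rough sketch of what that looks like in practice (paths are placeholders): simple one-off redirects can sit directly in the main config or a <VirtualHost> block using mod_alias, while anything on the scale of 10,000 URLs is better handled with the RewriteMap approach sketched above. Note that, unlike .htaccess, the main config is read only at startup, so reload Apache after editing it.

```apache
# httpd.conf or <VirtualHost> block -- sketch with placeholder paths.
# mod_alias: each line maps an old path to its new home with a 301.
Redirect 301 /old-page /new-page
Redirect 301 /old-category/old-item /new-category/new-item
# For 10,000+ URLs, prefer the RewriteMap approach sketched above.
```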