Thousands of 301 redirections - .htaccess alternatives?
-
Hi guys,
I just want to ask if there are other possible issues/problems (other than server load) once we implement 301 redirections for 10,000+ URLs using .htaccess. Are there other alternatives?
-
Thank you for your answer! I will share it with our IT team.
-
Why don't you just set up a VPS running NGINX as a reverse proxy / stream handler in front of your IIS web server?
- https://www.digitalocean.com/community/tutorials/how-to-set-up-nginx-load-balancing
- http://www.iborgelt.com/windows-home-server-behind-nginx-reverse-proxy/
You're just using the VPS as an interface to handle your redirects, and for $5 a month you can't beat it. I'm sure if your IT department googles "nginx reverse proxy iis" they will get the idea.
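For what it's worth, the basic shape of that setup looks something like the sketch below. The hostnames, backend IP, and example paths are all hypothetical, so treat this as a starting point rather than a drop-in config:

```nginx
# Goes in the http{} context of nginx.conf.
# Map old URIs to new locations; with 10,000+ entries, keep the
# list in a separate file and pull it in with an include.
map $request_uri $redirect_target {
    default              "";
    /old-page            /new-page;
    /old-category/item   /new-category/item;
    # include /etc/nginx/redirect-map.conf;  # bulk entries, hypothetical path
}

server {
    listen 80;
    server_name example.com;  # hypothetical domain

    # Issue the 301 when a mapping exists...
    if ($redirect_target) {
        return 301 $redirect_target;
    }

    # ...otherwise pass the request through to the IIS box.
    location / {
        proxy_pass http://10.0.0.5:80;  # hypothetical IIS backend
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

The nice property here is that nginx's `map` is a hash lookup, so the size of the redirect list has very little effect on per-request cost.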
-
Hi guys, I have a similar problem, but on IIS 7. Our IT department says our 301 redirection file has hit its maximum size in the web.config. They could increase the limit, but they say it will impact page load speed negatively. What's the impact on page speed of having 5,000 to 10,000 URLs in the rewrite map?
They're also looking at a solution that consults the redirections only when the site returns a 404, so a request would hit a 404, then a 301, then a 200. I am a little scared of this SEO-wise. Would it be a problem?
Thanks !
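For reference, a rewrite map in the IIS URL Rewrite module looks roughly like the fragment below. The map name and example paths are hypothetical; the point is that the map is an indexed key lookup, so thousands of entries should cost far less per request than thousands of individual rules:

```xml
<!-- web.config fragment (hypothetical paths and names) -->
<system.webServer>
  <rewrite>
    <rewriteMaps>
      <!-- A large map can also live in its own file via configSource -->
      <rewriteMap name="Redirects">
        <add key="/old-page" value="/new-page" />
        <add key="/old-category/item" value="/new-category/item" />
      </rewriteMap>
    </rewriteMaps>
    <rules>
      <!-- One rule services the entire map: if the request URI is a
           key in the map, redirect permanently to its value. -->
      <rule name="RedirectMap" stopProcessing="true">
        <match url=".*" />
        <conditions>
          <add input="{Redirects:{REQUEST_URI}}" pattern="(.+)" />
        </conditions>
        <action type="Redirect" url="{C:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```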
-
Putting aside server load / config issues, and from the pure SEO point of view.
No, you shouldn't have any major issues with that many 301s. However, depending on the size of your site and the frequency of Googlebot's visits, you might find that some of these pages take a long time (months) to drop out of the index and be replaced by their newer alternatives. This normally isn't cause for alarm.
In some instances you might end up with pages that now have no links pointing to them (because their parent categories were all redirected too), so they seem to get stuck and never get recrawled by Google. In a couple of instances I have had success using XML sitemap files that include just these orphaned pages (the old URLs still in the index) to prompt Google to recrawl them.
There is also a Google Webmaster Tools feature, 'Fetch as Googlebot', which then prompts you to 'Submit to index'; you can use it to prompt recrawls on a per-page basis (but you have a limited number of credits here, so save them for the more important pages).
Best of luck!
-
The main benefit of this would be in reducing server load / response time, and potentially in maintainability of the server config.
The most important aspect on this side of things is how many separate rules you have in your .htaccess file for those 10,000 redirects.
-
Hi Kevin,
What's the difference between this method and the standard 301 redirection using .htaccess?
-
Do you guys have a step-by-step guide for implementing 301 redirection using this httpd main server config file?
-
Well, if you're on a VPS/dedicated machine, I would take a look at http://httpd.apache.org/docs/current/rewrite/rewritemap.html
RewriteMap has essentially zero effect on load time, whereas putting the same redirects in .htaccess would bog the server down evaluating those rules on every request. Remember that 301s are cached by the browser, so while you're testing make them all 302s until you're happy, then watch your rewrite log when you launch. If you need help let us know.
This does take some know-how and learning, but you should be able to get it done in a few days (testing, reading documentation).
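A minimal sketch of what that looks like, assuming hypothetical file paths (note that RewriteMap only works in the main server or virtual host config, not in .htaccess):

```apache
# httpd.conf / <VirtualHost> context
RewriteEngine On

# Plain-text map: one "old-path new-path" pair per line.
RewriteMap redirects txt:/etc/apache2/redirects.txt

# For 10,000+ entries, compile the text file into a hashed DBM
# so lookups are fast regardless of map size:
#   httxt2dbm -i redirects.txt -o redirects.map
# RewriteMap redirects dbm:/etc/apache2/redirects.map

# If the map returns a match for the request URI, redirect to it.
# Use R=302 while testing (301s are cached), then switch to R=301.
RewriteCond ${redirects:%{REQUEST_URI}} !=""
RewriteRule .* ${redirects:%{REQUEST_URI}} [R=301,L]
```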
-
Do you have access to the httpd main server config file? If so, please read Apache HTTP Server Tutorial: .htaccess files.
".htaccess files should be used in a case where the content providers need to make configuration changes to the server on a per-directory basis, but do not have root access on the server system. In the event that the server administrator is not willing to make frequent configuration changes, it might be desirable to permit individual users to make these changes in .htaccess files for themselves. This is particularly true, for example, in cases where ISPs are hosting multiple user sites on a single machine, and want their users to be able to alter their configuration.
However, in general, use of .htaccess files should be avoided when possible. Any configuration that you would consider putting in a .htaccess file can just as effectively be made in a <Directory> section in your main server configuration file."
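To make that concrete, the same redirect that would otherwise live in a .htaccess file can sit in the main config, where it is parsed once at startup instead of on every request. A minimal sketch, with hypothetical paths:

```apache
# httpd.conf (server or <VirtualHost> context), equivalent to a
# "Redirect permanent" line in a .htaccess file:
Redirect permanent "/old-page.html" "/new-section/new-page.html"

# Unlike .htaccess edits, changes to the main config require a reload:
#   apachectl graceful
```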