Unnecessary 301s?
-
hi mozzers,
I'm doing an audit on a website. I detected over 60 301s of this nature: www.example.com/help 301-redirects to www.example.com/help/.
I believe these are completely useless and increase page load time. Am I right? Should I kill those 301s?
Thanks
-
I second what Ray says: having the rule in your .htaccess file is always good practice, especially to prevent the annoyance of seeing both versions in your Google Analytics.
This question was answered in another Moz Q&A, and although it's from 2012, the responses are still good.
-
It looks like the 301s were meant to force a trailing slash at the end of the URL, which prevents duplicate URLs (the slash and non-slash versions resolving as separate pages). However, rather than adding a redirect for each URL manually, you can replace them all with a single regex-based rule.
Example:
<code>RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ http://%{HTTP_HOST}/$1/ [L,R=301]</code>
(The RewriteCond keeps real files such as images and stylesheets from being redirected to a slashed version.)
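As a sanity check, the pattern's matching behavior can be sketched in Python (hypothetical paths; the real matching is of course done by Apache's mod_rewrite, not Python):

```python
import re

# Same idea as the rewrite rule's pattern: match any path that does
# not already end in "/" and capture it for the redirect target.
pattern = re.compile(r'^(.*[^/])$')

def add_trailing_slash(path):
    """Return the path with a trailing slash, mimicking the 301 target."""
    m = pattern.match(path)
    return m.group(1) + '/' if m else path

print(add_trailing_slash('help'))   # -> help/
print(add_trailing_slash('help/'))  # already slashed, unchanged
```

Paths that already end in a slash fail the pattern and pass through untouched, which is why the rule doesn't redirect in a loop.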
Related Questions
-
Do I need multiple 301s to preserve SEO
Many years ago I created the website www.original.com. Two years ago I redirected www.original.com to www.neworiginal.com using 301 redirects. I have now created www.rebranded.com and want to maintain all SEO value. Should I redirect both www.original.com and www.neworiginal.com to www.rebranded.com? Or do I only need to redirect one of them, and if only one, which one? If I need to redirect only one, can I delete the other, and why or why not? Of course the URLs are fictitious. I truly appreciate your help.
Intermediate & Advanced SEO | PhotoStl
-
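On the question above: the usual advice is to point every old hostname directly at the final domain rather than chaining hops. A minimal sketch of flattening such a chain (hypothetical domains matching the fictitious ones in the question):

```python
# Hypothetical redirect map: original -> neworiginal -> rebranded
redirects = {
    'www.original.com': 'www.neworiginal.com',
    'www.neworiginal.com': 'www.rebranded.com',
}

def final_target(host):
    """Follow the map to the last hop, guarding against loops."""
    seen = set()
    while host in redirects and host not in seen:
        seen.add(host)
        host = redirects[host]
    return host

# Every old host should 301 straight to this computed final destination
flattened = {src: final_target(src) for src in redirects}
```

The flattened map is what you'd actually configure: both old domains 301 directly to www.rebranded.com, with no intermediate hop.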
Main Nav Redirects Issue: Unnecessary Evil or Better Solution
Hi, I'm somewhat stumped on the best course of action for our navigation menu. Our site is "divided" into two areas: informational and transactional. Because of compliance, transactional users have to be geo-targeted; therefore, we serve them a specific page (we have 6 different regions: uk, aus, eu, etc.). If users visit the informational side, it's not geo-specific. Example: https://site/charts https://site/uk/money Within our main nav, we don't specify the geo transactional page and instead use a generic https://site/money/ (a page that doesn't exist); when a user clicks that link, we detect their location and serve up a 301 redirect to the correct geo page. This has obviously caused a ton of unnecessary redirects and wasted powerful link equity from the header of the site. It's been recommended that we dynamically change the linked URL in the header based on the location of the user. That sounds good, but what about Google? Since we can't detect the Google crawler's IP, we would have to pick a default geo URL like /uk/money. If we do that, the other regional URLs suffer on link equity. How do we minimize redirects and make Google happy for all our geo pages? Hope this makes sense, and thanks for your time!
Intermediate & Advanced SEO | Bragg
-
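The dynamic-link recommendation in that question can be sketched like this (purely a hypothetical region map; /uk/money as the crawler default is an assumption from the question itself):

```python
# Hypothetical map of region codes to transactional money pages
REGION_URLS = {
    'uk': '/uk/money',
    'aus': '/aus/money',
    'eu': '/eu/money',
}
DEFAULT = '/uk/money'

def money_link(region):
    """Link a visitor straight to their geo page; unknown regions
    (including crawlers that cannot be geolocated) get the default."""
    return REGION_URLS.get(region, DEFAULT)
```

Known regions skip the redirect entirely, so only un-geolocatable visitors (and crawlers) ever see the default link.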
URL Migration: Better to have 301s processed or 200s?
I'm migrating sub-domains to sub-folders, but this question is likely applicable to most URL migrations. For example: subdomain1.example.com to example.com/subdomain1 and any child pages. Bear with me, as it may just be me, but I'm having trouble understanding whether internal links (menu, contextual, etc., and potentially the sitemaps) should be kept as the pre-migration URL (with a 301 in place to the new URL) to give Google a chance to process the redirects, or if they should be updated straight away to the new URL to provide a 200 response, as so many guides suggest. The reason I ask is that unless Google specifically visits the old URL from their index (and therefore processes the 301), the page is likely to be found by following internal links on the website, which, if they're updated to reflect the new URL, will return a 200. I would imagine that this would be treated as a new page, which is concerning, as it would have a canonical pointing toward itself and the same content as the pre-migrated URL. Is this a problem? Do we need to allow proper processing of redirects for migrations, or is Google smarter than this and can work it out if they visit the old URL at a later date and put two and two together? What happens in between? I haven't seen any migration guides suggest leaving 301s in place but amending links to 200 as soon as possible in all instances. One thought: I guess there's also the Fetch as Google tool within Search Console, which could be used with the old URLs; could this be relied on? Apologies if this topic has been covered before, but it's quite difficult to search for without returning generic topics around 301 redirects. Hope it makes sense; appreciate any responses!
Intermediate & Advanced SEO | AmyCatlow
-
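The subdomain-to-subfolder mapping discussed above can be written as a pure function, which is also handy for generating the 301 map in the first place (hypothetical root domain, per the question's example):

```python
from urllib.parse import urlparse

def migrate(url, root='example.com'):
    """Map subdomain1.example.com/page -> example.com/subdomain1/page.

    URLs on www or the bare root are returned unchanged.
    """
    p = urlparse(url)
    suffix = '.' + root
    if p.netloc.endswith(suffix):
        sub = p.netloc[:-len(suffix)]
        if sub and sub != 'www':
            return f'https://{root}/{sub}{p.path}'
    return url
```

Running every old URL through this gives the old-to-new pairs for the redirect rules and for updating internal links in one pass, so the two stay consistent.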
URL Re-Writes & HTTPS: Link juice loss from 301s?
Our URLs are not following a lot of the best practices found here: http://moz.com/blog/11-best-practices-for-urls We have also been waiting to implement HTTPS. I think it might be time to take the plunge on re-writing the URLs and converting to a fully secure site, but I am concerned about ranking dips from the lost link juice from the 301s. Many of our URLs are very old, with a decent amount of quality links. Are we better off leaving as is or taking the plunge?
Intermediate & Advanced SEO | TheDude
-
GWT does not play nice with 410 status code approach to expire content? Use 301s?
We have been diligently managing our index size in Google for our sites, returning a 410 status code for pages that we no longer consider "up-to-date" but that still carry value for users to access, so that Google removes them from our index and keeps it lean. However, we have been receiving GWT warnings across sites because of the 410 status codes Google is encountering, which makes us nervous that Google could interpret this approach as a lack of quality on our site. Does anyone have a view on whether the 410 approach is right for the given example, or should we consider simply using 301s or another status code to keep our GWT errors clean? Further notes: there is hardly ever any link juice being sent to those pages, so it is not like we are missing out on that; also, the pages for which we return 410 are marked noindex and nofollow.
Intermediate & Advanced SEO | petersocapro
-
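On the 410-versus-301 question above, one way to frame the decision is a tiny routing helper: 301 where a sensible replacement page exists (preserving any residual equity), 410 where the removal is deliberate and final. This is a hypothetical sketch with invented paths, not anyone's production logic:

```python
def status_for(path, expired, replacements):
    """Pick a status for a page: 301 to a replacement when one exists,
    410 (gone on purpose) otherwise, 200 for live pages."""
    if path not in expired:
        return 200, None
    if path in replacements:
        return 301, replacements[path]
    return 410, None
```

Under this framing, the GWT warnings on 410s are expected reporting of intentional removals rather than errors to be "fixed".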
301s being indexed
A client website was moved about six months ago to a new domain. At the time of the move, 301 redirects were set up from the pages on the old domain to point to the same pages on the new domain. New pages were set up on the old domain for a different purpose. Now, almost six months later, when I do a query in Google on the old domain like site:example.com, 80% of the pages returned are 301 redirects to the new domain. I would have expected this to go away by now. I tried removing these URLs in Webmaster Tools, but the removal requests expire and the URLs come back. Is this something we should be concerned about?
Intermediate & Advanced SEO | IrvCo_Interactive
-
How to fix issues from 301s
Case: We are currently in the middle of a site migration from .asp to .net and Endeca PageBuilder, and from a homebrewed search provider to Endeca Search. We have migrated most of our primary landing pages and our entire e-commerce site to the new platforms. During the transition, approximately 100 of our primary landing pages were inadvertently 302ed to the new versions. Once this was caught, they were immediately changed to 301s and submitted to Google's index through Webmaster Tools. We initially saw increases in visits to the new pages, but currently (approximately 3 weeks after the change from 302 to 301) are experiencing a significant decline in visits.
Issue: My assumption is that many of the internal links to these primary landing pages (from pages which are now 301ed as well) still point to the old versions in Google's cache, and thus have not passed their importance and internal juice to the new versions. There are no navigational links or entry points to the old supporting pages left, and I believe this is what is driving the decline.
Proposed resolution: I intend to create a series of HTML sitemaps of the old (.asp) versions of all pages which have recently been 301ed. I will then submit these pages to Google's index (not as sitemaps, just normal pages) with the selection to index all linked pages. My intention is to force Google to pick up all of the 301s, thus enforcing the authority channels we have set up.
Question 1: Is the assumption that the decline could be because of missed authority signals reasonable?
Question 2: Could the proposed solution be harmful?
Question 3: Will the proposed solution be adequate to resolve the issue?
Any help would be sincerely appreciated. Thank you in advance, David
Intermediate & Advanced SEO | FireMountainGems
-
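The HTML-sitemap workaround proposed in that question could be generated in a few lines; this is a hypothetical sketch with invented URLs, showing the idea of a plain page whose only job is to expose the 301ed URLs to a crawler:

```python
# Hypothetical list of old .asp URLs that now 301 to their .net equivalents
old_urls = ['/products/widgets.asp', '/guides/care.asp']

def html_sitemap(urls):
    """Emit a bare HTML page linking each old URL, so a crawler
    following the links encounters (and processes) the 301s."""
    links = '\n'.join(f'<li><a href="{u}">{u}</a></li>' for u in urls)
    return f'<html><body><ul>\n{links}\n</ul></body></html>'
```

Such a page would itself be kept out of the index (e.g. noindex) since its only purpose is redirect discovery.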
Any solutions for implementing 301s instead of 302 redirects in SharePoint 2010?
We have an issue with Google indexing multiple versions of each page in our sitemap (www.upmc.com). We've tried using rel=canonical, but it appears that Googlebot is not honoring our canonicals. Specifically, any of the pages Google indexes that end without a file extension (such as .aspx) are 302-redirected to an .aspx page. Example: the following pages all respond as 302 redirects to http://www.upmc.com/services/pages/default.aspx
http://www.upmc.com/services/
http://www.upmc.com/services
http://www.upmc.com/Services/
http://www.upmc.com/Services
Has anyone been able to correct this inherent issue with SharePoint so that the redirects are at least 301s?
Intermediate & Advanced SEO | Jessdyl
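The variant problem in that last question boils down to collapsing every case/slash variant onto one canonical URL. A hypothetical sketch of the normalization those redirects (ideally 301s) should implement, using the question's own example paths:

```python
def canonical(path):
    """Collapse variants like /services, /Services/ and /services/
    onto the single canonical SharePoint URL (hypothetical pattern)."""
    base = path.rstrip('/').lower()
    return base if base.endswith('.aspx') else base + '/pages/default.aspx'
```

The key property is idempotence: feeding the canonical URL back in returns it unchanged, so a correctly configured redirect never fires twice.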