301'ing over 700 internal links to the main page
-
I just got a contract for a site. After analyzing their website, I noticed that they have over 700 pages indexed. However, their internal linking structure sucks. It's basically all 700 pages in one directory.
What do you recommend? Should I redirect all the internal pages to new locations within a better structure, or would it be better to redirect all those internal pages to the main domain name and build a completely new SEO-friendly structure?
Redirecting each of their current pages to an individual new page is going to take a lot of time, and I don't think they're going to pay for it. :l
-
EGOL has a very important answer you need to consider, which is what I was really getting at in my second and last sentences (but I didn't say it well).
But to answer the question about redirecting every page to the root: I don't have any actual experience with this, but I wouldn't expect it to be a good thing, because the front page is untargeted compared to the page the visitor thought they were going to get.
If doing this were a good idea, there would be no reason to use a 404 page-not-found response. You would instead just redirect all missing pages to the front page.
My guess is that as soon as that pattern was discovered, the value you got from it would evaporate.
Remember that the whole purpose of all those pages was to satisfy the needs of a visitor. Send them to an inappropriate place and you have failed. (By the way, Google does this a lot themselves; their own redirects leave a lot to be desired.)
-
After I analyzed their website, I noticed that they have over 700 pages indexed. However, their internal linking structure sucks. It's basically all 700 pages in one directory.
What do you recommend?
STOP!
Do not do this until you have clearly defined two things.
-
Why you are doing this.
-
The goal that you want to accomplish.
Any action you take without considering BOTH of those questions is shooting without taking aim, without knowing what your target should be.
You might not need the redirects at all, because linking structure and directory structure are completely different things. Redirects could cause a significant loss of power for this site.
-
If the pages you are redirecting have links pointing at them and you 301 them to your home page, chances are your home page will rank higher and DA will increase.
Whatever plan you decide on, my advice would be to do it a few pages at a time and see what happens.
-
Thanks for the detailed response.
I've been analyzing the website and their pages for the past hour.
For example, let's say the keyword is "send flowers to city" and the domain name is "cityflorists", and she has a page for every city. I'm going to 410 those and just keep the pages relevant to her own city; that will bring the number of pages down to below 50 for sure.
The main question I want to ask, though, without any complications, is this: what happens if you 301 over 700 internal pages to the main domain? Will the website get penalized, or will the domain authority increase?
-
700 pages is nothing.
It really depends on what you are trying to do, but if you are a programmer, it is very simple and fast: decide how to split the files up into the final structure, get a listing of the new locations, and generate the .htaccess file from that, since you already know each original location.
I would do it by taking an exact copy of the current data and doing the work in a different location, so as not to cause any disruption. Once it all works, do a fast swap.
What would make this more complex is if you had to rename the pages, but in that case I would just write a file containing old,new pairs and use it to generate the needed changes. A rough sketch of that approach is below.
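As a minimal sketch, assuming a hypothetical redirects.csv of old,new paths (the file names and paths here are made up for the example, not from the actual site), the generation step might look like this in Python:

```python
import csv

# Hypothetical input file: redirects.csv, one "old,new" pair per line, e.g.
#   /send-flowers-to-boston.html,/cities/boston/
#   /send-flowers-to-austin.html,/cities/austin/
with open("redirects.csv", newline="") as src, open("htaccess_rules.txt", "w") as dst:
    for row in csv.reader(src):
        if len(row) != 2:
            continue  # skip blank or malformed lines
        old, new = row[0].strip(), row[1].strip()
        if not old or not new or old == new:
            continue  # nothing to redirect
        # One permanent (301) redirect per page: old location -> new location
        dst.write(f"Redirect 301 {old} {new}\n")
```

The output is just a list of Apache Redirect 301 directives that you can review and then paste into the .htaccess.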
Interlinking the pages in their new locations is another job, but as noted above, if you are a programmer, it is simple.
Something else you must consider is how the client will manage the system afterwards. Be sure not to put them in the situation where they need you in order to add a new page.
At least you are thinking about this before doing it. I just heard from a friend whose client got someone else to restructure his site, and they completely destroyed the fantastic #1 and page-one rankings they had spent years building.
-
Hi,
I will do my best to understand your question and answer it appropriately.
If those pages are not providing any value in terms of traffic, rankings or links, then there's no point in keeping them on the site. You could redirect them, but I would just implement a 410 status code to indicate to search engines that these pages no longer exist.
For the pages which are providing value, restructure the site and place them in different directories (with 301s from the old pages to the new ones). Obviously you should then build out these pages in terms of content and links over the following months.
It wouldn't be too difficult to evaluate your most important pages through Google Analytics, Webmaster Tools, etc. A quick triage along those lines is sketched below.
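As a rough sketch, assuming you have exported per-page metrics into a hypothetical pages.csv with url, sessions, and linking_domains columns (the file name, column names, and thresholds are all assumptions for the example), the triage might look like this:

```python
import csv

# Hypothetical export: pages.csv with columns url,sessions,linking_domains
# (numbers pulled from Analytics / Webmaster Tools). Thresholds are arbitrary.
MIN_SESSIONS = 10
MIN_LINKING_DOMAINS = 1

keep_and_301 = []  # valuable pages: move into the new structure, 301 old -> new
serve_410 = []     # no traffic, no links: tell engines the page is gone

with open("pages.csv", newline="") as f:
    for row in csv.DictReader(f):
        sessions = int(row["sessions"] or 0)
        links = int(row["linking_domains"] or 0)
        if sessions >= MIN_SESSIONS or links >= MIN_LINKING_DOMAINS:
            keep_and_301.append(row["url"])
        else:
            serve_410.append(row["url"])

print(f"{len(keep_and_301)} pages to keep/301, {len(serve_410)} pages to 410")
```

The thresholds would obviously need tuning against the site's real numbers before acting on either bucket.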
However, the biggest difficulty in this project is prioritisation and how you present it to the client, considering there are 700 pages. Take the following steps when handling this with the client:
(1) Map out and plan how you want to structure the site.
(2) In plain English, list what you want to do with the site and the steps you will take to do so (over the next 3-6 months). Also be prepared for alternatives and for difficult questions the client might ask. In addition, calculate the time and resources involved and how much you can do within your allocated budget.
(3) Present this to the client in a clear and honest way, and be upfront about the situation. Try to negotiate common ground: maybe they can give you a bit more budget this month and less next month for link building. How you bargain depends on your payment terms, which I don't know, but you should also be prepared to take some of the costs out of your own pocket.
Next time, I would suggest that before taking on a contract you clearly identify the number of pages, site issues, etc., so that project scope creep and this kind of difficult situation don't happen to you again.
But to end this on a calmer note: buddy, it's OK, and you shouldn't let yourself get stressed about this. It's all good; take this as a positive step in your career. Whatever happens, just know that in the future effective planning will help you anticipate these issues beforehand and act accordingly.
Take care and hope this helps.
Vahe