Which is better, a directory 301 redirect or each page in the directory?
-
A customer of mine has a site with lots of articles, all of which are quite spammy. They have not been affected by Penguin yet, so they asked what to do. I suggested losing the articles directory and 301 redirecting it to either the home page or another important page.
Would a single 301 redirect of the entire directory to one page be the way to go, or should I add a redirect for each page within the directory and spread those redirects across various pages on the website?
Or do you have a better suggestion?
-
Hi Anthony,
First of all, it is always better to redirect URLs to individual, relevant pages than to perform a blanket directory-wide 301 redirect. But in this case, if the individual pages aren't getting much traffic, it may not make much difference.
If the articles are truly low quality and you are worried about a future penalty, you may want to simply remove them without any redirect at all. Serve a 410 (Gone) HTTP response instead, and carefully watch your traffic and rankings to make sure nothing drops.
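For reference, if the customer's site runs on Apache, a directory-wide 410 is a single line of .htaccess. This is only a sketch, and the /articles/ path is a hypothetical example:

```apache
# Serve "410 Gone" for every URL under /articles/ (mod_alias).
# The /articles/ path is an assumed example directory.
RedirectMatch gone ^/articles/
```

Keep an eye on your crawl errors and traffic afterwards; the URLs should drop out of the index over the following weeks.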
It's most likely Google is simply ignoring these pages. The best defense is to build up an offense of quality material so the bad doesn't outweigh the good.
Hope this helps. Best of luck with your SEO.
-
No, I would only do this for articles that have a good inbound link profile. Too many 301 redirects can also slow down the server, so having a large number of them isn't good either.
-
Thanks MargaritaS, good point on the spammy links and redirecting with them. To clarify, it is the article content that is not well written and just about keyword stuffed, but not terribly. The articles look like someone used a boilerplate template, just replaced keywords and a sentence here and there, and called it a new article.
I was thinking the best thing to do would be to bury the evidence (haha!!) and apply redirects to valued pages. Then I would have them start writing good original content in a blog.
So should I apply 40+ individual redirects, or just redirect the whole directory to a single page?
Thanks again for your feedback.
-
Anthony,
This sounds tricky. When you say "spammy," are you talking about their inbound links, or are you referring to the content within those articles? I think there is an important difference there. If you do indeed have articles that look spammy because of the content itself (ex: keyword stuffing) but happen to have decent inbound links from reliable sources, then I would say to individually 301 redirect those. If you mean spammy in the sense that the inbound links pointing to these individual articles are bad, then I would say lose them, because you don't want to pass those bad links on to this "other" important page on your client's site.
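To make the two options concrete, here is roughly what each looks like in Apache .htaccess (a sketch; all the paths are invented for illustration):

```apache
# Option 1: individually 301 the articles that have decent inbound
# links, each to its most relevant page (hypothetical paths)
Redirect 301 /articles/widget-buying-guide.html /widgets/
Redirect 301 /articles/widget-care-tips.html /support/widget-care/

# Option 2: send everything under /articles/ to one important page
RedirectMatch 301 ^/articles/ /resources/
```

mod_alias applies these in order, so listing the specific redirects before the catch-all means the pages with good links get their targeted redirect and everything else falls through to the single page.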
Hope this helps!
MS
Related Questions
-
301 redirects
I am in the process of truncating many of the URLs using a plugin installed on WordPress. The question is: does Google penalize or have issues with too many 301 redirects on your site? I have many, many products I want to do this with. I thought I read somewhere that 301 redirects should be kept to a minimum. Would appreciate any assistance.
On-Page Optimization | xinar
-
Should you do on-page optimization for a page with rel=canonical tag?
If you add a rel=canonical tag to a page, should you still optimize that page? I'm talking meta description, page title, etc.
On-Page Optimization | marynau
-
What to do about pages I have deleted?
I have been working through the dead links on my site and recreating pages with new content where it still makes sense to have them on the site. But I have a few that were just changes of the title, spelling mistakes, or other ways of saying the same thing. In other words, I created a page called "areas of the UK we cover" but decided to change it to "areas covered". However, I must have created links to this page, and now it is a dead link with a page authority of 19. I think it would be spammy to have two pages, one called "areas covered" and the other called "areas of the UK we cover". It's not a disallow in robots.txt, because the page does not exist. Please note I do not have access to the header to add code for a 301 redirect. I'm still using webs.com, but not for new sites. I also have a page called "singing telegrams london" that I changed from "singagrams london". These are two words for the same thing, but they are two very different keywords. Would it be OK to recreate this page and create content for "singagrams london"? Help is much appreciated.
On-Page Optimization | singingtelegramsuk
-
Pages vs Posts
What are your thoughts on pages vs posts? I am setting up a new blog for a client but not sure how to structure the content. I may just do posts, or a whole bunch of pages listed down the sidebar. It seems like my pages always rank better than my posts. Has anyone else noticed this? Could it be because of the dates tied to posts?
On-Page Optimization | SixTwoInteractive
-
To Reduce (pages)... or not to Reduce?
Our site has a large Business Directory with millions of pages. For example's sake, let's say it's a directory of Restaurants. Each Restaurant has 4 pages on the site, tied together through a row of tabs across the top of the page:

Tab 1 - Basic super 7 info - name, location, contact info
Tab 2 - Restaurant menu
Tab 3 - Restaurant reviews
Tab 4 - Photos of food

The Tab 1 page generates 95% of our traffic and 90% of conversions. The conversion rate on Tab 2 - Tab 4 pages is 6-10x greater than Tab 1 conversions. Total conversions from search queries on menus, reviews and food are 20% higher than conversions resulting from searches on restaurant name & info alone. We're working with a consultant on a redesign who wants to consolidate the 4 pages into one. Their advice is to focus on making a better page, featuring all of the content, sacrificing a little organic traffic but making up any losses by improving conversion. My counterpoint is that we shouldn't scrap the Tab 2 - Tab 4 pages just because they have lower traffic; we should make those pages BETTER. The content we display is thin, and we have plenty of data we could expose to make the pages more robust. By consolidating, it will also be hard to optimize one page for people searching for name/location AND menu AND reviews AND photos. We're asking that one page to do too much, and it's likely we will see diminished search volume for queries on menus, reviews and food. I think the decline will be much more significant than the consultant estimates. The consultant says there will be little change to organic traffic, since Tab 1 already generates 95% of traffic. Through basic math, they're saying the risk is a 5% decline in organic traffic. Further, they see little chance of queries for menus, reviews, and food declining, because most of those queries tend to send people to the home page or Tab 1 page anyway. Finally, the designer of the new wireframes admitted that potential organic traffic risks were not taken into consideration when they recommended consolidating the pages. I sincerely appreciate your thoughts and consideration! Trisha
On-Page Optimization | lzhao
-
On Page Optimization Reports
How is it determined which terms and associated URLs are chosen when SEOmoz tracks your On-Page Report Card? I'm receiving a lot of F grades for terms I'm not really interested in, and a lot of terms I'd like to have tracked aren't there. Is there a way I can manually choose which terms and pages I'd like to be shown?
On-Page Optimization | ClaytonKendall
-
Correct use of Canonical link vs 301 redirect
Hi All, Seeking yet more advice. SEOmoz tools have told me I have duplicate content on one of my sites, and I am keen to clean this up. I am not too familiar with the following, so I thought I would ask. The duplicate content is shown on: www.mysite.com and www.mysite.com/index.html. Obviously I only see index.html when I check the code, so what is the best method of resolving the duplicate content, canonical or 301? Can you give me an example 🙂 Thanks all
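As a sketch of the two options mentioned (using the asker's placeholder domain): the canonical route is a single tag in the `<head>` of index.html, e.g. `<link rel="canonical" href="http://www.mysite.com/" />`, while the 301 is typically done with mod_rewrite in .htaccess on Apache:

```apache
# 301 external requests for /index.html to the bare root URL.
# The THE_REQUEST condition matches only the original client request,
# so Apache's internal subrequest that serves index.html for "/"
# does not trigger a redirect loop.
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\ /index\.html
RewriteRule ^index\.html$ / [R=301,L]
```

This assumes an Apache server; other servers have their own equivalents.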
On-Page Optimization | wedmonds
-
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:

Course (starter, main, salad, etc.)
Cooking Method (fry, bake, boil, steam, etc.)
Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)

Here are some examples of how URLs may look when searching for a recipe:

find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour

There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search-results URL variations. This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as:

Setting the canonical tag
Adding these URL variables to Google Webmasters to tell Google to ignore them
Changing the title tag in the head dynamically based on which URL variables are present

However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here, as the search results are always different. Adding these URL variables to Google Webmasters won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards
On-Page Optimization | smaavie
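Of the three options listed, the canonical tag is the one that can be shown in a line or two. A sketch, assuming the unfiltered find-a-recipe.php URL is treated as the canonical version (whether that is the right choice for these search pages is exactly the open question, and the domain is a placeholder):

```html
<!-- Emitted in the <head> of every find-a-recipe.php variation,
     pointing filtered and paginated URLs at one canonical URL.
     The example.com domain is a placeholder. -->
<link rel="canonical" href="http://www.example.com/find-a-recipe.php" />
```

The trade-off the asker identifies still applies: this consolidates duplicate-title/content signals, but it tells search engines the filtered result sets are variations of one page.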