Too many 301s?
-
Hi there. If a website has accidentally generated, say, 1,000 pages of duplicate content, would the SEO be hurt if all those pages were redirected to the original source of the content?
There are no plans to rewrite the 1,000 duplicate pages; they are already cached and indexed by Google.
I thought about canonical tags, but as the pages have some traffic and a little SEO value, I thought 301 redirects to the relevant pages would be more appropriate?
Am I also right in thinking you would be able to remove the 301s from the .htaccess file once the index has updated?
Also, once the 301s are removed, could I use those URLs again from scratch if I wanted?
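For reference, this is the kind of rule I mean: a minimal .htaccess sketch with made-up paths, where deleting the line later would stop the redirect and free the URL for reuse:

```apache
# Hypothetical example: one duplicate page 301'd to the original source.
# Requires mod_alias; removing this line later ends the redirect.
Redirect 301 /duplicate-page/ https://www.example.com/original-page/
```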
Any info much appreciated.
-
Great insight, Highland!
-
If they had links, I would 301 the pages with links. Everything else I would 404.
-
How are these pages generating traffic? Are they being found in the search engine?
The real question, do these pages have links to them?
There is little value to a 301 redirect if you are not moving link traffic in the direction you are pointing. If you are outranking the original content, then perhaps a 301 could help. How well does the original content rank?
-
Ha, yes you can my friend.
-
But you can do it, yes?
-
Bringing back URLs that you didn't want, and then deciding that you do want, is pretty annoying to Google...
-
I would see if they have links and get rid of the rest; it may look to Bing like you are trying to be tricky. It's not natural.
-
OK, I probably won't, but in what instance would you not recommend this?
I understand PA and PR etc. will be back to nothing, but it's the keyword URL I might want to use from scratch.
-
Yes but I wouldn't really recommend this.
-
Also, last one: if I wanted to revive the 301'd URLs, say in a year, would I be allowed to, and would the pages index again?
-
Thanks, Highland.
-
I would 301 the pages and get them out of your site's index. Even if you canonical all of them, Google will still have to crawl 1,000 pages instead of 1. The 301 will transfer most of your rank to the new page, and you'll improve your crawl budget.
Why take the 301s out? Just leave them in there in case there are links pointed to them.
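A sketch of how that could look in .htaccess, assuming the duplicates share a common path pattern (the /dupe/ prefix here is made up); one pattern rule avoids listing 1,000 individual redirects:

```apache
# Hypothetical sketch: if the duplicates all live under /dupe/<slug>/,
# a single mod_alias RedirectMatch rule can 301 every one of them
# to the original source page in one line.
RedirectMatch 301 ^/dupe/(.*)$ https://www.example.com/original-source/
```

If the duplicates don't share a pattern, you'd fall back to one `Redirect 301` line per URL, but the effect on Google is the same either way.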
-
Well, they seem to be generating traffic.
In principle, is what I intend to do OK? Do you know whether it will hurt the SEO or be seen as fine?
Many thanks,
-
That sounds weird! If you generated thousands of pages automatically and they are all duplicate content, why don't you remove them? Google will end up removing them from its cache as well after a short period!