Whether letting an old category just 404 out is OK
-
Hello,
We've got some hidden categories that are still indexed in the search engines. If there are no links to these hidden categories, can we just let them 404 and be OK SEO-wise? We can't 301 redirect them. It's about 50 categories.
-
Hi Bob,
Just to extend on Jane's point and answer your question: if you 301 redirected every single page on your site, it would, over time, slow down your website, and if many of those pages don't have strong authority, the pattern might be detected and potentially harm your site's rankings. The Quora post linked below gives additional reasons.
http://www.quora.com/Why-should-one-avoid-overusing-301-redirects
-
Hi Bob,
This is fine - Google recommends returning 404 or 410 for removed pages. If you want them gone even quicker, have them return 404s (or 410s) and use Google's URL removal tool.
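As an illustration of serving a 410 at the server level, here's a minimal sketch assuming Apache with mod_alias (the category paths are hypothetical placeholders, not Bob's actual URLs):

```apache
# Return "410 Gone" for retired category URLs (hypothetical paths)
Redirect gone /hidden-category-1/
Redirect gone /hidden-category-2/
```

The nginx equivalent would be a `location /hidden-category-1/ { return 410; }` block. A 410 signals a deliberate removal, which Google tends to honour slightly faster than a plain 404.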
-
1. Why can't you 301 them?
2. If you want them out of the index, try a manual URL removal via Webmaster Tools.
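If it helps to audit which of the ~50 category URLs still resolve before requesting removal, here's a quick sketch in Python. The URLs and the helper function are hypothetical examples, not anything from this thread:

```python
# Hypothetical list of hidden category URLs to audit; replace with your own.
REMOVED_CATEGORIES = [
    "https://example.com/category/old-widgets",
    "https://example.com/category/old-gadgets",
]

def still_indexed_risk(results):
    """Given (url, status) pairs, return URLs that do NOT return 404/410.

    Anything outside 404/410 (e.g. a 200, or a redirect chain ending
    in 200) leaves the page eligible to stay in the index.
    """
    return [url for url, status in results if status not in (404, 410)]

# Example with mocked statuses. In practice, fetch each URL with
# urllib.request.urlopen and read resp.status, catching HTTPError
# (which carries the 404/410 code) instead of mocking.
sample = [
    ("https://example.com/category/old-widgets", 410),
    ("https://example.com/category/old-gadgets", 200),
]
print(still_indexed_risk(sample))  # → ['https://example.com/category/old-gadgets']
```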
-
Hey Bob,
If these pages were deliberately hidden in the first place, then from my perspective they hold little value.
Having unimportant pages return 404s is fine. We only preserve existing link juice or page authority for pages that have been taken down or re-optimised, either to maintain existing SEO efforts or to direct users to an alternative page.
I hope this helps.
Regards,
Vahe