What is the best way to handle links that lead to a 404 page?
-
Hi Team Moz,
I am working through a site cutover with an entirely new URL structure and have a bunch of pages that could not, would not, or just plain don't redirect to new pages.
Steps I have taken:
-
Multiple new sitemaps submitted with new URLs and the indexing looks solid
-
Used the URL removal tool in Webmaster Tools on old URLs still appearing in natural search results that did not redirect
-
Completely built out new PPC campaigns with the new URL structure
-
Contacted a few major link partners
Now here is my question:
I have pages that produce 404s and are linked to from forums, Slickdeals, and the like, and these will not be redirected. Is disavowing these links the correct thing to do?
-
Hi,
Definitely don't use disavow unless you think the links are poor quality and could harm your site, or are actively harming it right now. That is what disavow is for; it is not a tool for removing your 404 pages.
There is no harm in waiting for Google to drop the 404 pages on its own, especially if you have also used its URL removal tool. If there are any good links in the backlink profile of the 404ing pages, do attempt to contact the webmasters and ask them to update the links - most people are more than happy to do this.
-
If the links are good ones, 301 redirect the old URL to a relevant live page; you don't have to leave a dead page at that URL.
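For illustration, a 301 from a retired URL to its replacement takes only a line or two of server config. This is a minimal sketch assuming an Apache server with mod_alias and mod_rewrite available; the paths are hypothetical:

```apache
# Map a single retired URL to its new equivalent (hypothetical paths)
Redirect 301 /old-category/widget-page.html /new-widgets/widget-page/

# Or pattern-match a whole retired section with mod_rewrite
RewriteEngine On
RewriteRule ^old-category/(.*)$ /new-widgets/$1 [R=301,L]
```

On nginx the equivalent would be a `return 301` or `rewrite ... permanent` directive in the server block.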
If they are bad links, just leave them. If they are 404ing, they can do you no harm.
The only 404s that can do you harm are the ones triggered by your own internal links, because they mean you have link-juice leaks. Fix any if you have them.
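Finding those internal leaks can be scripted. A rough sketch using only the Python standard library: pull the anchors out of a page's HTML, keep the ones pointing back at your own site, then check each one's status code. The `site_root` and sample URLs are placeholders, not anything from the thread:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_internal_links(html, site_root="https://www.example.com"):
    """Return links that point back at our own site (relative or absolute)."""
    parser = LinkCollector()
    parser.feed(html)
    return [h for h in parser.links
            if h.startswith("/") or h.startswith(site_root)]

# Each internal link would then be requested and any 404 flagged, e.g.:
#
#   import urllib.request, urllib.error
#   from urllib.parse import urljoin
#   try:
#       urllib.request.urlopen(urljoin(site_root, link))
#   except urllib.error.HTTPError as e:
#       if e.code == 404:
#           print("broken internal link:", link)
```

Crawling every page of the site this way and collecting the 404s gives you the full list of internal leaks to fix.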
-
Edit the backlinks you were getting to the 404 pages and point them at the new pages. Another option is to 301 redirect the old URL to the new page; the PageRank/link profile will be passed to the new page.
-
Well, the correct / best thing to do would be to try and get all of those links edited and pointed at live pages. That said, if you don't know who posted the links, or have no way to get in touch with those who do, it can be very difficult to achieve. Still, link reclamation can be a great way to pick up links, seeing as they are already pointing to your site.
-Andy
-
If you feel the links are harming you or your SEO efforts in any way, you can go ahead and disavow them. However, the disavow tool does not remove the links, so it does not help with 404 errors; it just tells Google to ignore them when it comes to your rankings.
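For reference, if you did decide to disavow, the file Google accepts is plain text with one entry per line, either a full URL or a whole domain. The domains and URLs below are made up for illustration:

```text
# Disavow file example (hypothetical entries)
# Lines beginning with "#" are comments.
domain:spammy-links.example
http://forum.example/thread/123-old-deal
```

The file is uploaded through the disavow links tool in Search Console, one file per property.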
Hope this helps!