Google found bad links: delete them or 301 redirect?
-
We went into our Google account and saw about 70 bad links that it found on our site.
What's the best thing to do, SEO-wise:
-
should we go into the pages that have the bad links and delete them from the HTML code, or
-
redirect them in our .htaccess file?
-
-
Hi Derek,
I think kjay, Harald & Will have all pretty much covered removing (or correcting) links that are called from within the pages on your site, and using 301 redirects to capture traffic from external links.
I would also make sure that you have a well-designed custom 404 page set up for your site, so that any link not covered by the work you have done on the above still gives the visitor an opportunity to reach the content they are looking for.
Your custom 404 page should contain some well-written text that invites the visitor to explore the site for other content that will help them, plus a menu that lets them click through to other parts of your site.
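For what it's worth, on an Apache server the custom 404 page can be wired up with a single directive in the same .htaccess file mentioned above (the path here is just a placeholder; point it at whatever page you build):

```apache
# Serve your branded 404 page for any URL that no longer exists.
# /custom-404.html is an example path, not a real one on your site.
ErrorDocument 404 /custom-404.html
```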
Hope that helps,
Sha
-
Hi Derek, I assume by "bad links" you mean pages on your domain that are returning a 404 error?
If this is the case, there are normally two things I do in these situations.
-
Once I've found the list of 404s, I'll look at the page authority of each one in Open Site Explorer (the URL will still show results even if the content is no longer there).
-
If any of these 404 pages are being linked to from EXTERNAL pages then I'll 301 those pages to the most similar other page on the domain.
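In .htaccess, that per-page 301 can be done with a one-line `Redirect` from Apache's mod_alias (the two paths below are made-up examples, not URLs from your site):

```apache
# 301 a deleted page that still has external links pointing at it
# to the closest surviving page on the domain (example paths only).
Redirect 301 /old-blue-widgets.htm /blue-widgets.htm
```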
-
Run Xenu Link Sleuth (it's free, Google it if you don't have it) over the domain and find every link to these 404 pages.
-
Manually go through and remove these dud links.
Hope that helps!
301 redirects should always be used if you delete a page that's built up links from external pages (or if you've merged the content into a larger page).
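If you'd rather script the link-finding step than use a desktop crawler like Xenu, here's a minimal sketch in Python (standard library only; the function names are my own, not from any tool mentioned above) that pulls the internal links out of a page's HTML so you can then check each one's status code:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collects the href value of every anchor tag in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def internal_links(html, base_url):
    """Return absolute URLs for links that stay on the same host as base_url."""
    parser = LinkCollector()
    parser.feed(html)
    host = urlparse(base_url).netloc
    absolute = (urljoin(base_url, href) for href in parser.links)
    return [url for url in absolute if urlparse(url).netloc == host]
```

You would then fetch each returned URL and flag any that respond with a 404, which gives you the same "dud link" list the crawler step above produces.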
-
-
Hi DerekM88, broken/bad links always hurt a website. They can be very destructive and have adverse effects: when a page has been removed, visitors get a 404 error message, which creates a negative user experience, and they may never visit the website again. All of this reduces the reputation of the website from an SEO point of view.
So we have to take certain measures to fix these bad links. In my personal opinion you should delete the bad links, because when search engine spiders crawl a website and suddenly find bad links, they will not crawl them and that leaves a negative signal as well. Below are some measures with which you can easily fix these errors.
I hope that your query has been solved.
-
If you have links to dead pages on your site, remove them from the HTML.
301 redirects are used when you are moving pages / changing URLs.