GWT 404 best practices
-
I'm getting back lots of 404 errors for old websites that are linking back to my current website. If the linking site's content/anchor text has no relevancy to my current content, is it still best practice to redirect to the current home page, contact the webmaster to remove the link, or is there another approach you'd suggest?
I'm not exactly sure whether redirecting would look spammy, since the content is irrelevant.
Thanks for your help!
-
You're most welcome, glad you found it useful!
-
Thanks Marty, exactly what I was looking for.
-
If the content is irrelevant, I wouldn't redirect it to specific pages on your site. If it is still generally relevant to the business as a whole, you could 301 redirect it to the root URL. If not, then at the very least, since some referring visitors might still become customers, you could change your 404 page to offer calls to action or shopping options for the folks who land on it.
My 2 cents.
I wouldn't worry, though, about removing the links unless they're very poor quality and make up a high percentage compared to your "good" links.
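To make that split concrete, here is a minimal sketch of how it might look in a small Python/Flask app; the path list and the 404 template name are purely hypothetical, and the same logic is often expressed in .htaccess or nginx rules rather than application code:

```python
from flask import Flask, redirect, render_template, request

app = Flask(__name__)

# Hypothetical examples of retired URLs that are still broadly relevant to the business.
STILL_RELEVANT_LEGACY_PATHS = {
    "/old-wedding-bands",
    "/legacy-contact",
}

@app.before_request
def redirect_relevant_legacy_urls():
    # 301 only the legacy URLs that still make sense for the site as a whole.
    if request.path in STILL_RELEVANT_LEGACY_PATHS:
        return redirect("/", code=301)
    return None  # everything else falls through (and 404s if no route matches)

@app.errorhandler(404)
def helpful_not_found(error):
    # Keep the correct 404 status, but render a page with calls to action
    # (popular pages, search, contact) so referring visitors can still convert.
    return render_template("404_with_cta.html"), 404
```

The key point mirrors the answer above: only legacy URLs that are still relevant get a 301 to the root, while everything else keeps returning a genuine 404 status, just with a more useful page behind it.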
Related Questions
-
What is the best strategy to SEO Discontinued Products on Ecommerce Sites?
RebelsMarket.com is a marketplace for alternative fashion. We have hundreds of sellers who have listed thousands of products. Over 90% of the items do not generate any sales, and about 40% of the products have been on the website for over 3 years. We want to clean up the catalog and remove all the listings older than 2 years that do not generate any sales. What is the best practice for removing thousands of listings from an ecommerce site? Do we 404 these products and show similar items? Your help and thoughts are much appreciated.
White Hat / Black Hat SEO | | JimJ3 -
What is the best way to eliminate ghost traffic from Google Analytics?
Hey Mozzers, I just wanted to see how you all deal with eliminating ghost traffic sources from Google Analytics. I tried setting up a RegEx 'include' list before, but it seemed as though I was blocking potential traffic sources when I did (I'm probably missing something here). Anyway, I'm interested to read how you all have dealt with this issue in the past - thanks for reading! (A quick sketch of sanity-checking such an include pattern follows this question.)
White Hat / Black Hat SEO | | maxcarnage0 -
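As a rough illustration of the hostname 'include' approach mentioned above, a pattern can be sanity-checked against the hostnames GA actually reports before being applied as a filter, which helps avoid accidentally excluding legitimate traffic. A minimal sketch, with entirely hypothetical hostnames:

```python
import re

# Hypothetical "valid hostname" include pattern: every hostname that can
# legitimately load your tracking code (main site, subdomains, checkout, etc.).
VALID_HOSTNAMES = re.compile(r"^(www\.)?example\.com$|^checkout\.example\.com$")

# Hostnames as they might appear in the GA hostname report: real ones plus
# typical ghost/spam referrers, and a proxy hostname some sites want to keep.
sample_hostnames = [
    "www.example.com",
    "checkout.example.com",
    "free-traffic-now.xyz",             # ghost/spam hostname
    "translate.googleusercontent.com",  # legitimate proxy traffic that an overly strict pattern excludes
]

for hostname in sample_hostnames:
    verdict = "keep" if VALID_HOSTNAMES.match(hostname) else "filter out"
    print(f"{hostname}: {verdict}")
```

Running a check like this against a few weeks of hostname data makes it easier to spot legitimate sources the include filter would otherwise drop.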
Sudden influx of 404s affecting SERPs?
Hi Mozzers, We've recently updated a site of ours that really should be doing much better than it currently is. It's got a good backlink profile (and some spammy links recently removed), has age on its side and has been SEO'ed a tremendous amount (think deep-level work, schema.org markup, site speed and much, much more). Because of this, we assumed thin, spammy content was the issue and removed those pages, creating new, content-rich pages in the meantime. E.g. we removed a link-wheel page, https://www.google.co.uk/search?q=site%3Asuperted.com%2Fpopular-searches, which as you can see had a **lot** of results (circa 138,000), and added relevant pages for each of our entertainment 'categories':
http://www.superted.com/category.php/bands-musicians - this page has some historical value, so the Mozbar shows some Page Authority here.
http://www.superted.com/profiles.php/wedding-bands - this is an example of a page linking from the above page.
These are brand new URLs designed to provide relevant content. The old link-wheel pages contained pure links (usually 50+ on every page) and no textual content, yet were still driving small amounts of traffic to our site. The new pages contain quality, relevant content (i.e. our list of Wedding Bands - what else would a searcher be looking for?) but some haven't been indexed/ranked yet. So with this in mind I have a few questions:
**How do we drive traffic to these new pages?** We've started to create industry-relevant links through our own members to the top-level pages (http://www.superted.com/category.php/bands-musicians). The link profile here _should_ flow to some degree to the lower-level pages, right? We've got almost 500 'sub-categories', and getting quality links to all of these is just unrealistic in the short term.
**How long until we should be indexed?** We've seen an 800% drop in organic search traffic since removing our spammy link-wheel pages. This is to be expected to a degree, as these were the only real pages driving traffic. However, we saw this drop (and got rid of the pages) almost exactly a month ago; surely we should be re-indexed and re-algo'ed by now?!
**Are we still being algorithmically penalised?** The old spammy pages are still indexed in Google (138,000 of them!) despite returning 404s for a month. When will these drop out of the rankings? If Google believes they still exist and we were indeed being punished for them, then it makes sense as to why we're still not ranking, but how do we get rid of them? I've tried submitting a manual removal of the URLs via WMT, but to no avail. Should I 410 the pages? (See the sketch after this question.)
**Have I been too hasty?** I removed the spammy pages in case they were affecting us via a penalty. There would also have been some potential for duplicate content between the old and the new pages: _popular-searches.php/event-services/videographer_ may have clashed with _profiles.php/videographer_, for example. Should I have kept these pages whilst we waited for the new pages to be indexed?
Any help would be extremely appreciated; I'm pulling my hair out that after following 'guidelines', we seem to have been punished in some way for it. I assumed we just needed to give Google time to re-index, but a month should surely be enough for a site with historical SEO value such as ours? If anyone has any clues about what might be happening here, I'd be more than happy to pay for a genuine expert to take a look. If anyone has any potential ideas, I'd love to reward you with a 'good answer'. Many, many thanks in advance. Ryan.
White Hat / Black Hat SEO | | ChimplyWebGroup0 -
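On the "should I 410 the pages?" point above, a 410 is simply a different status code served for the retired section. Flask is used here purely for illustration (the site in the question is PHP, so the equivalent would live in the retired script or the server config); the route prefix comes from the URLs mentioned in the question. A minimal sketch:

```python
from flask import Flask, abort

app = Flask(__name__)

# The retired link-wheel section lived under /popular-searches.php (per the question).
@app.route("/popular-searches.php", defaults={"subpath": ""})
@app.route("/popular-searches.php/<path:subpath>")
def retired_link_wheel(subpath):
    # 410 Gone signals a deliberate, permanent removal; crawlers generally drop
    # such URLs from the index faster than they drop plain 404s.
    abort(410)
```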
Pagination for Search Results Pages: Noindex/Follow, Rel=Canonical, Ajax Best Option?
I have a site with paginated search result pages. What I've done is noindex/follow them, and I've placed the rel=canonical tag on page 2, page 3, page 4, etc., pointing back to the main/first search result page. These paginated search result pages aren't visible to the user (since I'm not technically selling products, just providing different images to the user), and I've added a text link at the bottom of the first/main search result page that says "click here to load more"; once clicked, it automatically lists more images on the page (Ajax). Is this a proper strategy? Also, for a site that does sell products, would simply noindexing/following the paginated search result pages and placing the canonical tag on them pointing back to the main search result page suffice? I would love feedback on whether this is a proper method/strategy to keep Google happy. Side question - when the robots go through a page that is noindexed/followed, are they taking into consideration the text on those pages, page titles, meta tags, etc., or are they only looking at the actual links within that page and passing link juice through them? (A small sketch of the head-tag setup described above follows this question.)
White Hat / Black Hat SEO | | WebServiceConsulting.com0 -
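For reference, the head-tag setup described in the question above (noindex,follow on page 2 onwards, with a canonical pointing back at page 1) boils down to two tags per paginated page. A minimal, framework-agnostic sketch in Python; the URLs are hypothetical, and this only illustrates the setup being asked about, not whether it is the right strategy:

```python
def head_tags_for_search_results(first_page_url: str, page: int) -> str:
    """Return the extra <head> tags described in the question: pages 2+ get
    noindex,follow plus a canonical pointing back at the first page.
    Illustrative only - whether canonicalising page 2+ to page 1 is the right
    call is exactly what the question is asking."""
    if page <= 1:
        return ""  # the main/first results page stays indexable as normal
    return "\n".join([
        '<meta name="robots" content="noindex, follow">',
        f'<link rel="canonical" href="{first_page_url}">',
    ])

# Hypothetical example: tags for page 3 of a results set.
print(head_tags_for_search_results("https://www.example.com/search-results", page=3))
```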
301, 404 or 410? What is the best practice?
Hi, I'm currently working on a project to correct some really bad practices from years of different SEOs. Basically, they had made around 1,500 pages for delivery counties and towns, only changing 3 words on every page. Now, apart from the duplicate content issues, this has really hammered the site with the latest round of Panda updates. I've pulled the pages, but I'm torn on how best to fix this. The pages won't ever be used again, so I'm thinking a 410 code would be best, but after reading another post (http://moz.com/community/q/server-redirect-query) I'm not sure if I should just let them go to 404s if anyone ever finds them. Incidentally, I'm disavowing over 1,100 root domains, so it's extremely unlikely anyone will find links out there.
White Hat / Black Hat SEO | | eminent1 -
Need advice on best strategy for removing these bad links.
Here's the scenario... We recently took on a new client whose previous SEO company had partaken in some dodgy link building tactics. They appear to have done some blog comment spam, very poorly. The situation we are now in is this: we have a site with an internal page deemed more important than the homepage (the homepage has 60 linking root domains and the internal page 879). It looks as though the previous SEO company submitted a disavow request; there's a message in Webmaster Tools from a few weeks back saying it had been received, but no further correspondence. I have doubts as to whether this disavow request was done correctly. Plus, I'm not sure that Google has issued the site a warning yet, as they are ranking position one for the keyword on the internal page. Our clients want us to handle this in the correct manner, whether that be to simply ignore it and wait for Google to send a warning about the links, remove the offending internal page and leave a 404, or try to disavow the links that Google doesn't know about yet from 800+ websites. Suggestions for the best practice for dealing with this situation? Any advice is much appreciated. Thanks, Hayley. (A small sketch of regenerating a disavow file follows this question.)
White Hat / Black Hat SEO | | Silkstream0 -
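If the existing disavow file turns out to have been done badly, a replacement is just a plain text file uploaded through Google's disavow tool, using `domain:` lines for whole referring domains. The domains below are hypothetical placeholders; a minimal sketch of generating the file from a manually reviewed list:

```python
# Hypothetical domains judged spammy after a manual review of the link profile.
spammy_domains = [
    "spam-directory-1.example",
    "spam-blog-network.example",
]

lines = ["# Disavow file regenerated after manual link review"]
lines += [f"domain:{domain}" for domain in spammy_domains]

# disavow.txt is then uploaded via the disavow links tool in Search Console.
with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```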
What's the best way to set up 301's from an old off-site subdomain to a new off-site subdomain?
We are moving our online store to a new service and we need to create 301s for all of the old product URLs. Given that the old store was hosted off-site, what is the best way to handle the 301 redirects? Thanks! (A small sketch of a per-URL redirect map follows this question.)
White Hat / Black Hat SEO | | VermilionDesignInteractive0 -
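Assuming the old subdomain's DNS can be pointed at a server you control once the store moves, the per-product 301s can be served from a small redirect map; the paths and target URLs below are hypothetical, and the same map can usually be loaded into the new platform's own redirect feature instead. A minimal sketch:

```python
from flask import Flask, abort, redirect

app = Flask(__name__)

# Hypothetical mapping from old off-site store paths to new store URLs.
PRODUCT_REDIRECTS = {
    "/products/old-widget": "https://shop.example.com/widgets/new-widget",
    "/products/old-gadget": "https://shop.example.com/gadgets/new-gadget",
}

@app.route("/<path:old_path>")
def legacy_product_redirect(old_path):
    target = PRODUCT_REDIRECTS.get("/" + old_path)
    if target:
        # 301 so search engines transfer the old product URL's equity to the new page.
        return redirect(target, code=301)
    abort(404)
```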
Which of these elements are good / bad link building practices?
Hi, I need some help. I recently got some help with an SEO project from a contractor. He did 50 directory submissions and 50 article submissions. I got good results, going up about 20 places (still a long way to the first page!) on google.co.uk for a tough keyword. Since this project I've learned article marketing is not cool, so I am wondering what I should do next. The contractor has proposed a new, bigger project consisting of the elements listed below. I don't know which of these elements are OK and which aren't. If they are not OK, are they: 1) a waste of time, or 2) something I could get penalized for? Let me know what you think. Thanks, Andrew
100 ARTICLE SUBMISSIONS [APPROVED ARTICLES] -> 1 article submitted to 100 article directories
50 PRESS RELEASE SUBMISSIONS [APPROVED & SCREENSHOTS] -> 1 PR written & submitted to each of the top 50 PR distribution sites
150 PRIVATE BLOG SUBMISSIONS [APPROVED ARTICLES] -> 1 article submitted to 150 private blogs
100 WEBSITE DIRECTORY SUBMISSIONS -> 1 URL (home page) submitted to 100 top free web directories
50 SOCIAL BOOKMARKING [CONFIRMED LINKS] -> 1 URL of the site submitted to the top 50 social bookmarking websites
40 PROFILE BACK-LINKS [CONFIRMED LINKS] -> 1-3 URLs of the site submitted to create 40 profile pages
50 SEARCH ENGINES -> submission to all the major search engines
20 NEWS WEBSITES -> ping all links from reports to news websites
White Hat / Black Hat SEO | | fleurya0