Spam pages / content created due to a hack. 404 cleanup.
-
A hosting company's server was hacked, and one of our customers' sites was injected with 7,000+ pages of fake, spammy promotional content.
The server has since been patched and the spammy content removed.
Reviewing Google Webmaster Tools, all the hacked pages now show up as 404s, and we have seen a severe drop in impressions, rankings, and traffic. GWT also reports 'Some manual actions apply to specific pages, sections, or links'...
What do you recommend for:
- Cleaning up the 404s to the spammy pages? (I am not sure redirecting them to the home page is the right thing to do - is it?)
- Cleaning up links that were created off-site pointing to the spam pages?
- Getting rank back // what would you do in addition to the above?
-
You want those old spam pages to return a 410 (Gone) status code. I'm not 100% sure how you will achieve this on your particular setup, though, so I'd speak to your hosting company and/or web developer.
A 404 means the page is 'not found', which isn't the same as a 410: a 410 tells the search engines that the page is gone permanently, so they won't keep looking for it.
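If the site happens to run on Apache, one common way to serve 410s is a rewrite rule in the site's `.htaccess` file. This is only a sketch: the `spam-pages/` pattern below is a hypothetical placeholder, and you would replace it with whatever URL pattern the injected pages actually used (check your hosting setup first, as mod_rewrite may not be enabled).

```apache
# Return "410 Gone" for the injected spam URLs.
# "spam-pages/" is a hypothetical placeholder - substitute the real
# path or pattern the hacked pages were created under.
RewriteEngine On
RewriteRule ^spam-pages/ - [G,L]

# Alternatively, known bad URLs can be listed one by one via mod_alias:
# Redirect gone /some-injected-page.html
```

The `[G]` flag in mod_rewrite forces a 410 response, which is what tells crawlers the pages are gone for good rather than temporarily missing.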
Hope this helps!
Amelia