Best Google Practice for a Hacked Site: Shift Servers/IP or Disavow?
-
Hi -
Over the past few months, I've identified multiple sites that are linking into my site and creating fake pages (below is an example, and there are 500K+ similar links from various sites). I've attempted to contact the hosting companies, etc., with little success. Was wondering what my best course of action might be at this point: A) switch servers (or IP address), B) use the Google Disavow tool, or C) both?
example: { http://aryafar.com/crossings/200-krsn-team-part19.html }
Thanks!!
-
A few things... Make sure you have a sitemap that is always up to date and submitted to search engines - this will encourage them to crawl your content first and recognise it as belonging to your domain.
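For reference, a sitemap is just an XML file following the sitemaps.org protocol; a minimal one looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

You can submit it in Google Search Console or reference it from robots.txt with a `Sitemap:` line.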
In addition, put internal links in your content to other parts of your site. If your content gets scraped, it will probably be scraped with those links intact, so anyone actually wanting the real content can click through to you.
If there are thousands of links coming to your site from the same domain, disavow the base domain and also report that URL for spam (it's your copyright being infringed). In fact, if you notice a small site scraping you, do that after you've tried to contact them.
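Google's disavow file is a plain-text list, one entry per line, using either a `domain:` prefix (to disavow everything from that domain) or a full URL; lines starting with `#` are comments. A minimal sketch (the domains here are placeholders, not real scrapers):

```text
# Scraper domains - contacted hosts, no response
domain:scraper-example.com
domain:adult-spam-example.net
# A single URL rather than a whole domain
http://another-example.com/copied-page.html
```

Upload this via the Disavow Links tool in Search Console; for scraped-at-scale cases like yours, `domain:` entries are far more practical than listing 500K individual URLs.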
If this still doesn't stop them, look at your server logs, see where their crawlers are coming from, and block their IPs.
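As a quick way to spot the worst offenders in an access log, you can tally requests per client IP. A minimal sketch, assuming the common/combined Apache log format where the client IP is the first whitespace-separated field (the sample lines and IPs are made up):

```python
from collections import Counter

def top_client_ips(log_lines, n=5):
    """Count requests per client IP and return the n busiest.

    Assumes the IP is the first whitespace-separated field,
    as in the common/combined Apache log formats.
    """
    counts = Counter(line.split()[0] for line in log_lines if line.strip())
    return counts.most_common(n)

sample = [
    '203.0.113.9 - - [10/Oct/2023] "GET /page HTTP/1.1" 200 512',
    '203.0.113.9 - - [10/Oct/2023] "GET /other HTTP/1.1" 200 512',
    '198.51.100.4 - - [10/Oct/2023] "GET /page HTTP/1.1" 200 512',
]
print(top_client_ips(sample, n=2))
```

Any IP making an outsized number of requests is a candidate for a firewall or `.htaccess` deny rule.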
On one of my old sites I blocked the whole of China at one point, because it was constantly being barraged by scrapers and people trying to guess account passwords.
Hope that helps
-
OK, so they're scraping much of your site and then adding in their own garbage.
I wouldn't worry about the occasional instance of this, unless you do see a penalty. For the more egregious ones, where they're building a ton of links, I'd throw their domain in your disavow list.
-
Hi Michael -
Sorry for the confusion... My site is HHisland.com, and sites like the example I posted are linking in and creating false pages. Most are adult sites, etc.
Thanks again -
Billy
-
Hi Billy,
I'm not sure exactly what's going on here. Is it YOUR site that's getting hacked, or is it other sites getting hacked and linking to you, and you're worried that the "bad neighborhood" links will hurt you?
Michael.