Site under attack from Android SEO bots - expert help needed
-
For the last 25 days, we have been facing a strange attack on our site.
We are getting 10x our normal mobile traffic - all from Android devices, all searching specifically for our name. We are sure this is not authentic traffic, as the visits come in from organic search and bounce immediately. Initially we thought this was a DDoS attack, but that does not seem to be the case.
It looks like someone is trying to damage our Google reputation by performing repeated searches and bouncing off.
Has anyone else faced a similar issue before? What can be done to mitigate the impact on the site?
(FYI - we get ~2M visits per month, 80% from Google organic search.) Any help would be highly appreciated.
-
Just as EGOL describes it.
If you're on Amazon AWS, you can use CloudFront as a CDN. But you can also study the source of the traffic: it could be coming from one country, one IP range, or one user-agent. There should be some kind of pattern, and you should investigate it.
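As a starting point for that investigation, here is a minimal sketch in Python that counts hits per IP and per user-agent from a standard combined-format access log. The regex, sample lines, and the IP in them are illustrative, not taken from the poster's actual logs:

```python
import re
from collections import Counter

# Matches the standard "combined" access-log format:
# IP - - [time] "request" status bytes "referrer" "user-agent"
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \S+ "([^"]*)" "([^"]*)"')

def top_offenders(lines, n=10):
    """Count hits per IP and per user-agent; bot traffic usually clusters hard."""
    ips, agents = Counter(), Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if m:
            ip, _referrer, agent = m.groups()
            ips[ip] += 1
            agents[agent] += 1
    return ips.most_common(n), agents.most_common(n)

# Example with two fabricated log lines:
sample = [
    '203.0.113.7 - - [01/Jan/2024:00:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "Dalvik/2.1.0 (Linux; U; Android 9)"',
    '203.0.113.7 - - [01/Jan/2024:00:00:02 +0000] "GET / HTTP/1.1" 200 512 "-" "Dalvik/2.1.0 (Linux; U; Android 9)"',
]
top_ips, top_agents = top_offenders(sample)
print(top_ips)  # [('203.0.113.7', 2)]
print(top_agents)
```

In a real incident you would feed it the whole log file; the attacking IP range or user-agent should stand out in the first few entries.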
Then you just need to make a rule to block that traffic, or redirect it to a static "hello world" page.
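If you run the application yourself, that rule can also live at the application layer rather than in the CDN. A sketch of a WSGI middleware that answers a flagged user-agent with a static page instead of the real site (the blocked substring is a hypothetical pattern you would pull from your own logs):

```python
BLOCKED_UA_SUBSTRINGS = ("Dalvik/",)  # hypothetical pattern found in your logs

def block_bad_agents(app):
    """WSGI middleware: answer flagged user-agents with a cheap static page."""
    def wrapper(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "")
        if any(s in ua for s in BLOCKED_UA_SUBSTRINGS):
            body = b"hello world"
            start_response("200 OK", [("Content-Type", "text/plain"),
                                      ("Content-Length", str(len(body)))])
            return [body]
        return app(environ, start_response)  # normal visitors pass through
    return wrapper
```

This keeps the expensive pages from rendering for the bots while everything else works as before; a CDN-level rule is still cheaper, since the request never reaches your server at all.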
I was also the victim of such traffic, but in my case it came from humans trying to deplete an AdWords daily budget. Once the budget was used up, the ads stopped showing; after a few hours Google recalculated the clicks and returned some funds, the ads showed again, they clicked them, the budget ran out again... and so on.
-
By pointing your DNS at CF, your server is no longer exposed directly. All traffic is routed through one of CF's data centers, and there are over 100 of them distributed around the world.
Also, in the CF settings, you want to "challenge" the visitors from problem countries. This will give them a captcha to complete. When they complete that captcha one time, you can then give them long term access without the challenge. CF will progressively become better at filtering the bots and allowing more trusted visitors in without a challenge.
-
Thanks for your help - this works to a large degree.
I've hit a new challenge, though: our AWS servers sit in one of the countries sending this traffic, and we have multiple servers talking to each other to enable login and other actions on the site.
While I have blocked all the other countries, blocking the country where our AWS servers sit breaks login. Trying to figure this out!
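One way around this, assuming the block is implemented as a Cloudflare firewall rule, is to exempt your own server IPs from the country challenge inside the rule's expression. A hypothetical rule with action "Challenge" - both the country code and the CIDR range are placeholders you would replace with your own values:

```
(ip.geoip.country eq "XX" and not ip.src in {203.0.113.0/24})
```

Alternatively, an explicit allow rule for your server IPs should take precedence over a country-wide challenge, so server-to-server login calls go through untouched while human visitors from that country still see the captcha.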
-
If you don't use the Firewall, Cloudflare in your situation will have almost no effect.
We used our analytics to determine the countries the traffic was coming from, then went into the CF firewall.
Click the blue Help link for each tool to decide upon the settings that you want to try.
Here is what we used....
Security Level... Medium
Challenge Passage... one day
Access rules.... country name, challenge, this website
Impact of the above.... Many bots already recognized by CF will be blocked. The access rules will present each visitor from those countries with a form similar to a captcha; they must pass it to get in.
After you turn this on, watch your short term stats. You should see an increase in blocking.
We ran the above for a few weeks without any obvious SEO impact, then switched our DNS back to normal, moving away from CF.... but we kept the $20/month account and our settings in place, since CF was time-consuming to set up.
-
This looks very similar to what we are seeing. We moved to Cloudflare as well, but stayed on the free account with "I'm Under Attack" mode, which should force visits to verify.
Would it be possible for you to share your Cloudflare settings? Did you use their firewall as well? And did you see any SEO impact, by any chance?
-
One morning a few months ago, we saw lots of mobile phone traffic building. All of it was hitting our homepage, which is very resource-intensive, and every visit generated exactly one page view. All of the traffic was coming from a few countries in Asia and Africa, with no referrer. It looked like a DDoS attack.
We went to Cloudflare, got a $20/month account, switched our DNS to CF, and forced untrusted visits from those countries to verify before allowing entry. That squeezed the traffic down to almost nothing within a few hours. We left CF running for a few weeks, and the rogue traffic disappeared.
Now we keep CF ready to go with all settings in place. We can turn it on in two minutes and have the shield up as DNS propagates.