Regular Expressions for Filtering BOT Traffic?
-
I've set up a filter to remove bot traffic from Analytics. I relied on regular expressions posted in an article, and they eliminate what appear to be most of the bots.
However, there are other bots I would like to filter, but I'm having a hard time determining the regular expressions for them.
How do I determine what the regular expression is for additional bots so I can apply them to the filter?
I read an Analytics "how to," but it's over my head, and I'm hoping for some "dumbed down" guidance.
-
No problem. Feel free to reach out if you have any other RegEx-related questions.
Regards,
Chris
-
I will definitely do that for Rackspace bots, Chris.
Thank you for taking the time to walk me through this and tweak my filter.
I'll give the site you posted a visit.
-
If you copy and paste my RegEx, it will filter out the rackspace bots. If you want to learn more about Regular Expressions, here is a site that explains them very well, though it may not be quite kindergarten speak.
-
Crap.
Well, I guess the vernacular is what I need to know.
Knowing what to put where is the trick, isn't it? Is there a dummies guide somewhere that spells this out in kindergarten speak?
I could really see myself botching this filtering business.
-
Not unless there's a . after the word servers in the name. The \ is there to escape the . at the end of stumbleupon inc.
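To see what the escape changes, here is a minimal Python sketch (the sample strings are made up, not from any report): an unescaped . matches any single character, while \. only matches a literal dot.

import re

# Unescaped: the . matches ANY single character, so "stumbleupon incx" also matches.
loose = re.compile(r"^stumbleupon inc.$")
# Escaped: \. only matches a literal dot.
strict = re.compile(r"^stumbleupon inc\.$")

for name in ["stumbleupon inc.", "stumbleupon incx"]:
    print(name, "| unescaped:", bool(loose.match(name)), "| escaped:", bool(strict.match(name)))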
-
Does it need the . before the )?
-
Ok, try this:
^(microsoft corp|inktomi corporation|yahoo! inc.|google inc.|stumbleupon inc.|rackspace cloud servers)$|gomez
I just added rackspace as another match; it should work if the name is exactly right.
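If you want to sanity-check the pattern before saving the filter, here is a small Python sketch that runs it against a few sample service-provider names (the sample names are made up, and re.IGNORECASE is only a convenience for this test, not a statement about how the filter itself handles case):

import re

# The filter pattern from above.
pattern = re.compile(
    r"^(microsoft corp|inktomi corporation|yahoo! inc.|google inc.|"
    r"stumbleupon inc.|rackspace cloud servers)$|gomez",
    re.IGNORECASE,
)

samples = [
    "rackspace cloud servers",  # exact match on the new entry -> filtered
    "gomez networks",           # "gomez" matches anywhere in the name -> filtered
    "comcast cable",            # ordinary visitor ISP -> kept
]

for isp in samples:
    print(isp, "-> filtered" if pattern.search(isp) else "-> kept")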
Hope this helps,
Chris
-
Agreed! That's why I suggest using it in combination with the variables you mentioned above.
-
rackspace cloud servers
Maybe my problem is I'm not looking in the right place.
I'm in Audience > Technology > Network, and the column shows "Service Provider."
-
How is it titled in the ISP report exactly?
-
For example,
Since I implemented the filter four days ago, rackspace cloud servers has visited my site 848 times, viewed 1 page each time, spent 0 seconds on the page, and bounced 100% of the time.
What is the regular expression for rackspace?
-
Time on page can be a tricky one because sometimes actual visits can record 00:00:00 due to the way it is measured. I'd recommend using other factors like the ones I mentioned above.
-
"...a combination of operating system, location, and some other factors can do the trick."
Yep, combined with those, look for "Avg. Time on Page = 00:00:00"
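As a rough illustration of combining those signals, here is a hypothetical Python sketch over an exported Network report; the file name and column labels are assumptions about the export, not anything defined in this thread:

import pandas as pd

# Hypothetical export of the Audience > Technology > Network report.
report = pd.read_csv("service_provider_report.csv")

# Flag rows that look like bot traffic: one page per session,
# zero time on page, and a 100% bounce rate.
suspicious = report[
    (report["Pages / Session"] == 1)
    & (report["Avg. Time on Page"] == "00:00:00")
    & (report["Bounce Rate"] == "100.00%")
]

print(suspicious["Service Provider"].tolist())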
-
Ok, can you provide some information on the bots that are getting through and that you want to sort out? If they can be filtered by ISP organization like the ones in your current RegEx, you can simply add them to the list: (microsoft corp| ... ... |stumbleupon inc.|ispnamefromyourbots|ispname2|etc.)$|gomez
Otherwise, you might need to get creative and find another way to isolate them (a combination of operating system, location, and some other factors can do the trick). When adding to the list, make sure to escape special characters like . or / by putting a \ before them, or else your RegEx won't behave the way you expect.
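If keeping track of which characters need a \ gets tedious, here is a rough Python sketch that flags the entries that would need escaping before you paste them into the filter (the last ISP name is a made-up placeholder, not one from this thread):

# ISP names copied from the Service Provider column.
isps = [
    "stumbleupon inc.",
    "rackspace cloud servers",
    "some-isp.example/cloud",  # made-up placeholder
]

# Characters the filter's regular expression treats as special (plus /, which
# this thread suggests escaping as well).
specials = set(".^$*+?()[]{}|\\/")

for name in isps:
    escaped = "".join("\\" + ch if ch in specials else ch for ch in name)
    if escaped != name:
        print(name, "-> write it as ->", escaped)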
-
Sure. Here's the post for filtering the bots.
Here's the RegEx posted: ^(microsoft corp|inktomi corporation|yahoo! inc.|google inc.|stumbleupon inc.)$|gomez
-
If you give me an idea of how you are isolating the bots, I might be able to help come up with a RegEx for you. What is the RegEx you have in place to sort out the other bots?
Regards,
Chris