Regular Expressions for Filtering BOT Traffic?
-
I've set up a filter to remove bot traffic from Analytics. I relied on regular expressions posted in an article, and they eliminate what appear to be most of them.
However, there are other bots I would like to filter, but I'm having a hard time determining the regular expressions for them.
How do I determine the regular expression for additional bots so I can apply them to the filter?
I read an Analytics "how to," but it's over my head, and I'm hoping for some "dumbed down" guidance.
-
No problem, feel free to reach out if you have any other RegEx-related questions.
Regards,
Chris
-
I will definitely do that for Rackspace bots, Chris.
Thank you for taking the time to walk me through this and tweak my filter.
I'll give the site you posted a visit.
-
If you copy and paste my RegEx, it will filter out the rackspace bots. If you want to learn more about Regular Expressions, here is a site that explains them very well, though it may not be quite kindergarten speak.
-
Crap.
Well, I guess the vernacular is what I need to know.
Knowing what to put where is the trick, isn't it? Is there a dummies guide somewhere that spells this out in kindergarten speak?
I could really see myself botching this filtering business.
-
Not unless there's a . after the word servers in the name. The \ is escaping the . at the end of stumbleupon inc\. so it matches a literal period.
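Here's a tiny Python sketch of the difference, if it helps (Python is only for illustration; the filter itself lives in Analytics):

```python
import re

# An unescaped "." matches ANY single character, so "inc." would also
# match strings like "incX". Escaped as "\.", it matches only a period.
print(bool(re.fullmatch(r"stumbleupon inc.", "stumbleupon incX")))   # True - too loose
print(bool(re.fullmatch(r"stumbleupon inc\.", "stumbleupon incX")))  # False
print(bool(re.fullmatch(r"stumbleupon inc\.", "stumbleupon inc.")))  # True
```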
-
Does it need the \. before the )?
-
Ok, try this:
^(microsoft corp|inktomi corporation|yahoo! inc\.|google inc\.|stumbleupon inc\.|rackspace cloud servers)$|gomez
Just added rackspace as another match; it should work if the name is exactly right.
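If you want to sanity-check the pattern before saving the filter, here's a minimal Python sketch (the test names below are just examples I made up):

```python
import re

# The exact pattern from the filter, tried against some sample
# Service Provider values as they might appear in the Network report.
pattern = re.compile(
    r"^(microsoft corp|inktomi corporation|yahoo! inc\.|google inc\."
    r"|stumbleupon inc\.|rackspace cloud servers)$|gomez"
)

samples = [
    "rackspace cloud servers",    # exact match -> filtered
    "rackspace cloud servers.",   # trailing period -> NOT filtered
    "stumbleupon inc.",           # filtered
    "gomez networks",             # "gomez" matches anywhere -> filtered
    "comcast cable",              # real ISP -> kept
]
for isp in samples:
    print(f"{isp!r}: {'filtered' if pattern.search(isp) else 'kept'}")
```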
Hope this helps,
Chris
-
Agreed! That's why I suggest using it in combination with the variables you mentioned above.
-
rackspace cloud servers
Maybe my problem is I'm not looking in the right place.
I'm in Audience > Technology > Network, and the column shows "Service Provider."
-
How is it titled in the ISP report exactly?
-
For example,
Since I implemented the filter four days ago, rackspace cloud servers has visited my site 848 times, visited 1 page each time, spent 0 seconds on the page, and bounced 100% of the time.
What is the regular expression for rackspace?
-
Time on page can be a tricky one because actual visits can sometimes record 00:00:00 due to the way it's measured. I'd recommend combining it with other factors like the ones I mentioned above.
-
"...a combination of operating system, location, and some other factors can do the trick."
Yep, combined with those, look for "Avg. Time on Page = 00:00:00"
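For what it's worth, here's a rough sketch of that idea in Python, assuming you've exported the Network report to a CSV (the column names here are hypothetical, so match them to your actual export):

```python
import csv

# Rough sketch, not an Analytics feature: flag Service Providers whose
# visits look bot-like across several signals at once, not just one.
suspects = set()
with open("network_report.csv", newline="") as f:
    for row in csv.DictReader(f):
        if (row["Avg. Time on Page"] == "00:00:00"
                and row["Bounce Rate"] == "100.00%"
                and float(row["Pages / Session"]) <= 1.0):
            suspects.add(row["Service Provider"])

for name in sorted(suspects):
    print(name)
```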
-
OK, can you provide some information on the bots that are getting through and that you want to sort out? If they can be filtered through the ISP organization like the ones in your current RegEx, you can simply add them to the list: (microsoft corp| ... ... |stumbleupon inc\.|ispnamefromyourbots|ispname2|etc.)$|gomez
Otherwise, you might need to get creative and find another way to isolate them (a combination of operating system, location, and some other factors can do the trick). When adding to the list, make sure to escape special characters like . or / by using a \ before them, or else your RegEx will fail.
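And if the list keeps growing, here's a minimal sketch of generating the pattern programmatically so the escaping is handled for you (the names are just the ones from this thread):

```python
import re

# Build the ISP alternation from a plain list; re.escape() backslash-
# escapes special characters like "." (and, harmlessly, spaces) for you.
isps = [
    "microsoft corp", "inktomi corporation", "yahoo! inc.",
    "google inc.", "stumbleupon inc.", "rackspace cloud servers",
]
pattern = "^(" + "|".join(re.escape(name) for name in isps) + ")$|gomez"
print(pattern)
```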
-
Sure. Here's the post for filtering the bots.
Here's the RegEx posted: ^(microsoft corp|inktomi corporation|yahoo! inc\.|google inc\.|stumbleupon inc\.)$|gomez
-
If you give me an idea of how you are isolating the bots, I might be able to help come up with a RegEx for you. What RegEx do you have in place to sort out the other bots?
Regards,
Chris