User Agent - teracent-feed-processing
-
Does anyone know anything about the "teracent-feed-processing" user agent?
IPs this user agent requests from: 74.125.113.145, 74.125.113.148, 74.125.187.84 ....
In our logs, two out of every three requests are made by it, causing the server to crash.
-
It seems that the sudden drop in indexed pages reported in WMT might relate to a reporting issue on Google's side - https://productforums.google.com/forum/#!topic/webmasters/qkvudy6VqnM;context-place=topicsearchin/webmasters/sitemap|sort:date
-
Since "teracent-feed-processing" didn't follow the rules in robots.txt, we had to hard-block it. If the server detects the user agent as "teracent-feed-processing", it drops the connection: *(104) Connection reset by peer*
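For anyone wanting to replicate this kind of hard block: the thread doesn't say which web server is in use, but as one hedged sketch, nginx can close the connection without a response (which the client sees as a connection reset) via its non-standard 444 code:

```nginx
# Sketch only, assuming nginx. 444 is an nginx-specific code that
# closes the connection without sending any response, so the client
# logs something like "(104) Connection reset by peer".
if ($http_user_agent ~* "teracent-feed-processing") {
    return 444;
}
```

On Apache or other servers the equivalent would be a mod_rewrite or access rule keyed on the User-Agent header.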
-
Well, it isn't Googlebot, and it isn't one I have come across before. Don't forget that any user agent can be spoofed very easily, so I wouldn't worry about blocking it.
**Should I assume that the drop in reported indexed pages is a result of blocking the teracent-feed-processing user agent?**
I really don't think that this is Google. The only crawler they use is Googlebot, and that is the user agent they tell you to add if you wish to block them.
Just a thought: can you share your robots.txt file, just to make sure pages aren't being unintentionally blocked?
-Andy
-
It seems that the "teracent-feed-processing" user agent is somehow linked to Google. If you analyse the IPs, you'll notice they are Google-owned; the Teracent company was bought by Google in 2009.
btw - we've already blocked it, but I'm trying to figure out what role this user agent plays. We've also noticed a drastic decline in the number of pages reported in Google Webmaster Tools (half of what we used to have). Should I assume that the drop in reported indexed pages is a result of blocking the teracent-feed-processing user agent?
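As a quick sanity check on the IP claim above, a small stdlib sketch; the 74.125.0.0/16 netblock used here is illustrative (it is registered to Google, but always consult current WHOIS data rather than hard-coding ranges):

```python
import ipaddress

# Illustrative netblock check: 74.125.0.0/16 is registered to Google,
# but published ranges change, so treat this as a sketch, not a source
# of truth.
google_net = ipaddress.ip_network("74.125.0.0/16")
ips = ["74.125.113.145", "74.125.113.148", "74.125.187.84"]
for ip in ips:
    print(ip, ipaddress.ip_address(ip) in google_net)  # all True
```

Note that ownership of the IP range doesn't by itself tell you what the crawler is for, only who operates it.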
-
It sounds like your typical spammy crawler, so I would suggest just blocking it. Add the following to the top of your robots.txt file:

```
User-agent: teracent-feed-processing
Disallow: /
```

However, before you go live with this, use the Webmaster Tools robots.txt tester to make sure everything else still gets crawled.

-Andy
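Alongside the Webmaster Tools tester, you can also sanity-check the rules locally with Python's stdlib parser. A sketch; the permissive `User-agent: *` group below is a stand-in for the rest of your real file:

```python
from urllib import robotparser

# Hypothetical combined robots.txt: the suggested teracent block plus a
# permissive default group standing in for the rest of the file.
rules = """\
User-agent: teracent-feed-processing
Disallow: /

User-agent: *
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)
print(rp.can_fetch("teracent-feed-processing", "https://example.com/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/page"))  # True
```

Bear in mind this only checks the file's logic; a crawler that ignores robots.txt (as reported above) has to be blocked at the server.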
Related Questions
-
Google Search Console and User-declared canonical is actually Hreflang tag
Hey, we recently launched a US version of a UK-based ecommerce website on the us.example.com subdomain. Both websites are on Shopify, so canonical tags are handled automatically, and we have implemented hreflang tags across both websites. Suddenly our rankings in the UK have dropped, and after looking in Search Console for the UK site I've found that a lot of pages are no longer indexed in Google because the user-declared canonical is the hreflang tag for the US URL. Below is an example:

https://www.example.com/products/pac-man-arcade-cabinet - the product page

<link rel="alternate" href="https://www.example.com/products/pac-man-arcade-cabinet" hreflang="en-gb" /> - UK hreflang tag

<link rel="alternate" href="https://us.example.com/products/pac-man-arcade-cabinet" hreflang="en-us" /> - US hreflang tag

Then in Google Search Console the user-declared canonical is https://us.example.com/products/pac-man-arcade-cabinet, but it should be https://www.example.com/products/pac-man-arcade-cabinet. The UK website has been assigned to target the United Kingdom in Search Console and the US website has been assigned to target the United States. We also do not have access to the robots.txt file, unfortunately. Any help or insight would be greatly appreciated.
Technical SEO | | PeterRubber0 -
Schema Markup for property listings (estate agent)
Hello, I've been looking online for some help with this. An estate agent has a page of properties for sale. Is it possible to mark these individual properties up, and if so, would they appear as rich snippets in the SERPs? I've never seen anything like this for properties for sale, so I just wondered.
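It is possible to mark individual listings up; whether Google shows any rich snippet for them is entirely its own decision. As a hedged sketch, each property page could carry JSON-LD using schema.org's RealEstateListing type (all details below are invented placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "RealEstateListing",
  "name": "3-bed semi-detached house, Example Road",
  "url": "https://www.example-agent.co.uk/properties/example-road",
  "datePosted": "2015-01-01"
}
```

Validating the markup with a structured-data testing tool before rollout would confirm the properties parse as intended.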
Technical SEO | | AL123al1 -
How do you delete an admin user in WordPress that won't delete
I hired an Indian company to do some work on three sites that I own. I used a freelancing platform, and they have since been banned. Now when I check my WordPress sites, the admin user will not delete. Every time I try to delete them, the account comes back. I change the password and the email address, but when I check a couple of hours later it comes back again, giving them full control over my sites, which they are playing around with. Any help would be great. I have tried going into cPanel, but it still will not delete, and my hosting company has tried to delete them without success.
Technical SEO | | in2townpublicrelations0 -
Homepage "personalisation" - different content for different users
Hi Mozians, my firm is looking to present different content to different users depending on whether they are new visitors, return visitors, return customers, etc. I am concerned about how this would work in practice as far as Google is concerned: the bot would see different content from what some users see. It has a slight whiff of cloaking about it to me, but I also get that in this case it would be a UX feature that would genuinely benefit users, and it clearly wouldn't be intended to manipulate search rankings at all. Is there a way of achieving this "personalisation" in such a way that Google understands what you are doing? I am thinking about some kind of markup that "declares" the different versions of the page. Basically, I want to be as transparent about it as possible so as to avoid unintended consequences. Many thanks indeed!
Technical SEO | | unirmk0 -
Robots User-agent Query
Am I correct in saying that the allow/disallow rules are only applied to MSNBOT_Mobile? The mobile robots file:

```
User-agent: Googlebot-Mobile
User-agent: YahooSeeker/M1A1-R2D2
User-agent: MSNBOT_Mobile
Allow: /
Disallow: /1
Disallow: /2/
Disallow: /3
Disallow: /4/
```
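No: consecutive User-agent lines form a single group, and the rules that follow apply to every agent in that group, not just the last one. A small illustrative parser (a sketch of the grouping rule, not any search engine's exact matching logic) makes this visible:

```python
def parse_robots_groups(text):
    """Split a robots.txt body into (agents, rules) groups.

    Consecutive User-agent lines form one group; every Allow/Disallow
    rule that follows applies to all agents in that group.
    """
    groups, agents, rules = [], [], []
    for raw in text.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if rules:  # rules already seen, so a new group is starting
                groups.append((agents, rules))
                agents, rules = [], []
            agents.append(value)
        elif field in ("allow", "disallow"):
            rules.append((field, value))
    if agents:
        groups.append((agents, rules))
    return groups

robots = """\
User-agent: Googlebot-Mobile
User-agent: YahooSeeker/M1A1-R2D2
User-agent: MSNBOT_Mobile
Allow: /
Disallow: /1
Disallow: /2/
Disallow: /3
Disallow: /4/
"""
(agents, rules), = parse_robots_groups(robots)
print(agents)      # all three agents share one group
print(len(rules))  # 5
```

So all three bots get the same Allow/Disallow set here.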
Technical SEO | | ThomasHarvey1 -
Log in, sign up, user registration and robots
Hi all, we have an accommodation site that asks users to register only when they want to book a room, in the last step. Though this is the ideal situation when you have tons of users, nowadays we are getting around 1,500-2,000 per day, and through testing we found that if we ask for a registration (a simple one-click Facebook sign-up), we can email them all, and through good customer service we are increasing our sales. That is why we would like to ask users to register right after the home page, i.e. on Home/accommodation and all the rest. I am not sure how I can keep that content visible to robots. Will the authentication process block Google from crawling it? Is there something we can do? We are not completely sure how to proceed, so any tip would be appreciated. Thank you all for answering.
Technical SEO | | Eurasmus.com3 -
Redirecting root domain to a page based on user login
We have our main URL redirecting non-logged-in users to a specific page, while logged-in users are directed to their dashboard when going to the main URL. We find this to be the most user-friendly; however, it is all being picked up as a 302 redirect. I am trying to advise on the ideal way to accomplish this, but I am not having much luck in my search for information. I believe we are going to put a true homepage at the root domain and simply redirect logged-in users as usual when they hit the URL, but I'm still concerned this will cause issues with Google and other search engines. Does anyone have experience with domains that need to work in this manner? Thank you! Anna
Technical SEO | | annalytical0 -
Is having no robots.txt file the same as having one and allowing all agents?
The site I am working on currently has no robots.txt file. However, I have just uploaded a sitemap and would like to point a robots.txt file at it. Once I upload the robots.txt file, if I allow access to all agents, is this the same as when the site had no robots.txt file at all? Do I need to specify crawler access, or can the robots.txt file just contain the link to the sitemap?
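For well-behaved crawlers, having no robots.txt and having one that allows everything are effectively equivalent. The Sitemap directive also stands on its own, outside any User-agent group, so a minimal file could look like this (the sitemap URL is a placeholder):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

An empty `Disallow:` means nothing is disallowed; you could even omit the User-agent group entirely and keep only the Sitemap line, but the explicit allow-all group makes the intent obvious.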
Technical SEO | | pugh0