Honeypot Captcha - rated as "cloaked content"?
-
Hi guys,
In order to get rid of the very old-school captcha on our contact form at troteclaser.com, we would like to use a honeypot captcha.
The idea is to add a field that is hidden from human visitors but likely to be filled in by spam bots. That way we can sort out all those spam contact requests.
More details on "honeypot captchas":
http://haacked.com/archive/2007/09/11/honeypot-captcha.aspx
Any idea if this single cloaked field will have a negative SEO impact? Or is there another alternative to keep out those spam bots?
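Just to make it concrete, this is roughly the sort of thing we have in mind. It's only a sketch (the field name, form markup, and check are made up for illustration): a field hidden with CSS plus a simple server-side check, and if the hidden field comes back non-empty we'd just drop the request quietly.
```typescript
// Sketch only: a contact form with one extra field that humans never see.
// The field name "website_url" is invented for this example.
const formHtml = `
<form method="post" action="/contact">
  <label>Name <input type="text" name="name"></label>
  <label>Message <textarea name="message"></textarea></label>
  <!-- Honeypot: hidden via CSS, so only bots tend to fill it in -->
  <div style="display:none" aria-hidden="true">
    <label>Leave this empty <input type="text" name="website_url" autocomplete="off"></label>
  </div>
  <button type="submit">Send</button>
</form>`;

// Server side: any value in the hidden field marks the submission as spam.
function isLikelySpam(form: Record<string, string | undefined>): boolean {
  return (form["website_url"] ?? "").trim().length > 0;
}

// A naive bot fills every input it finds; a real visitor never sees the field.
console.log(isLikelySpam({ name: "Bot", message: "Buy now", website_url: "http://spam.example" })); // true
console.log(isLikelySpam({ name: "Thomas", message: "Quote request" }));                            // false
```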
Greets from Austria,
Thomas
-
Just in case anyone stumbles across this topic:
We started using honeypot captchas in 2011 and it really paid off. Not only because we got rid of the old captchas, but also because they are keeping out 99.99% of all bot inquiries or spam.
-
Hey Casey,
Thanks for the reply. Will have this tested soon. Really looking forward to getting rid of that captcha.
Regards,
Thomas
-
Hi Thomas,
I've done some studies on this, and you will be fine using this technique; Google won't give you any problems for doing it. Check out my post on the honeypot technique: http://www.seomoz.org/blog/captchas-affect-on-conversion-rates. The technique works quite well, blocking about 98% of spam.
Casey
-
Hi Keri,
Those are users without Java support.
Does that mean that JavaScript is no issue then?
-
Thomas, double-check whether that stat is for users without Java or users without JavaScript.
-
Good point, thanks.
As 15% of our visitors don't have Java, this won't work out.
Actually, we're trying to get rid of the captcha to increase our conversion rate, which is why the "honeypot" version is very appealing.
-
You won't get any SEO impact; think about all the forms with JS interaction on big sites.
One easy solution is to accept only AJAX posts of the form. It is very effective, BUT you won't be able to get contact requests from visitors without JavaScript enabled. Maybe a good alternative.
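A rough sketch of what I mean, assuming you post the form with fetch() and block the regular submission; the "/contact" endpoint and "#contact-form" id are made-up names:
```typescript
// Sketch only: submit the contact form with fetch() and block the normal
// POST, so plain form submissions from simple bots never go through.
// "/contact" and "#contact-form" are made-up names for this example.
async function submitContactForm(form: HTMLFormElement): Promise<void> {
  const response = await fetch("/contact", { method: "POST", body: new FormData(form) });
  if (!response.ok) {
    throw new Error(`Contact request failed with status ${response.status}`);
  }
}

document.querySelector<HTMLFormElement>("#contact-form")?.addEventListener("submit", (event) => {
  event.preventDefault(); // visitors without JavaScript cannot submit at all (the trade-off mentioned above)
  void submitContactForm(event.currentTarget as HTMLFormElement);
});
```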
Otherwise, you can use reCAPTCHA: http://www.google.com/recaptcha
It is free and easy to set up, works well against bots, and is accessible to everyone!
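If you go with reCAPTCHA, the server-side check is roughly like the sketch below. I'm assuming the standard "siteverify" endpoint and a JSON response with a success flag, so confirm against the current reCAPTCHA docs; the secret and token are placeholders from your own setup:
```typescript
// Sketch only: verify the token the reCAPTCHA widget puts into the form
// (usually the "g-recaptcha-response" field) against Google's siteverify
// endpoint. Treat the endpoint and response shape as assumptions to confirm
// against the current reCAPTCHA docs; "secret" is your private key.
async function verifyRecaptcha(token: string, secret: string): Promise<boolean> {
  const body = new URLSearchParams({ secret, response: token });
  const res = await fetch("https://www.google.com/recaptcha/api/siteverify", { method: "POST", body });
  const data = (await res.json()) as { success: boolean };
  return data.success;
}
```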