Is using Mechanical Turk to increase our number of Google +1's Black Hat?
-
Or will it affect our ranking positively or negatively at all?
-
And it's not allowed by Mechanical Turk, either.
https://www.mturk.com/mturk/help?helpPage=policies
What are some specific examples of HITs that violate Amazon Mechanical Turk policies? [excerpted]
- HITs that directly or indirectly promote a site, service, or opinion
- HITs that violate the terms and conditions of an activity or website (for instance asking Workers to vote for something)
- HITs that generate "referred" site visits or click-through traffic
- HITs that ask Workers to take action to manipulate a website's behavior or results
-
Black hat in my book... I'd be interested to hear how you get on if you do try this strategy, though.
-
Um, short answer... yes. A darker shade of gray.
-
Better to produce great content on the site so that people +1 you naturally. If the site isn't up to scratch, you won't get many conversions anyway.
-
If black hat is defined as using techniques to fool the search engines, then yes. If not, it's at least grey.
Related Questions
-
Google Organic Ranking & Traffic Dropped
Hello, we have been struggling to keep our website (http://goo.gl/vS37qA) ranking well in Google since April 30, 2015. At that time, for some reason, around 15,000 blocked pages (mainly Magento layered navigation pages) were showing in Google's Search Console. We used canonical tags, and all these pages have since been removed from Google's index and from Search Console. We didn't do anything that is against Google's Guidelines. Currently in Google Search Console we see:
- around 50 crawl errors
- no malware
- no blocked pages
- no other error messages
We have never practiced black hat SEO, paid for links, or used tactics that Google penalizes. We noticed that in the last few months around 1,000 Chinese/Russian/Japanese links have been pointing to our website, and we have used the disavow tool to notify Google of these attacks. Any help would be greatly appreciated!
White Hat / Black Hat SEO | NancyH
-
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts the performance of the others. The problem is that (1) I want a centrally managed solution for all sites, because per-site administration takes too much time; (2) it should take total server load into account instead of only one site's traffic; and (3) it should control overall bot traffic instead of traffic from a single bot. IMO, user traffic should always be prioritized above bot traffic.

I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support it. Although my custom CMS can centrally manage robots.txt for all sites at once, bots read it per site and per bot, so it doesn't solve (2) and (3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. So no solution covers all three of my problems.

Now I have come up with a custom-coded solution that dynamically serves 503 HTTP status codes to a portion of the bot traffic. The portion, per bot, can be calculated at runtime from the total server load at that moment. So if a bot makes too many requests within a certain period (or breaks whatever other rule I invent), some of its requests will be answered with a 503 while the others get content and a 200.

The remaining question: will dynamically serving 503s have a negative impact on SEO? It will delay indexing, but slow server response times also hurt rankings, which is arguably even worse than indexing latency. I'm curious about the experts' opinions...
White Hat / Black Hat SEO | internetwerkNU
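For readers wondering what such a solution might look like in practice, here is a minimal sketch of the idea as Python WSGI middleware. This is not the poster's actual code; the load threshold, bot list, and Retry-After value are illustrative assumptions.

```python
import os
import re

# Illustrative values; tune both for your own server.
LOAD_THRESHOLD = 4.0  # 1-minute load average above which bots get throttled
BOT_PATTERN = re.compile(r"bingbot|ahrefsbot|googlebot", re.IGNORECASE)

class BotThrottleMiddleware:
    """WSGI middleware: answer bot requests with 503 while server load is high."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        user_agent = environ.get("HTTP_USER_AGENT", "")
        load_1min = os.getloadavg()[0]  # machine-wide load, not per-site traffic

        if BOT_PATTERN.search(user_agent) and load_1min > LOAD_THRESHOLD:
            # Retry-After invites well-behaved crawlers to come back later,
            # which is why a temporary 503 is gentler than dropping connections.
            start_response("503 Service Unavailable",
                           [("Retry-After", "3600"),
                            ("Content-Type", "text/plain")])
            return [b"Server busy; please retry later."]

        # Users (and bots under light load) get the normal response.
        return self.app(environ, start_response)
```

Because the check reads the machine-wide load average rather than per-site counters, one middleware deployed in front of all sites covers requirements (2) and (3) with no per-site administration.
-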
80% of traffic lost overnight, Google penalty?
Hi all.
I have a website called Hemjakt (http://www.hemjakt.se/), which is a search engine for real estate, currently only available on the Swedish market. The application crawls real estate websites and collects all estates in a single searchable application. The site has been live for a few months and had seen steady growth since release, increasing by 20% weekly up to ~900 visitors per day. Three days ago, overnight, I lost 80% of my traffic. Instead of 900 visitors per day I'm at ~100 visitors per day, and when I search for long, specific queries such as "Åsgatan 15, Villa 12 rum i Alsike, Knivsta" (street address, house type, number of rooms, area, city), I'm now only found on the fifth page. I suspect that I have become the subject of a Google penalty. How do I get out of this mess?
Just like all search engines or applications, I do crawl other websites and scrape their content. My content is ~90% unique from the source material, and I do add user value by giving visitors the possibility to compare houses, a ton more data to compare pricing and history, extra functionality the source sites do not offer, and so on. My analytics data show good user engagement. Here is one example of a source page and a page on my site:
Source: http://www.hemnet.se/bostad/villa-12rum-alsike-knivsta-kommun-asgatan-15-6200964
My site: http://www.hemjakt.se/bostad/55860-asgatan-15/
So: How do I actually confirm that this is the reason I lost my traffic? When I search for my branded query, I still get results, and I'm still indexed by Google. If I am penalized: I'm not attempting to do anything black hat, and I really believe the app gives a lot of value to users. What tweaks or changes to the application would you suggest so I can continue running the service in a way that Google is fine with?
White Hat / Black Hat SEO | Hemjakt
-
Does a trademark symbol in a URL matter to Google?
Hello community! We are planning to clean up ™ and ® symbols in the URLs on our website. Google has indexed these pages, but some of the trademark pages display the character incorrectly in the URL shown in the SERPs. What are your thoughts on a "spring cleaning" effort to remove all ™, ®, and other unsafe characters from URLs? Will this impact indexed pages, rankings, etc.? Thank you!
White Hat / Black Hat SEO | b.digi
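On the mechanics of such a cleanup, here is an illustrative sketch (not from the question; the example URL and the character list are assumptions) that builds a map from old URLs containing ™/® to sanitized equivalents, which you would then serve as 301 redirects:

```python
import re
import urllib.parse

UNSAFE = re.compile("[\u2122\u00AE]")  # ™ and ® (extend as needed)

def sanitize_url(url: str) -> str:
    """Strip trademark characters and tidy up any separators left behind."""
    decoded = urllib.parse.unquote(url)    # catch %E2%84%A2-style encodings too
    cleaned = UNSAFE.sub("", decoded)
    return re.sub(r"-{2,}", "-", cleaned)  # collapse "--" left by the removal

# Hypothetical URL purely for illustration.
old_urls = ["https://example.com/products/acme%E2%84%A2-widget"]
redirect_map = {u: sanitize_url(u) for u in old_urls if sanitize_url(u) != u}

for old, new in redirect_map.items():
    print(f"301: {old} -> {new}")
```

Serving a permanent 301 from each old URL to its clean equivalent preserves most of the existing link equity while Google re-indexes the new URLs.
-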
Is this a clear sign that one of our competitors is doing some serious black-hat SEO?
One of our competitors just recently increased their total external followed links pretty drastically. Is it safe to say they are doing some pretty black-hat stuff? What actions exactly could this be attributed to? They've been online and in business for 10+ years, and I've seen some pretty nasty drops in traffic on compete.com for them over the years. If this is black-hat work in action, would these two things most likely be related?
White Hat / Black Hat SEO | Kibin
-
Potential Implications of using the Disavow tool to remove thousands of links
So here's the situation. My company's site has over 30 thousand backlinks from Rippling.info. These links all point to 3 product pages, some of which are no longer in production. Apparently a former employee was experimenting with some link farm ideas. My questions are: 1. Does anyone here have experience with rippling.info? Is it legit? It seems like a link farm, but Google allows AdSense ads on it? I thought Google was against link farms... 2. If I use the Disavow tool in Webmaster Tools to tell Google these 30k+ incoming links are to be ignored, will there be any consequences? (Google Analytics shows zero referral traffic from it since Jan 1st, 2012.)
White Hat / Black Hat SEO | mjmorse
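For reference, the disavow file itself is a plain-text upload, and a single `domain:` line covers all 30k+ links from one site rather than listing each URL, along the lines of:

```text
# Links placed by a former employee's link-farm experiment.
# One domain: line disavows every link from that site at once.
domain:rippling.info
```

Note that disavow is a hint to Google rather than a removal tool: the links will still appear in link reports, but they should stop being counted against the site.
-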
Anybody have useful advice to fix a very bad link profile?
Hello fellow mozzers. I am interested in getting the community's opinion on how to fix an extremely bad link profile, or whether it would be easier to start over on a new domain. This is for an e-commerce site that sells wedding rings. Prior to coming to our agency, the client had been using a different service that was doing some serious black-hat link building on a truly staggering scale. Of the roughly 53,000 links that show up in OSE, 16,500 of them have the anchor text "wedding rings", 1,300 "wedding ring sets", etc. For contrast, there are only two "visit website" anchors, and just one domain-name anchor text. So it is about the farthest from natural you can get. Anyway, site traffic was doing great until the end of February, when it took a massive hit and lost over half the day-to-day traffic volume, then steadily declined until April 24th (Penguin), when it took another huge hit and lost almost 70% of traffic from Google. Note that traffic from Yahoo/Bing stayed the same. So the question is: is it worth trying to clean up this mess of a backlink profile, or would it be smarter to start fresh with a new domain?
White Hat / Black Hat SEO | CustomCreatives
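To put those numbers in perspective, here is a quick back-of-the-envelope calculation (a hypothetical audit snippet using only the counts quoted in the question):

```python
# Anchor-text counts quoted in the question's OSE data.
anchors = {
    "wedding rings": 16_500,
    "wedding ring sets": 1_300,
    "visit website": 2,
}
total_links = 53_000

for text, count in anchors.items():
    share = 100 * count / total_links
    print(f"{text!r}: {count:>6} links ({share:.2f}% of profile)")
# "wedding rings" alone is ~31% of all anchors; in a natural profile,
# brand-name and bare-URL anchors dominate instead.
```

That level of exact-match concentration is precisely the pattern Penguin targeted, which matches the April 24th drop described above.
-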
Massive rank drop for 'unnatural links'. Help!
Hi everyone, I work for a company called Danbro (www.danbro.co.uk). Recently a massive penalty led to a huge drop across all keywords in Google, including the brand name. Since then we have conducted a massive clean-up (requesting competitors remove duplicate content, removing some poor-quality links, etc.), but we still have not seen any improvement whatsoever, nor has Google responded. Has anyone ever received a positive response from Google? Since we sent a reconsideration request, our rankings have actually got worse! Any advice would be great.
White Hat / Black Hat SEO | Townpages