Crowdsearch.me - Is this a legit approach?
-
It seems like a less-than-white-hat approach, and in any case I don't know whether it would even work.
Does anyone have any advice about it?
Thanks!
-
Thanks for the heads-up. I was really unsure about this as well, but I'm glad I saved my money by not buying into it!
Matt
-
Thanks for this page. I received an email about this today for my webstore, www.arbeidslys.no. I won't spend money on something like this.
Preben Want
Manager
Arbeidslys.no
-
Terry Kyle has a report on his results with CrowdSearch:
http://seotraffichacks.com/crowdsearching-work-seo-results-far/
-
A nod from the wizard :0 - I'm counting this week as a good friggen week!
-
Thanks, Rand. It's kind of an honor to have you speaking up on my little question here!
It's probably predictable that someone (or more than one) would try to monetize this sort of trick, because of the Google pronouncements that you mentioned and the other articles that have appeared about CTR and time-on-site behavior.
Too bad. I guess that we all have to actually earn all those visits and page views.
-
Thanks, Ray. What you said confirms what I speculated - too good to be true. And not entirely above-board, either.
-
Totally agree with Ray that this isn't a legitimate tactic, nor would I expect it to work. Google's got a lot of defenses and checks to prevent manipulation of this kind, so while it could have an impact briefly and in some SERPs, I'd expect it to be mostly a waste of time and money.
The only part I'll disagree with is the claim that Google hasn't confirmed this - they have disclosed that they do (or rather "might") use pogo-sticking. I believe this was mentioned at a conference last year or in 2013, though I can't find the reference now. There's also lots of test evidence, including the experiment I ran live at MozCon, this one from my blog: http://moz.com/rand/queries-clicks-influence-googles-results/ (which I did repeat with success), and some mixed results from Darren Shaw here: http://www.slideshare.net/darrenshaw1/darren-shaw-user-behavior-and-local-search-dallas-state-of-search-2014.
Queries and clicks are most certainly impacting rankings, though how directly and with what caveats/other influences we don't yet know (and may never).
-
Is this a legit approach?
No, not really. Google has never confirmed the use of CTR as a ranking signal. And services such as this one illustrate the problem: if Google did use CTR as a heavy ranking signal, it could easily be manipulated. That's exactly what this service proposes to do - manipulate the search results.
Now, does CTR actually impact search rankings? It's only speculation at this time, though it does seem like a logical ranking factor. Google wants to show the most relevant results to the user - the results that answer the user's search query most quickly and completely. However, I don't think it could ever be a heavily weighted ranking factor, because it can be so easily manipulated.
Related Questions
-
Best Practice Approaches to Canonicals vs. Indexing in Google Sitemap vs. No Follow Tags
Hi There, I am working on the following website: https://wave.com.au/

I have become aware that there are different pages competing for the same keywords. For example, I just started to update a core category page - Anaesthetics (https://wave.com.au/job-specialties/anaesthetics/) - to focus mainly on the keywords 'Anaesthetist Jobs'. But I have recognized that there are ongoing landing pages that contain pretty similar content:

https://wave.com.au/anaesthetists/
https://wave.com.au/asa/

We want to direct organic traffic to our core pages, e.g. https://wave.com.au/job-specialties/anaesthetics/. This leads me to deal with the duplicate pages using either a canonical link (content manageable) or, alternatively, adding a nofollow tag or updating the robots.txt. Our resident developer also suggested that it might be good to use Google Index in the sitemap to tell Google that these are of less value?

What is the best approach? Should I add a canonical link to the landing pages pointing to the category page? Or alternatively, should I use the Google Index? Or even another approach? Any advice would be greatly appreciated. Thanks!
Intermediate & Advanced SEO | Wavelength_International
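For reference, a minimal sketch of the two on-page options discussed in this question, using the URLs given above (illustrative only, not a recommendation):

```html
<!-- Option 1: canonical link on each duplicate landing page,
     pointing at the preferred category page -->
<link rel="canonical" href="https://wave.com.au/job-specialties/anaesthetics/" />

<!-- Option 2: keep a landing page out of the index entirely,
     while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```
-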
Best approach to rank for this keyword?
Hi, I want to rank for the keyword "white sandals" on Google Australia. Currently, the top 5 ranking pages are not optimised for, or specific to, white sandals. See screenshot: https://image.prntscr.com/image/WenSRHqTTFSqYNg2MHvH1A.png

To rank for this keyword, would you create a page dedicated to white sandals, even though it looks like it doesn't matter and you could rank the broader (not colour-specific) sandals page? Any recommendations? Cheers.
Intermediate & Advanced SEO | crazy4seo78
-
Taxonomy question - best approach for site structure
Hi all, I'm working on a dentist's website and want some advice on the best way to lay out the navigation. I would like to know which structure will help the site work naturally. I feel the second example would be better as it would focus the 'power' around the type of treatment and get that to rank better. .com/assessment/whitening
Intermediate & Advanced SEO | | Bee159
.com/assessment/straightening
.com/treatment/whitening
.com/treatment/straightening or .com/whitening/assessment
.com/straightening/assessment
.com/whitening/treatment
.com/straightening/treatment Please advise, thanks.0 -
Prerender.io and similar services to index content - legit?
A client has a huge, unique, frequently updated list of B2B products that is rendered in JavaScript and not indexed. Reading around, I think I've found that:

1. Google allows showing bots and users different content (if it's fundamentally the same) with no penalty
2. There are good, bad, and ugly ways to do it
3. It's a semi-common problem
4. There are services like prerender.io and formerly ajaxsnapshots.com that can help with this

However, I can't find a single authoritative source (read: from Google or Moz) that confirms point 1 above. I found "White Hat Cloaking: It exists. It's permitted. It's useful." but can't tell where my situation fits (or if it does). So, if I use prerender.io to surface content and get it indexed, is that a smart move? I'm 95% sure it is, but I need 100% to make the decision.
Intermediate & Advanced SEO | DanSullivan
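A rough sketch of how this kind of prerendering is often wired up at the web-server level: bot user-agents get proxied to a prerender service while normal visitors get the JavaScript app. The service URL and user-agent list here are assumptions for illustration, not prerender.io's actual documented configuration:

```apache
# Hypothetical Apache (mod_rewrite + mod_proxy) sketch:
# send known crawlers to a locally hosted prerender service
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|yandex|baiduspider) [NC]
RewriteRule ^(.*)$ http://localhost:3000/https://example.com/$1 [P,L]
```
-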
GWT does not play nice with 410 status code approach to expire content? Use 301s?
We have been diligently managing our index size in Google for our sites, returning a 410 status code for pages that we no longer consider "up-to-date" but that still carry value for users to access, so that Google removes them from our index and keeps it lean. However, we have been receiving GWT warnings across sites because of the 410 status codes Google is encountering, which makes us nervous that Google could interpret this approach as a lack of quality on our site. Does anyone have a view on whether the 410 approach is right for the given example, or whether we should consider simply using 301s or another status code to keep our GWT errors clean?

Further notes:

1. There is hardly ever any link juice being sent to those pages, so it is not like we are missing out on that.
2. The pages for which we return 410 are also marked as noindex and nofollow.
Intermediate & Advanced SEO | petersocapro
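For what it's worth, a minimal .htaccess sketch of the two options being weighed here (the paths are made up for illustration):

```apache
# Option 1: explicitly mark an expired page as gone (410)
Redirect gone /deals/expired-offer

# Option 2: permanently redirect it to a current equivalent (301)
Redirect 301 /deals/expired-offer /deals/current-offers
```
-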
What is best practice SEO approach to re structuring a website with multiple domains and associated search engine rankings for each domain?
Hello Mozzers, I'm trying to improve and establish rankings for my website, which has never really been optimised. I've inherited what seems to be a mess and have a challenge for you! The website currently has 3 different www domains all pointing to the one website; two are .com domains and one is a .com.au. The business is located in Australia and the website is primarily targeting Australian traffic. In addition, there are a number of other non-www domains for the same addresses pointing to the website in the CMS, which is Adobe Business Catalyst. When I check Google, each of the www domains has the following number of pages indexed:

www.Domain1.com: 5,190 pages
www.Domain2.com: 1,520 pages
www.Domain3.com.au: 149 pages

What is the best practice approach from an SEO perspective to reorganising this domain structure?

1. Do I need to use the .com.au as the primary domain, given that we are in this market and targeting traffic here? That's what I have been advised, and it seems to be backed up by what I have read here.
2. Do we redirect all domains to the primary .com.au domain? This is easily done in the Adobe Business Catalyst CMS, however is this the same as a 301 redirect, which is the best approach from an SEO perspective?
3. How do we consolidate the current separate rankings for the 3 different domains into the one domain within Google, to ensure improved rankings and a best practice approach?

The website is currently receiving very little organic search traffic, so if it's simpler and faster to start fresh rather than go through a complicated migration or restructure, and you have a suggestion here, please feel free to let me know your ideas! Thank you!
Intermediate & Advanced SEO | JimmyFlorida
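On question 2: whether Business Catalyst's built-in domain pointing issues a true 301 is worth verifying with their support. At the web-server level, a 301 consolidation typically looks something like this sketch (the domain names stand in for the placeholders used in the question):

```apache
# Hypothetical sketch: 301 every request on a secondary domain
# to the same path on the primary .com.au domain
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.domain3\.com\.au$ [NC]
RewriteRule ^(.*)$ https://www.domain3.com.au/$1 [R=301,L]
```
-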
Looking for re-assurance on this one: Sitemap approach for multi-subdomains
Hi All: Just looking for a bit of "yeah, it'll be fine" reassurance on this before we go ahead and implement.

We've got a main accommodation listing website under www.* and a separate travel content site using a completely different platform on blog.* (same domain, different sub-domain). We pull snippets of content from blog.* into www.* using a feed, and we have cross-links going both ways, e.g. links to find accommodation in blog articles and links to blog articles from accommodation listings. Look-and-feel wise they're fully integrated; the blog.* site is a tab under the main nav.

What I'd like to do is get Google (and others) to view this whole thing as one site, and attribute any SEO benefit of content on blog.* pages to the www.* domain. Make sense? So, done a bit of reading, and here's what I've come up with:

1. Separate sitemaps for each, both located in the root of the www site: www.example.com/sitemap-www and www.example.com/sitemap-blog
2. robots.txt in the root of the www site to have a single sitemap entry: sitemap: www.example.com/sitemap-www
3. robots.txt in the root of the blog site to have a single sitemap entry: sitemap: www.example.com/sitemap-blog
4. Submit both sitemaps to Webmaster Tools.

Does this sound reasonable? Any better approaches? Anything I'm missing? All input appreciated!
Intermediate & Advanced SEO | AABAB
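Concretely, the plan above would produce two robots.txt files along these lines (example.com is the placeholder from the question; note the Sitemap directive should carry a full URL). The blog's robots.txt reference is also, per the sitemaps.org cross-submission rules, what allows a sitemap hosted on www to list blog URLs:

```
# robots.txt at https://www.example.com/robots.txt
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap-www

# robots.txt at https://blog.example.com/robots.txt
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap-blog
```
-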
What is the best approach to a keyword that has multiple abbreviations?
I have a site for which the primary keyword has multiple abbreviations. The site is for the computer game "Football Manager"; each iteration is often referred to as FM2012, FM12 or Football Manager 2012, and the first two can also be written with or without a space. While this is only 3 keywords to target, it means that every key phrase, such as "FM2012 Tactics", must also be targeted in 3 ways. Is there a recommended approach to make sure that all 3 are targeted? At present I use the full title "Football Manager" in the title and try to use the shorter abbreviations in the page; I also make sure the title tags always have an alternative, e.g. FM2012 Tactics.

Two specific questions, as well as general tips:

1. Does the <abbr> HTML tag help very much?
2. Are results likely to differ much for searches for "FM 2012" and "FM2012", i.e. without the space?
Intermediate & Advanced SEO | freezedriedmedia
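On question 1, for reference, this is what marking up one of the variants with <abbr> looks like - the markup itself is standard HTML, but whether it carries any ranking weight is exactly what's being asked:

```html
<!-- <abbr> ties the short form to its expansion for browsers and
     assistive technology; its SEO value is unproven -->
<p><abbr title="Football Manager 2012">FM2012</abbr> tactics and guides</p>
```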