Google Search Quality Team - Commission Based Reviews
-
I have been busy this past week writing articles for various outlets about the recent Google update. A number of people contacted me about the analysis I was doing and the report.
Some were members of the Google Search Quality Team.
I knew manual reviews were done before - but after seeing the documents they showed me about the reports they do and the compensation for doing them, I am left pretty shocked.
Maybe I have been naive all these years, but I didn't realize that:
-
Google outsourced review and reconsideration requests to individual reviewers for compensation
-
Google's vetting of the qualifications and experience of these "reviewers" was insufficient at best.
The three contacts I spoke to who had done reports had very little training or experience. I went through the GSQT REVIEWERS PDF (a very long and thorough document) I was sent with them.
Together we went through some sites I wanted them to review, and the comments that came back were astounding to say the least, and would have made many of you Mozzers laugh.
Obviously I don't want to post said document online here....
BUT, I wanted to know if:
a) any Mozzers had ever been part of such a group - the GSQT
b) anyone had had any dealings with them - in terms of having your website reviewed and knowing about it.
I knew about this group way back - in 2005 or 2006, or sometime around then - and I was told at the time that it had been stopped and Google was no longer paying these subcontractor reviewers.
Please don't get me wrong here... totally on board with manual reviews...
I would just prefer them done by a trained team - either at a professional company that maintains high-quality review testing and standards, or, for that matter, by GOOGLE employees who were trained. I am just a little unsure about them being done by individual subbies who get paid by the number of reviews they do. What if that subbie has some skin in the game for a particular keyword?
What if their knowledge of certain aspects isn't up to par, or isn't tested on a regular basis? This space is always changing, and as you guys / girls on this forum know, it can change pretty quick.
I just want all websites to be judged fairly and equally, by a group trained EQUALLY and to the same standards. I don't care if this is a G team or not - I just want it to be a team that is trained equally and trained continuously, as opposed to outside people paid by the number of reviews done.
When the livelihood of a small business is in the balance, I don't want a commission-hungry toe rag with one year's experience being the gatekeeper for me or any of our clients.
Carlos
-
Just for clarification - the outsourced individuals I had heard about back in 2004-2006 were on the EWOQ team.
FYI: http://www.seroundtable.com/archives/006791.html
I had thought, though, that outsourcing the reviewing had stopped many years ago and that manual reviewing was now done internally by GOOGLE.
The documents (PDFs and such) that I was sent tonight by these reviewers were dated 2012. They are very lengthy documents on what to look for and what to penalize.
My conversations with said reviewers were quite amazing. Their lack of knowledge of what to look for and what constituted gaming the system (GH / BH as opposed to natural links, etc.) was laughable at best.
It just bothers me that a company that is one of the largest on this planet cannot have a team in house that is continually trained on reviewing and tested regularly.
Reviewing is important. It is necessary. It is critical for good search results.
All I ask is that it is done equally, fairly, and by a company that is itself tested regularly against quality standards - not by commission-based individuals. [falls off soap box - but then watches the Mayweather fight again :)]
Carlos