Did Google's Farmer Update Positively/Negatively Affect Your Search Traffic?
-
Oddly, the handful of sites I have that probably should have been affected negatively actually saw boosts in traffic, ad CTR, and ad eCPM. Not huge jumps, but I benefited, which was odd. These domains are testbeds I set up a long time ago to find the upper limit of what you can "get away with" in Google, so I know where to draw the line.
A few other interesting findings: I ran some tests over the weekend (the sample may not be large enough to be statistically significant yet), and it seems the Farmer update has almost no impact on the indexation of poor or duplicate content given enough raw link juice (no anchor text, IP diversity, or other clever factors, just a flat link from a big ol' bucket of link juice), which I find disappointing. =/ I expected a bit of a challenge after all this hoopla. Even though I rock the greyhat, I'm still pretty anti duplicate/crap content.
-
Awesome use of the new Q&A Rand!
One of my biggest content sites has actually seen an increase in traffic since the "farmer" update. The content on it is definitely a tick above content-farmed crap, but it's also not 5-star.
For what it's worth, it's monetized with AdSense ads, and there's really no branded traffic to speak of.
Related Questions
-
Google penalty removal expert questions
We have searched online for a Google penalty "expert" (individual or company) and have located what appear to be "experts". Please provide feedback on the following two individuals/companies we have found that can help with penalty removal. Have you or one of your clients used either of the "experts" below? What were the results? How many disavows and reconsideration requests did you/they have to make?

1. www.penaltypros.com - To give a quote and to see what your links are, they use links from Google Webmaster Tools only. Penaltypros.com disavows first and then removes bad links second. This is the opposite of what Google and SEOs recommend, but penaltypros.com claims 100% success using this non-traditional approach. See the imgur.com link for a screenshot.
2. http://www.hiswebmarketing.com/ - To give a quote and to see what your links are, they use links from https://ahrefs.com/ only.

Please provide any and all feedback on the above two "experts", and also post the websites, individual names, and company names of those you consider Google penalty removal "experts" so that we may obtain a quote from them.
Industry News | RetractableAwnings.com -
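For context on the disavow step mentioned above: a disavow file is just a plain-text list of `domain:` directives and individual URLs, with `#` comments. A minimal sketch of generating one in Python (the domains and URLs below are hypothetical placeholders, not real recommendations):

```python
# Sketch: writing a Google disavow file (plain text, one directive per line).
# All domains/URLs here are hypothetical examples.
bad_domains = ["spammy-links.example", "paid-blog-network.example"]
bad_urls = ["http://forum.example/profile/123"]

lines = ["# Disavow file prepared for reconsideration request"]
lines += [f"domain:{d}" for d in bad_domains]  # disavow every link from the whole domain
lines += bad_urls                              # disavow specific URLs individually

disavow_text = "\n".join(lines) + "\n"
with open("disavow.txt", "w") as f:
    f.write(disavow_text)
```

Whether a vendor disavows first and then requests link removals (or the reverse), the file they upload takes this same shape.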
Subdomain initials vs full city name(s) for a multi city subdomain site?
Helping with a multi-city non-profit magazine/news blog. Subdomain options: sf.domain.com, ny.domain.com, la.domain.com versus sanfrancisco.domain.com, newyork.domain.com, etc. Some cities that will be added later, seoul.domain.com for example, don't have recognizable initials the way NYC does. For branding, recognition, and SEO benefit, which have you used and why? Thanks
Industry News | vmialik -
Google Trusted Stores
Hello. We sell millions of dollars a month in merchandise; most of that comes from eBay transactions. We have a script that posts to eBay, and we download our transactions from eBay and process the orders from our admin. I feel we would do a lot better in the SERPs if we had the Trusted Stores quality signal. However, it comes down to the conversion pixel. Since customers don't pay through the site, do you think we can get away with sending an email that leads to a second conversion page for eBay transactions? Have any of you noticed a boost in the SERPs once you were approved for Trusted Stores? Any advice?
Industry News | joseph.chambers -
Does Google still have a standard search result? How can I get it?
I have heard a lot from the experts that there are no "standard" Google search results anymore. They say that most Google SERPs are customized/tailored for each individual, even when not logged in, via Google's personalization. My questions are: Is there still a way to retrieve the standard Google search result? How? Will these URL parameters be helpful when searching on Google?

- webhp?
- complete=0
- pws=0

(Referenced video: watch?v=B8ofWFx525s)
Industry News | RafaelRanada -
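Those parameters can be appended to a search URL by hand. As a sketch, assuming the classic google.com/search endpoint (note that pws=0 historically disabled personalized results and complete=0 disabled autocomplete, but Google does not guarantee these remove all customization, e.g. geolocation):

```python
from urllib.parse import urlencode

def depersonalized_search_url(query: str) -> str:
    """Build a Google search URL with the personalization (pws) and
    autocomplete (complete) parameters turned off."""
    params = {"q": query, "pws": "0", "complete": "0"}
    return "https://www.google.com/search?" + urlencode(params)

print(depersonalized_search_url("seo tools"))
# -> https://www.google.com/search?q=seo+tools&pws=0&complete=0
```

Searching from a logged-out browser in a private window, with such a URL, gets you as close to a "standard" result as personalization allows.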
Why Does OSE (Open Site Explorer) have such little backlink data on russian sites in the google.ru index?
OK, this seems very strange, but google.ru is indexing far more backlinks in its SERPs for a "widget" query than OSE reports. Very little data is found in OSE for Russian-based sites. Is this the marketing intention? (I could send raw data if needed!) What is filtering this vast google.ru data out? Is OSE only catered to the US/UK?
Industry News | Turkey -
What is the best method for getting pure Javascript/Ajax pages Indeded by Google for SEO?
I am in the process of researching this further and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions or add some insight would be appreciated.

Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and you can use meta nofollow tags etc. to prevent the crawlers from accessing the JavaScript versions of the pages. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.

Option 2: In order to make your AJAX application crawlable, your site needs to abide by a new agreement, which rests on the following: The site adopts the AJAX crawling scheme. For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results. In order to make this work, the application must use a specific syntax in the AJAX URLs (call them "pretty URLs"; you'll see why below). The search engine crawler temporarily modifies these "pretty URLs" into "ugly URLs" and requests those from your server. The request of an "ugly URL" indicates to the server that it should not return the regular web page it would give to a browser, but instead an HTML snapshot. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users always see the pretty URL containing a hash fragment. See more in the Getting Started Guide.

Make sure you avoid this: http://www.google.com/support/webmasters/bin/answer.py?answer=66355

Here are a few example pages that are mostly JavaScript/AJAX:
http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
https://www.pivotaltracker.com/public_projects
This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab

These are the best resources I have found regarding Google and JavaScript:
http://code.google.com/web/ajaxcrawling/ (step-by-step instructions)
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.google.com/support/webmasters/bin/answer.py?answer=35769
Industry News | webbroi -
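The pretty-URL/ugly-URL translation described in Option 2 is mechanical: under the AJAX crawling scheme, a "#!" (hashbang) URL is rewritten into a query-string form using `_escaped_fragment_`. A minimal sketch of that rewrite, from the crawler's side (the example.com URL is illustrative; note the scheme applies to "#!" fragments specifically, not plain "#" fragments):

```python
from urllib.parse import quote

def ugly_url(pretty_url: str) -> str:
    """Rewrite a '#!' (hashbang) AJAX URL into the '_escaped_fragment_'
    form a crawler requests under the AJAX crawling scheme."""
    if "#!" not in pretty_url:
        return pretty_url  # no hashbang fragment: nothing to rewrite
    base, fragment = pretty_url.split("#!", 1)
    sep = "&" if "?" in base else "?"  # append to any existing query string
    # Percent-encode the fragment value; '=' is left readable, matching
    # the scheme's published examples.
    return f"{base}{sep}_escaped_fragment_={quote(fragment, safe='=')}"

print(ugly_url("http://www.example.com/index.html#!key=value"))
# -> http://www.example.com/index.html?_escaped_fragment_=key=value
```

When your server sees a request containing `_escaped_fragment_`, that is its cue to return the pre-rendered HTML snapshot instead of the normal JavaScript page.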
Google International and National Algorithm
Hi guys, I have a question: do you have any experience with Google's national versus international ranking algorithms? For example, is the algorithm the same for google.de (Germany) as for google.com? A lot of tactics seem valid and effective on google.de but not effective on google.com. Do you have any idea? Please share your knowledge; we need your help!
Industry News | leadsprofi -
Google Directory no longer available?
Now we will never know what was in the Google Directory. I just clicked on the link, and everything is dead and points you to DMOZ. What does this mean for us? Is DMOZ going to get more editor juice, so submissions are actually reviewed for once? The Yahoo! directory has also been glitching: new submissions have been disabled for over a week now. Any comments?
Industry News | antidanis -