Manual action penalty by Google
-
Hello,
We have a big, well-known brand: www.titanbet.com. This brand is well established and the site has been live for almost 4 years now, ranking very well on some very strong KWs.
We received a message from Google on Aug 29th saying “Google has detected a pattern of artificial or unnatural links pointing to your site” and that “Google has applied a manual spam action to titanbet.com/”.
In the 2 weeks since the penalty was received we saw some of our major KWs drop in rankings, BUT all brand-related KWs still ranked 1st.
Over the last weekend the penalty worsened and we no longer rank on any of the brand KWs (we find the site on the 5th page at best).
Moreover, when searching in Google for a sentence from any page on the site, we see other sites ahead of us in the SERPs.
Based on the message we originally received from Google, we have started cleaning up some of the bad links to the site. We found a lot of links from bad sites; some of them are not indexed and are probably penalized as well, some are from affiliate websites, and some are from automatic indexation websites based in China and Russia.
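When working through a large backlink export, it usually helps to group links by linking host first so outreach can start with the worst offenders. A minimal sketch of that triage step; the CSV layout and domains below are hypothetical, not from any specific link tool:

```python
import csv
from collections import Counter
from io import StringIO

# Hypothetical backlink export: one linking URL per row under a "link_url" column.
sample_export = """link_url
http://spam-directory.example.ru/page1
http://spam-directory.example.ru/page2
http://affiliate.example.com/titanbet-review
"""

def count_links_by_domain(csv_text):
    """Tally backlinks per linking host so cleanup can start with the worst offenders."""
    counts = Counter()
    for row in csv.DictReader(StringIO(csv_text)):
        host = row["link_url"].split("/")[2]  # naive scheme://host/path split
        counts[host] += 1
    return counts.most_common()

print(count_links_by_domain(sample_export))
# [('spam-directory.example.ru', 2), ('affiliate.example.com', 1)]
```

Sorting by volume per host also makes the later domain-level disavow decisions easier.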
We have started reaching out to some of these sites to try and have them remove our links. We are also worried about duplication of our site. We have found that many other sites (mostly affiliate websites) have copied, and in some cases completely duplicated, our content, and Google for some reason has chosen to penalize us for this even though we have no control over these other sites. We have run Copyscape to try to figure out which pages are the most problematic, and we will try to rewrite the content on those pages. But what if the other sites copy us again?
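For a rough, local check of how closely a scraped page matches an original (a crude stand-in for a service like Copyscape), a simple sequence-similarity score works; the sample sentences are hypothetical:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Rough 0..1 duplicate score between two page texts."""
    return SequenceMatcher(None, a, b).ratio()

# Hypothetical page copy, for illustration only
original = "Titan Bet offers sports betting on football, tennis and more."
scraped = "Titan Bet offers sports betting on football, tennis and more!"
print(similarity(original, scraped) > 0.9)  # near-duplicates score high
```

Pages scoring near 1.0 against a scraper site are the ones worth rewriting first.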
Any suggestions on the above would be appreciated as we try to understand why Google has penalized us.
Thank you,
Titan Bet Team
-
The penalty is for links rather than thin or duplicate content, so at this stage I would focus primarily on your link profile. If your content is indexed first, with social signals pointing to it, then Google will (should) know you are the originator.
You will need to go systematically through every single link coming back to the site. If you have any URLs 301ed to your site, then those link profiles will need close examination too. If you have had a manual penalty you can't just get it mostly right; you have to completely clean out anything with a whiff of toxicity. Keep a close record of all your activities, possibly using a tool like Rmoov. When you have exhausted manual attempts, use Disavow, then file a reconsideration request. There are some good threads and posts on here covering all these aspects.
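On the Disavow step: the file Google's disavow tool accepts is a plain text file with one entry per line, `#` comments, and a `domain:` prefix to disavow an entire domain rather than a single URL. A minimal sketch; the domains and dates below are hypothetical placeholders, not real offenders:

```
# Contacted site owners 2012-09-10, no response
domain:spam-directory.example.ru
domain:auto-indexer.example.cn

# Individual pages we could not get removed
http://affiliate.example.com/titanbet-links.html
```

Keeping the outreach notes in the comments also doubles as the activity record to reference in the reconsideration request.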
Related Questions
-
Yelp (recrawl Google/Bing)
If Google and Bing show an older version of a site's Yelp rating in the search results, what options are there to help ensure Google and Bing recrawl the Yelp page? Additionally, it appears third-party sites such as MapQuest show Yelp ratings and appear in Google search results; is it possible to request MapQuest to recrawl Yelp and then ask Google to recrawl MapQuest? Any advice would be much appreciated!
-
Google Custom Search vendors and options
Hi everyone, We're in the process of finding someone who will be able to help set up Google Custom Search on our site and are having some trouble: most agencies we were hoping could help focus solely on Google Search Appliance, a hardware-specific approach that doesn't suit our needs. Specifically, we'd like to replace our current site search engine with Google Custom Search, as well as configure it as deeply as possible for the best search experience. I'm hoping people could give me some ideas on who might be able to help, or the best places to look. Thanks in advance!
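For what it's worth, the basic client-side embed for a Google Custom Search engine is small once the engine itself is configured in the control panel; this is a minimal sketch of the standard embed pattern, with YOUR_ENGINE_ID as a placeholder for the engine's cx value:

```html
<!-- YOUR_ENGINE_ID is a placeholder for the cx value from the Custom Search control panel -->
<script async src="https://cse.google.com/cse.js?cx=YOUR_ENGINE_ID"></script>
<div class="gcse-search"></div>
```

The deeper configuration (refinements, promotions, ranking tweaks) lives in the control panel rather than in the embed code, which is usually where an agency earns its keep.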
-
Google Site Warnings via Phone?
I received a voicemail earlier stating that "there are two issues with your company's current Google listing that we need to discuss with the business owner. it is very important that we talk as soon as possible. press 1 to speak with an agent immediately. press 8 if you have already verified your account information or if you are no longer in business and want to be removed from this list. thank you" That's it: no contact number, no reference to what listing or what type of listing (organic, Places, etc.). I checked GWT, GA, and the Gmail account; there are no warnings or messages in any of them. Has anyone else experienced this?
-
Did Google Search Just Get Crazy Local?
Hey All, I think it's a known fact at this point that when you are signed into a personal Google account while searching, the results are heavily oriented around keywords and phrases you have already searched for, as well as your account's perceived location; for instance, when I wanted to check one of my own web properties in SE listings I would sign out, or it would likely appear first and give a false reading.

Today I noticed something very interesting: even when not signed in, Google's listings were giving precedence to locality, and to a very extreme degree; when searching for "web design," a firm a mile away ranked higher than one 1.5 miles away, and so on. It would seem that algos with this high a level of location sensitivity and preference would actually be a boon for the little guys, which is, I assume, why it was implemented. However, it brings up a couple of interesting questions for me.

1. How is this going to affect Moz (or any SE ranking platform, for that matter) reports? I assume that Google pulls locations from IP addresses, so would it not simply pull the local results most relevant to the Moz server(s) IP?

2. What can one do to rise above this aggressive level of location-based search? I mean, my site (which has a DA of 37 and a PA of 48) appears above sites like webdesign.org (DA of 82, PA of 85). Not that I'm complaining at the moment, but I could see this being a fairly big deal for larger firms looking to rank on a national level. What gives?

I'd love to get some opinions from the community here if anyone else has noticed this...
-
Google Webspam Algo Update 24/4/12
Having just checked our clients' rankings, 95% have not been affected; in fact many have moved up in the rankings. 1 or 2 have had big drops 😞 Who has been affected by this? The forums are full of people talking about sites being floored from the SERPs. It will be interesting to follow the aftermath of this and get some insight into what exactly has changed!
-
Google Products / Google Shopping
My client has a site with products, a lot of which are so similar in function that for usability reasons we have combined some products on the same pages. We want to get into Google Shopping, but on the face of it the Google feed seems to want unique URLs per product. I guess we could keep products on the combined pages and then have single pages as well, though that could generate duplicate content. We could also try pointing several products to 1 URL; does anyone know if this would work? Or can anyone suggest any workarounds? Justin
-
What is the best method for getting pure Javascript/Ajax pages indexed by Google for SEO?
I am in the process of researching this further, and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions or add some insight would be appreciated.

Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and Meta NoFollow tags etc. to prevent the crawlers from accessing the Javascript versions of a page. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.

Option 2: In order to make your AJAX application crawlable, your site needs to abide by a new agreement, which rests on the following. The site adopts the AJAX crawling scheme. For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results. In order to make this work, the application must use a specific syntax in the AJAX URLs (let's call them "pretty URLs"). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. This request of an "ugly URL" indicates to the server that it should not return the regular web page it would give to a browser, but instead an HTML snapshot. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing a hash fragment.
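The pretty-to-ugly mapping in the scheme described above keyed on hash-bang ("#!") fragments, which the crawler rewrote into an `_escaped_fragment_` query parameter. A minimal sketch of that rewrite (the example URL is from the scheme's documentation style, not a real site):

```python
from urllib.parse import quote

def pretty_to_ugly(url):
    """Translate an AJAX 'pretty URL' (hash-bang fragment) into the 'ugly URL'
    a crawler would request under Google's AJAX crawling scheme."""
    if "#!" not in url:
        return url  # not opted in to the scheme; crawler requests it as-is
    base, fragment = url.split("#!", 1)
    separator = "&" if "?" in base else "?"
    return base + separator + "_escaped_fragment_=" + quote(fragment, safe="=")

print(pretty_to_ugly("http://www.example.com/index.html#!key=value"))
# http://www.example.com/index.html?_escaped_fragment_=key=value
```

The server, on seeing `_escaped_fragment_`, is what returns the HTML snapshot instead of the regular page.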
See more in the Getting Started Guide. Make sure you avoid this:
http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Here are a few example pages that are mostly Javascript/AJAX: http://catchfree.com/listen-to-music#&tab=top-free-apps-tab https://www.pivotaltracker.com/public_projects This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab These are the best resources I have found regarding Google and Javascript: http://code.google.com/web/ajaxcrawling/ (step-by-step instructions).
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some additional Resources: http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.google.com/support/webmasters/bin/answer.py?answer=35769
-
Google's Anonymous data sharing "pool"
Is sharing this information good for my websites? And is it open information that anyone could hack into and see my site's analytics? Bottom line: good or bad thing?