Google messages & penalties
-
I just read the following comment in a response to someone else's question. The responder is an SEOmoz Authority whose opinion I respect and have learned from (not sure if it's cool to mention names in a question), and it spurred my curiosity:
"...Generally you will receive a warning from Google before your site is penalized, unless you are talking about just specific keywords."
This is something I have been wondering about in relation to my own sudden ranking drop for 2 specific keywords, as I did not receive any warnings or notices. Because of an initial, smaller drop, I have been proceeding as if I had overused these keywords on my home page, but identifying the cause of the much larger drop still seems useful for a number of reasons. Can anyone explain this further?
-
Thanks, Colin. Your response clarifies "manual" warnings. I have wondered about the difference between "manual" penalties and receiving an email warning in GWT. I understand you to be saying those are one and the same. Yes?
I am still hoping to learn more about the methods by which a couple of specific keywords might be penalized if no warning was received.
I know this specific keyword issue is not related to inbound links in this case.
-
If you receive a warning in Google Webmaster Tools, that's seen as a manual penalty. If you lose rank for just a few keyword terms, it might be an "algorithmic" penalty, which just means something about your site tripped Google's algorithm. These types of penalties are difficult to assess.
Usually specific key phrases get penalized because of unnatural patterns in anchor text. I would look at your anchor text distribution in Open Site Explorer and see what you've got going on.
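If it helps, here's a minimal sketch of that kind of check in Node.js. It assumes you've exported your inbound links to a CSV (for example, from Open Site Explorer) with a header row and the anchor text in the first column; the file name and column position are assumptions about your export, not a fixed format:

```javascript
// Tally anchor-text frequency from a CSV export and print the top phrases.
// Assumes a header row and the anchor text in the first column (naive
// comma split, so anchors containing commas would need a real CSV parser).
const fs = require('fs');

const rows = fs
  .readFileSync('anchor-text-export.csv', 'utf8')
  .trim()
  .split('\n')
  .slice(1); // skip the header row

const counts = {};
for (const row of rows) {
  const anchor = row.split(',')[0].trim().toLowerCase();
  counts[anchor] = (counts[anchor] || 0) + 1;
}

// Print the ten most common anchors with their share of all links.
// One phrase holding an outsized share is the "unnatural pattern" to look for.
Object.entries(counts)
  .sort((a, b) => b[1] - a[1])
  .slice(0, 10)
  .forEach(([anchor, n]) => {
    const pct = ((n / rows.length) * 100).toFixed(1);
    console.log(`${pct}%  ${anchor}`);
  });
```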
Related Questions
-
High ranking nationally but not locally via Google
A website I am working on ranks very well for all tracked keywords at a national level, but not from a local standpoint on Google. I find it weird that the site is on the first page if you search from many other states/towns/locations, but not locally. I looked in Google Search Console and couldn't find any clue as to why this is happening. We figured we would clear the .htaccess file of any redirect issues and hope that fixes it. Suggestions, please? I've never seen Google do this. It is strange.
White Hat / Black Hat SEO | | SeobyKP1 -
Google Algorithm non-manual penalty. How do we fix this (quality?) drop?
Hi, See attached image. We received a non-manual penalty on March 22, 2015. I don't think we ever came out of it. We have moved up due to the Penguin update, but by DA/PA we should be on the first page for tons of terms, and most keywords rank lower than their true strength. What kind of quality errors could be causing this? I assume it was a quality update. I am working on the errors, but don't see anything that would be severe enough to be penalized. What errors/quality problems am I looking for? We have tons of unique content, good backlinks, good design, and a good user experience except for some products. Again, what am I looking for? Thanks. non-manual-penalty.png
White Hat / Black Hat SEO | | BobGW0 -
Google URL Shortener- Should I use one or multiple???
I have a client with a number of YouTube videos. I'm using Google URL Shortener to allow the link to show in the YouTube description text (as it's a long URL). Many of these links go to the same page, e.g., .com/services-page. Should I use a single short URL for every video linking to .com/services-page, or should they be unique each time? If unique, would Google possibly think I'm trying to manipulate results? Thanks in advance. I'm just not sure on this one and hope someone knows the best practice. Thanks!
White Hat / Black Hat SEO | | mgordon1 -
HOW!??! Homepage Ranking Dropped Completely out of Top 100 on Google....
So I'm competing for a very competitive keyword, and I've been at the bottom of page 2 for a while now, ranking with my homepage, which is very content-rich and has GREAT links pointing to it. Out of nowhere, last week I dropped completely out of the top 100 or so, yet one of my article posts now ranks on page 6 or so for the same keyword. I have great authoritative links, my on-page is spot on, and all of my articles are very high quality. I don't understand how my homepage, which ranked on page 2 for the main keyword for months, can just completely drop out of the top 100.... Can anyone help provide some insight?
White Hat / Black Hat SEO | | juicyresults0 -
Recovering from Google Penguin/algorithm penalty?
Anyone think recovery is possible? My site has been in Google limbo for the past 8 months to a year or so. Like a lot of sites, we had SEO work done a while ago and ended up with tons of links that Google now looks down on. I have been working with an SEO company for a few months now, and they seem to agree Penguin is the likely culprit; we are on pages 8-10 for keywords we used to be on page 1 for. Our site is informative and has everything intact. We deleted whatever links we could; some sites are hard to find contact information for, and some sites want money. I paid a few of them a couple of bucks in hopes it might help the process. Anyway, we now have around 600-something domains on the disavow file we put up in March-April, with around 100 or 200 added recently as well. If need be, a new site could be an option, but I will wait and see if the site can improve on Google with a refresh. Anyone think recovery is possible in a situation like this? Thanks
White Hat / Black Hat SEO | | xelaetaks0 -
How does Google determine if a link is paid or not?
We are currently doing some outreach to bloggers to review our products and provide us with backlinks (preferably followed). The bloggers get to keep the products (usually about $30 worth). According to Google's link schemes guidelines, this is a no-no. But my question is, how would Google ever know whether the blogger was paid or given freebies for their content? This is the "best" article I could find on the subject: http://searchenginewatch.com/article/2332787/Matt-Cutts-Shares-4-Ways-Google-Evaluates-Paid-Links The article tells us what qualifies as a paid link, but it doesn't tell us how Google identifies whether links were paid or not. It also says that "loans" are okay, but "gifts" are not. How would Google know the difference? For all Google knows (maybe everything?), the blogger returned the products to us after reviewing them. Does anyone have any ideas on this? Maybe Google watches for terms like "this is a sponsored post" or "materials provided by 'x'". Even so, I hope that wouldn't be enough to warrant a penalty.
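Purely to illustrate that last guess (and emphatically not Google's actual method), here is a toy sketch of what phrase-based disclosure detection could look like; the phrase list is entirely assumed:

```javascript
// Toy disclosure detector: flags page text containing common sponsorship
// phrases. The phrase list is a guess for illustration only.
const DISCLOSURE_PHRASES = [
  'this is a sponsored post',
  'materials provided by',
  'products were provided for review',
  'in exchange for this review',
];

function mentionsSponsorship(pageText) {
  const text = pageText.toLowerCase();
  return DISCLOSURE_PHRASES.some((phrase) => text.includes(phrase));
}

// Example: a disclosed review would be flagged, an undisclosed one would not.
console.log(mentionsSponsorship('Disclosure: this is a sponsored post.')); // true
console.log(mentionsSponsorship('We loved this blender!'));                // false
```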
White Hat / Black Hat SEO | | jampaper0 -
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company, and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a JavaScript client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all the content is served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this as being "safe".

Read about LinkedIn switching over to Dust.js:
http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates
http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more

Explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."

Basically, what would happen on the backend of our site is that we would detect the user-agent of all traffic, and once we identified a search bot, serve our web pages server-side to the bots instead of client-side so they can index our site. Server-side and client-side content will be identical, and there will be NO black-hat cloaking going on. But this technique is cloaking, right?

From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."

Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355

Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side fallbacks are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
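To make the setup concrete, here is a minimal sketch of the user-agent approach using Express and the dustjs-linkedin package. The bot pattern, the route, and the inline template are illustrative assumptions, not the actual implementation; the key point is that both branches render the exact same template and data:

```javascript
// Serve the same Dust template server-side to crawlers and client-side to
// browsers. Both branches use identical content, per the approach above.
const express = require('express');
const dust = require('dustjs-linkedin');

// Compile and register the template once at startup
// (in a real app it would be loaded from a .dust file on disk).
dust.loadSource(dust.compile('<h1>{title}</h1><p>{body}</p>', 'product'));

const BOT_PATTERN = /googlebot|bingbot|slurp/i; // assumed list, not exhaustive

const app = express();

app.get('/product/:id', (req, res) => {
  const context = { title: 'Example product', body: 'Same data either way.' };

  if (BOT_PATTERN.test(req.get('User-Agent') || '')) {
    // Crawler (or other non-JS client): render the template server-side.
    dust.render('product', context, (err, html) => {
      if (err) return res.sendStatus(500);
      res.send(html);
    });
  } else {
    // Browser: send a shell page plus the raw data; the same Dust template,
    // shipped to the client, renders it in the browser.
    res.send(
      `<div id="app"></div><script>window.__DATA__ = ${JSON.stringify(context)};</script>`
    );
  }
});

app.listen(3000);
```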
White Hat / Black Hat SEO | | Bodybuilding.com0 -
Passing page rank with frames - Is this within Google Guidelines?
It appears this site is gaming Google for better rankings. I haven't seen a site do it this way before. Can you tell me what enables it to get such good rankings, and whether what they are doing is legitimate? The site is http://gorillamikes.com/ Earlier this year this site didn't show up in the rankings for terms like "Cincinnati tree removal" and "tree trimming Cincinnati", etc. For the last few months, they have been ranking #1 or #2 for these terms. The site has a huge disparity between MozRank (8, very low) and PageRank (6, high). The only links to this page come from the BBB. However, when you look at the source code, you find that 100% of what is displayed on the site comes from a page on another site via a frame. The content is here: http://s87121255.onlinehome.us/hosting/gorillamikes/ When I go to onlinehome.us, I'm redirected to http://www.1and1.com/. I'm only speculating, but my guess is that onlinehome.us has a high PageRank that it is passing to http://gorillamikes.com/, enabling Gorilla Mikes to achieve a PR of 6. Does this make sense? In addition, the content is over-optimized for the above terms (they use "Cincinnati (Cincinnat, OH)" in the first three H2 tags on the page), and all of the top menu links result in 404 errors. Are the tactics this site is using legitimate? It appears that everything they're doing is designed to improve search results, and not in ways that are helpful to users. What do you think?
White Hat / Black Hat SEO | | valkyrk0