Diagnosing a likely Penguin Penalty that's never been recovered from
-
The context: my market
Here, for reference, is what I’d like to see with my website (New York Jazz Events), and what I think I deserve to see: https://www.dropbox.com/s/1gf2ajw80iciqii/Screenshot 2015-11-27 12.09.08.png?dl=0
Intrigued by that screenshot? Please read further!
I have only a few competitors in my market (jazz bands offered in the city of New York for corporate events and weddings): Gigmasters, Gigsalad, and Thumbtack. (Each of those three, by the way, is a much more general site than mine (they offer everything from musicians to jugglers), and should be behind me if one is ranking based on quality and relevance.) Of the next nearest type of competitor, single, individual jazz bands (which also should be behind me if one is ranking based on quality and relevance), there are a dozen or so.
The context: my plans
No matter what, at the least I’m going to be doing a complete modernization and redesign of my site soon.
Please refer to the following screenshot of my Google organic traffic throughout the life of my site while reading the account that follows: https://www.evernote.com/l/AAOQpSw8Hn9DGpCQAt5onH9WMBiwGTDcCk8
What I’d like to find out: exactly what caused the Penguin penalty (if there was one); exactly what would remove it and restore my site to its previous standing.
You can see that when my site launched, it took only four months (12/10-4/11) for it to consistently, and seemingly effortlessly, rank 5th or 6th in Google for the most important keyword combinations related to my industry (such as “jazz band new york,” “jazz trio new york,” “jazz wedding new york”). That's for a new site with no backlinks. From this I inferred that there is little to no direct competition in this market (i.e., jazz bands in New York marketed specifically for weddings and corporate events).
Then, around November of 2013, I paid for some bogus links (51, to be exact) to these keyword combinations in order to improve the ranking of my site. This worked briefly (see the Google Analytics screenshot, January 13) until Penguin launched the following spring, at which point my site was essentially removed from the search results altogether due to an apparent algorithmic (not manual) penalty, which I presumed was due to these links (although I could be wrong; it could have been penalized for something else that I don’t understand).
After removing most of the bad links (down to 3 from 51, see https://www.dropbox.com/s/kolb665rth47q11/bad links 2013-10-24 explorer.numbers?dl=0) and disavowing all the offending URLs, and after Penguin updated to 3.0, Google still failed to recognize my site, with one odd exception: in the fall of 2014 it began to place the keyword combination "jazz bands new york" ("bands" plural, not singular) back on page one, and tied it to a completely undeveloped Google Plus page with zero reviews, which it displayed simultaneously (the “knowledge graph”? or “maps listing”?). (Google works in some strange and not very intelligible ways. For example, if a searcher removed the “s” from “bands,” the site remained banished from the results altogether. The same is true for every other keyword variation.)
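For anyone following along who hasn't been through this process: the disavow step means uploading a plain-text file through Google Search Console's disavow tool. The format is simple: one `domain:` line or full URL per line, with `#` comments allowed. The domains below are made-up placeholders, not my actual offending link sources:

```text
# disavow.txt - submitted via Google Search Console's disavow tool
# (all domains/URLs here are hypothetical examples)

# Disavow every link from an entire spammy domain:
domain:cheap-seo-links.example.com
domain:paid-link-directory.example.net

# Or disavow a single offending page:
http://blog-network.example.org/jazz-band-new-york-links.html
```

Note that the file is a request, not a command: Google treats disavowed links as a strong suggestion to ignore them, and (at least pre-real-time Penguin) the effect was only expected to show up after a subsequent algorithm refresh.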
Encouraged by this unexpected development, last winter (2014-15) and spring (2015) I developed my Google Plus Local Business page with lots of useful videos and photos, increased the review count from 0 to 13 (all real and all five-star, by the way), linked my YouTube page to it, and, on Google’s advice and against my better judgment, closed down my other Google Plus Local Business pages related to the other business services I market on the web (I’m a graphic designer and videographer in addition to being a bandleader). (Unhelpfully, Google keeps them in the search results but just marks them as “closed.” Thanks so much, Google. I probably could have left them up.) I also made a massive effort to clean up my local directory listings as far as possible: removing listings for my competing businesses (again, against my better judgment), making the format of my business address and contact information consistent where I could (I'm a service business and so hide my full address when possible, but this is not always possible depending on the policies of the particular citation website, hence some inconsistencies), and adding this information to the footer of all the pages on my site.
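As a side note, one common way to make that footer contact information unambiguous to search engines is schema.org structured data. This is only a sketch with placeholder details (the URL, phone number, and address fields below are hypothetical, not my real ones), and a service business can legitimately omit the street address, just as I do in my citations:

```html
<!-- Hypothetical schema.org markup; url/telephone/address values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "New York Jazz Events",
  "url": "https://www.example.com/",
  "telephone": "+1-212-555-0100",
  "areaServed": "New York, NY",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "New York",
    "addressRegion": "NY"
  }
}
</script>
```

The idea is that a consistent machine-readable name/phone/locality block on every page gives Google one canonical version of the business identity, regardless of how individual citation sites format the listing.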
After making these improvements, rather than my rankings improving, my site was entirely removed from the first several pages of Google’s search results, including for the keyword combination "jazz bands new york." On occasions when my site could be located (several pages down), it was no longer associated with my Google Plus Local Business page, unless one searched specifically for my site’s name, New York Jazz Events (which nobody does, because 99.9% of people searching on Google don't know my business name).
Some questions this raised in my mind: Why did Google make a link between my site and my Google Plus Local Business page back when the page was undeveloped? Why did Google then break that link (stop associating my website with the business page (or knowledge graph, or maps listing, whichever it is now), apart from the exception noted above) once the Google Plus Local Business page was developed? And indeed, why wouldn't developing that page, along with cleaning up my citations, logically result in more search term combinations bringing my results back to the first page, along with the link to the Google Plus Local Business page, rather than the opposite?
Then, unexpectedly, this last November my website's rank for "jazz bands new york" in Google briefly returned from "buried" all the way to #1! And the first page of the search results was dominated by my site in three places, all #1: the top spot for paid ads (as usual), the very top of the natural search results (first time ever), and the top and only local listing, on the right! I was even ahead of two giant national corporate competitors, which seems impossible to me, as they probably have thousands of backlinks. I basically “owned” page one of Google to an extent I’ve never seen for anyone before. It was actually a bit bizarre.
You can see this here: https://www.dropbox.com/s/1gf2ajw80iciqii/Screenshot 2015-11-27 12.09.08.png?dl=0
Now, what is also bizarre is that, as before, I was still buried for every other keyword combination that's relevant to my site, including extremely similar combinations (for example, substituting "band" for "bands," or "NYC" for "New York," etc.). These keyword combinations essentially return the exact same results, only with my site missing from organic and local.
As I mentioned, these astonishing results were temporary, and now my site is again buried for all keyword combinations including the once and sometimes astonishingly-performing “jazz bands new york.”
Something else interesting and relevant to this conundrum: I’ve done searches for all three of my major keyword search terms in Bing, and guess what? My site is in the top three organic results for two out of the three search terms, with my Bing local listing right up there, and my other website (NYCJazz.com) not far behind!
Now, it's strange to me that these incredibly great (and, as far as I'm concerned, high quality) Bing rankings lead to no inquiries, and that nearly all of my customers find me from my paid advertising in Google, but that's another bafflement for another day. What is relevant to this discussion is that my Bing results make the essential invisibility of my website and my local business listing in Google's natural results all the more baffling. One could speculate that Google is a more sophisticated search engine and is returning more relevant results, except that that's not true… my site is in fact the most relevant for those terms (or at least, to be generous, in the top few in terms of relevance). And in the past, before Penguin, it used to be in the top few results in Google, just like in Bing. It's hard for me to swallow that I'm just lacking in proper SEO, when my site used to rank great, when I've subsequently been working hard to further improve its SEO for years, and when it's a top site everywhere else. Something has to be up with Google… I wish I knew what it was and what I could do…
What I have done already:
I’ve worked hard over the last five years cleaning up bad backlinks and making citations consistent. I think I already understand my most important keywords well, and have my pages optimized for them. I understand on-page optimization and think my site’s in pretty good shape in that regard (and I will further improve the on-page optimization when I redesign it very soon). It could use more good backlinks, but that’s a problem for the future as far as I’m concerned, and not related to the penalty in any case. I understand AdWords well and my ad is at the top of the search results consistently for all relevant keywords, so I don’t need any help there…
Anyone who may have any insight to this… thanks very much in advance!
-
For the reasons given in my last post (May 2), could someone from the staff please mark this discussion as "not answered" rather than "answered"? Thanks.
-
I really appreciate Laura's advice, but I would rather this not be marked "answered," because the essential questions still aren't answered. It may be that they are unanswerable and I just need to scrap the site and start over from scratch.
Regarding the Penguin update, Penguin was already updated with 3.0 back in 2014 but that did not correct my site's tanking. Maybe a further update will, but that remains to be seen and I remain skeptical based on 3.0 having had no effect.
Regarding Google Plus, I know it was separated from Google Local, but when I say I developed my Google Plus page I'm primarily referring to the 13 reviews that were added and that occasionally display nowadays with the knowledge graph or maps or Google Local or whatever it should be called now; perhaps it would have been clearer if I had referred to Google Local rather than Google Plus. It's the Google Local features (map, reviews, business hours, etc.) that are randomly being associated with and disassociated from my site in the organic results.
Regarding acquiring more legitimate backlinks, I'm not denying their value as something always worth pursuing that is certain to help rankings in general; I'm just setting them aside for now for the sake of this discussion, as they were not a causal factor in my site’s tanking, because my site ranked toward the top before Penguin without them. And I've done plenty of additional SEO work since then, including creating and cleaning up citations throughout the web, that's had little effect… In fact, nothing I do or have done seems to have much effect… that's the problem. I’m aware that I always can and should do more… but that doesn’t explain my site’s curse.
My best hypothesis is that maybe some aspect of Penguin caused Google to no longer be able to discern the identity of my site (see [https://mza.bundledseo.com/community/q/google-can-t-discern-the-identity-of-my-site]). But that’s just a hypothesis; I haven’t really figured out an explanation that makes sense to me.
-
Thanks for your advice, Laura.
-
First of all, we are all rather impatiently awaiting the next Penguin update, which is long overdue and has been expected any day now for the last few months. It's been a while since the last one. If your website traffic did tank as a result of Penguin, you'll see the results of your clean-up efforts soon(ish???).
Secondly, your experiences with Google Plus/Google Local are probably related to the fact that Google separated the two over the last year. Read more at https://www.searchenginejournal.com/google-kills-off-google-local-still-committed-local-search/146049/.
Thirdly, I disagree with your conclusion that building good backlinks to your site is "a problem for the future as far as I’m concerned, and not related to the penalty in any case." If your site has not incurred a manual penalty, meaning that it's algorithmic in nature, you'll help offset any negative effects of the bad backlinks by increasing the number of great backlinks to your site. If your site is set up with Google Search Console, you'll know if you have a manual penalty.