Penguin 2.0 drop due to poor anchor text?
-
Hi,
My website experienced a 30% drop in organic traffic following the Penguin 2.0 update. After years of designing my website with SEO in mind, generating unique content for users, and focusing only on relevant websites in my link building strategy, I'm a bit disheartened by the drop in traffic.
Having rolled out a new design of my website at the start of April, I suspect that I've accidentally messed up the structure of the site, making it difficult to crawl or making Google think it's spammy. Looking at Google Webmaster Tools, the number one anchor text for the site is "remove all filters" - which is clearly not what I want! The "remove all filters" link appears when my hotels page loads with filters, sorting, or availability dates in place - I included it to make it easy for users to view the complete hotel listing again. An example of this link is towards the top right-hand side of this page:
http://www.concerthotels.com/venue-hotels/agganis-arena-hotels/300382?star=2
With over 6,000 venues on my website, this link has the potential to appear thousands of times, and while the anchor text is always "remove all filters", the destination URL differs depending on the venue the user is looking at. I'm guessing that to Google, this looks VERY spammy indeed!?
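One common way to keep those thousands of filtered URL variants out of the crawl altogether - not something raised in this thread, just a sketch - is to block the filter parameters in robots.txt. Only "star" is confirmed by the example URL above; "sort" and "checkin" are assumed parameter names, and the real ones can be checked in Webmaster Tools' URL Parameters report:

```
# Sketch only - "sort" and "checkin" are assumed parameter names.
# Googlebot honours * wildcards, so each rule blocks any URL
# carrying that query parameter.
User-agent: Googlebot
Disallow: /*?*star=
Disallow: /*?*sort=
Disallow: /*?*checkin=
```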
I tried to make the filtering/sorting/availability pages less visible to Google's crawl when I designed the site, through the use of forms, jQuery, JavaScript, etc., but it does look like the crawler is managing to reach these pages and find the "remove all filters" link. What is the best approach to take when a standard "clear all..." type link is required on a listing page, without making the link appear spammy to Google? It's a link that exists purely to benefit the user - not to cause trouble!
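The client-side approach described above can be sketched roughly as follows - a minimal illustration, not the site's actual code; the parameter names other than "star" are assumptions. Building the link in JavaScript after page load keeps it out of the served HTML, and adding rel="nofollow" is a cheap extra safeguard:

```javascript
// Sketch: derive the "remove all filters" destination by stripping the
// assumed filter/sort parameters from the current listing URL.
function removeFilterParams(url) {
  const u = new URL(url);
  ["star", "sort", "checkin"].forEach((p) => u.searchParams.delete(p));
  return u.toString();
}

// Inject the link after page load so it is absent from the raw HTML;
// rel="nofollow" also tells Google not to treat it as an endorsement.
function injectClearLink(container) {
  const a = document.createElement("a");
  a.href = removeFilterParams(window.location.href);
  a.rel = "nofollow";
  a.textContent = "Remove all filters";
  container.appendChild(a);
}
```

Note that Google does execute some JavaScript, so injection alone is no guarantee the link stays invisible - the nofollow is the more reliable signal.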
My final question is: do you think this one sloppy piece of work could be enough to cause my site to drop significantly following the Penguin 2.0 update, or is it likely to be a bigger problem than this? And if it is due to this piece of work, is it likely that solving the problem will result in a prompt rise back up the rankings, or will there be a black mark against my website going forward that slows down recovery?
Any advice/suggestions will be greatly appreciated,
Thanks
Mike
-
Go to Majestic SEO and type your URL in. If the keywords you got penalized for are over 10% diversity, you are generally being penalized - there are a few exceptions, but not many. I analyzed 440 sites and found that the highest was 2.47 for a site that didn't have keywords in the URL.
Also, I suggest you read this: http://dailyseotip.com/what-other-marketing-firms-want-you-to-believe-that-isnt-true/3356/ - I see that you are really focused on on-page SEO, and I think it will help you understand more.
The next thing you may want to do is start contacting admins and removing low-quality links if you have them. Use OSE (Open Site Explorer) to identify the low-quality links. There are only a handful of directories I recommend out there. I have a message from Google telling one of my clients to get rid of their directory links - the example link it cited came from a directory site, to be exact. Never use a keyword as anchor text at a directory site; always use your brand name or your URL.
Make sure the disavow tool is your last resort, and I highly suggest you get someone with experience to do it. Many have messed this up and really hurt their websites.
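For reference, the disavow tool takes a plain text file upload; a minimal sketch of the format (the hosts below are placeholders, not links actually pointing at the site):

```
# Disavow file sketch - all hosts below are placeholders.
# Lines starting with "#" are comments; "domain:" discards every link
# from that host; a bare URL discards links from just that one page.
domain:spammy-directory.example.com
http://other-directory.example.net/listings/page.html
```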
Have a great day.
-
Hi Mat,
Thanks for your reply. I'll definitely change the link, but I agree that it would be harsh if it were the sole reason for the 30% drop in organic traffic.
There are definitely some directories linking to ConcertHotels.com - at one stage I used the SEOmoz list of directories and got my website listed on some of the recommendations from the list. But my strategy for the last two years has been to approach venues' own websites and ask if they'd be interested in linking to our nearby-hotels page as a useful resource for their visitors. This strategy has worked quite well for me, and it sounds like a very natural, sensible link building strategy. I'll certainly work through my list of backlinks, but I would hope that the majority are from very relevant websites (due to the strategy I adopted). I guess there could be a percentage that I have had no control over, however - should I disavow these?
As for the directories, should I now be disavowing directory links? I didn't think the percentage of directory links to my site would be that high. I used the directory link strategy in the past simply to increase the number of links to my homepage - the strategy I described above achieves links to specific pages throughout my website, not the homepage, so I felt the need to grow the number of homepage links.
Thanks again for your help and advice
Mike
-
That link is not ideal, but I really do not believe that it would cause the sort of drop you are talking about.
If you think you have been hit by Penguin 2.0, then I'd start looking at your backlinks with a critical eye. I just stuck your domain into Majestic SEO and hit a lot of questionable directories pretty quickly. That might be unfair - I certainly haven't analysed it in any depth. However, I took 10 domains at random and 9 were sites that, at best, are not helping you much.
If you're looking for a cause of a drop I'd say you could do worse than going through your backlink profile.