Penguin 2.0 drop due to poor anchor text?
-
Hi,
my website experienced a 30% drop in organic traffic following the Penguin 2.0 update. After years of designing the site with SEO in mind, generating unique content for users, and focusing only on relevant websites in my link building strategy, I'm a bit disheartened by the drop in traffic.
Having rolled out a new design at the start of April, I suspect that I've accidentally messed up the structure of the website, making it difficult to crawl or making Google think it's spammy. Looking at Google Webmaster Tools, the number one anchor text on the site is "remove all filters" - which is clearly not what I want! The "remove all filters" link appears whenever my hotels page loads with filters, sorting, or availability dates in place - I included it to make it easy for users to get back to the complete hotel listing. An example of this link is towards the top right hand side of this page:
http://www.concerthotels.com/venue-hotels/agganis-arena-hotels/300382?star=2
With over 6000 venues on my website, this link has the potential to appear thousands of times, and while the anchor text is always "remove all filters", the destination URL will be different depending on the venue the user is looking at. I'm guessing that to Google, this looks VERY spammy indeed!?
I tried to make the filtering/sorting/availability pages less visible to Google's crawler when I designed the site, through the use of forms, jQuery, JavaScript, etc., but it does look like the crawler is managing to reach these pages and find the "remove all filters" link. What is the best approach when a standard "clear all..." type link is required on a listing page, without making the link appear spammy to Google? It's a link that exists purely to benefit users - not to cause trouble!
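On the crawling side, one common approach is to keep the link in place for users but stop Google from crawling the filtered URLs at all. A minimal sketch, assuming (based on the example URL above) that filter, sort, and availability states always appear as query-string parameters under /venue-hotels/ - the path pattern is an assumption, not a recommendation sight unseen:

```text
# robots.txt - block crawling of any venue-hotels URL that carries query parameters
User-agent: *
Disallow: /venue-hotels/*?
```

Alternatives that keep the pages crawlable: add rel="nofollow" to the "remove all filters" anchor itself, or put a rel="canonical" on each filtered URL pointing back to the clean listing page so the variants consolidate to one URL.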
My final question to you guys is: do you think this one sloppy piece of work could be enough to cause my site to drop significantly following the Penguin 2.0 update, or is the problem likely to be bigger than this? And if it is down to this piece of work, is fixing it likely to result in a prompt rise back up the rankings, or will there be a black mark against my website going forward that slows down recovery?
Any advice/suggestions will be greatly appreciated,
Thanks
Mike
-
Go to Majestic SEO and type your URL in. If the keywords you got penalized for make up over 10% of your anchor text profile, you are generally being penalized - there are a few exceptions, but not many. I analyzed 440 sites, and the highest figure I found was 2.47% for a site that didn't have keywords in the URL.
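That 10% threshold is easy to sanity-check against your own data. A minimal sketch in Python, assuming you can export your backlinks (from Majestic, OSE, or similar) with one anchor text per row; the CSV filename and column name are illustrative assumptions, so adjust them to whatever your tool produces:

```python
import csv
from collections import Counter

def anchor_text_share(anchors):
    """Given an iterable of anchor-text strings, return each anchor's
    percentage share of all backlinks, highest first."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return [(anchor, 100.0 * n / total) for anchor, n in counts.most_common()]

def shares_from_csv(path, column="AnchorText"):
    """Load anchors from a backlink CSV export.  The column name is an
    assumption - rename it to match your export."""
    with open(path, newline="", encoding="utf-8") as f:
        return anchor_text_share(row[column] for row in csv.DictReader(f))

# Usage (assuming an export named backlinks.csv):
# for anchor, pct in shares_from_csv("backlinks.csv"):
#     if pct > 10.0:
#         print(f"{anchor}: {pct:.1f}% - worth a closer look")
```

Anything over the 10% line (especially an exact-match money keyword, or in Mike's case "remove all filters") is a candidate for closer review.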
Also, I suggest you read this http://dailyseotip.com/what-other-marketing-firms-want-you-to-believe-that-isnt-true/3356/ I see that you are really focused on Onpage SEO. I think this will help you understand more.
The next thing you may want to do is start contacting admins and getting low quality links deleted if you have them. Use OSE to identify the low quality links. There are only a handful of directories I recommend out there. I have a message from Google telling one of my clients to get rid of their directory links - the example link Google cited was coming from a directory site, to be exact. Never use a keyword as anchor text at a directory site; always use your brand name or your URL.
Make sure the disavow tool is your last resort, and I highly suggest you get someone with experience to handle it. Many have messed this up and really hurt their websites.
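If you do end up there, the file Google's disavow tool expects is just a plain-text list, one entry per line: a `domain:` line disavows every link from that domain, a bare URL disavows that one page, and `#` lines are comments. The domains below are placeholders, not real recommendations:

```text
# Owner contacted twice via email, no response
domain:low-quality-directory-example.com

# Disavow a single URL rather than the whole domain
http://example-site.com/links/page.html
```

Documenting your outreach attempts in the comments is a common convention, since the same history is useful if you ever file a reconsideration request.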
Have a great day.
-
Hi Mat,
thanks for your reply. I'll definitely change the link, but I agree that it would be harsh if it was the sole reason for the 30% drop in organic traffic.
There are definitely some directories linking to ConcertHotels.com - at one stage I used the SEOmoz list of directories and got my website listed on some of its recommendations. But my strategy for the last two years has been to approach venues' own websites and ask if they'd be interested in linking to our nearby-hotels page as a useful resource for their visitors. This strategy has worked quite well for me, and it sounds like a very natural, sensible link building strategy. I'll certainly work through my list of backlinks, but I would hope that the majority are from very relevant websites (due to the strategy I adopted). I guess there could be a percentage I've had no control over, however, and I should probably disavow those?
As for the directories, should I now be disavowing directory links? I didn't think the percentage of directory links to my site would be that high. I used the directory strategy in the past simply to increase the number of links to my homepage - the strategy I described above achieves links to specific pages throughout my website, not the homepage, so I felt the need to grow the number of homepage links.
Thanks again for your help and advice
Mike
-
That link is not ideal, but I really do not believe that it would cause the sort of drop you are talking about.
If you think you have been hit by Penguin 2.0, then I'd start looking at your backlinks with a critical eye. I just put your domain into Majestic SEO and hit a lot of questionable directories pretty quickly. That might be unfair - I certainly haven't analysed it in any depth. However, I took 10 domains at random, and 9 were sites that, at best, are not helping you much.
If you're looking for a cause of a drop I'd say you could do worse than going through your backlink profile.