$100 to whoever discovers why our rankings dropped
-
I'm offering $100 to the SEO who pinpoints why our rankings dropped. Here are the details:
Some very good people have this site:
nlpca(dot)com
and it has dropped for many of its keywords, including the keywords
"NLP"
"NLP Training"
and many other keywords.
We dropped from 19th to 42nd for the term "NLP".
Here's what I'm doing about it:
(1) making sure that on every page, the keywords in the title reflect what's in the content, and that each keyword appears verbatim in the content at least three times.
(2) making sure that on every page, the keywords in the URL reflect what's in the content, with the same three-or-more rule.
(3) redoing the home page as in (1) above.
(4) fixing the 404s.
(5) shortening the titles that are too long. We're also thinking of reducing the home page keyword count to three keyword phrases, although four keywords work on all of our other sites where the keywords appear at least three times in the content.
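For what it's worth, checks (1) and (2) can be scripted rather than eyeballed. Here's a minimal sketch using only Python's standard library, assuming the title holds a comma-separated keyword list; the sample HTML below is made up for illustration, not taken from nlpca.com:

```python
# Verify the rule from steps (1)-(2): each title keyword should appear
# verbatim in the page body at least 3 times.
import re
from html.parser import HTMLParser

class PageText(HTMLParser):
    """Collect the <title> text and the visible body text separately."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.text = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        else:
            self.text.append(data)

def keyword_counts(html, min_count=3):
    """Return {phrase: (occurrences, meets_min)} for each comma-separated
    phrase in the title, counted against the body text."""
    p = PageText()
    p.feed(html)
    body = " ".join(p.text).lower()
    report = {}
    for phrase in (s.strip().lower() for s in p.title.split(",") if s.strip()):
        count = len(re.findall(re.escape(phrase), body))
        report[phrase] = (count, count >= min_count)
    return report

html = """<html><head><title>NLP Training, NLP</title></head>
<body><p>NLP training from certified NLP trainers. Our NLP training
covers core NLP skills; NLP training schedules are listed below.</p>
</body></html>"""
print(keyword_counts(html))
```

A real audit would fetch each URL and also check the URL slug against the body, but the counting logic is the same.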
If it's something else and you pinpoint it, and if because of you we rise back up to around 19th (more or less) again, then we'll give you $100 payable via PayPal as a thank-you.
I'm going to leave this question 'unanswered' until this is resolved.
-
Sorry, I don't remember 100% what I was thinking at the time of writing the response, since it was a week ago, but rereading what was written, I believe I was talking about how the SERP may have been manually rated. While most SERPs are ranked by the algorithms Google has developed, I've heard and read that a number of them are affected and rated manually by humans. If one of their manual raters reviewed your site, they may have deemed it less "relevant" for the search.
Have you ever seen the "Give us feedback" link at the bottom of the SERPs? Let's say somebody decides your website and the other two competitors are not what they were looking for when it came to the search "nlp" or "nlp training". Well, they could complain, the complaint could be reviewed by the manual raters (or whoever responds to the complaints), and they could drop you. Since this was before the most recent Panda change, I was speculating that this could have been a cause.
-
It might be true, but when the drops occur, or when the SERP is manually rated and its makeup changed, it could be because whatever's triggering it was finally re-evaluated at the time you dropped.<<
Could you explain this, SeattleOrganicSEO? That might be what happened. It looks like there was an algorithm change that affected us and at least two other strong competitors and shifted us all down.
-
It might be true, but when the drops occur, or when the SERP is manually rated and its makeup changed, it could be because whatever's triggering it was finally re-evaluated at the time you dropped.
However, I don't know if I know all the different pieces you do. Even with the above description of the issues, I think there's potentially a lot more going on that, as "outsiders", we can't help with as much. Even when we know everything, we still might be clueless. Sorry, but I haven't had this problem with a client before. I know it will sound cocky, but we've only had the opposite problem (well, not really a problem): rankings going up. I call it a problem because sometimes a ranking improvement doesn't translate into traffic (or qualified traffic, for that matter). Sorry, going off on a tangent...
-
SeattleOrganicSEO,
That's worth looking at, but I'm pretty sure it's not only competition. We tumbled from 19th to 42nd in just a few days for the term "nlp". We'd been on the second page for many years.
-
I don't see it being the larger problem.
Have you considered that your competitors have stepped up their SEO efforts? Have you been paying attention to their backlinks to see if they've been doing a bit of link building on the keywords you're targeting? It's a lot of work, but if you know the two specific SERPs you're targeting, perhaps you can pay attention to what they're doing. Some of the SEO software out there makes it a bit easier to keep track of...
-
I also just realized that we have articles on our website that are elsewhere on the web. Always with permission, but could this be a problem?
-
If this occurred around Nov/Dec, then it might not be the Panda changes. I just thought, since you posted recently, that the recent Panda change (3.2) could have been a possibility.
-
In that article, SeattleOrganicSEO, one of the comments is:
"Surviving Panda 3.2 - I will target the right keyword and provide superb content."
This drop in rank was occurring around November or December (Panda 3.1?) when I was trying to target several keywords per page and then later adding content to match.
I thought Panda was for scraping and duplicate-content problems; do I need to worry about the appropriateness of keywords? Do I need to target only keywords that the page is very obviously already optimized for? If it's not code errors, could this be why we've had a ranking drop?
-
I'm also a big believer in clean code, and crawlability in general.
But I used the Bing SEO Toolkit, which sees the site just how Bing sees it, and I only found one invalid-code error and one page with too much CSS. I think the W3C validator picks up a lot of issues that are a bit picky.
But I also believe one open tag can mean huge amounts of content are not read as visible content.
This is even more concerning now that we have microdata; one error can mean your whole schema is useless.
I don't like to have any CSS or JS in my HTML; I like to look at my source code and be able to read my content easily.
This is one of the reasons I don't like CMSs.
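The "one open tag" point can be illustrated with a rough balance check. This is only a sketch (a stack-based matcher with void elements exempted; it ignores the many cases where browsers recover gracefully) and no substitute for a real validator:

```python
# Rough HTML tag-balance check: push open tags, pop on matching closers.
from html.parser import HTMLParser

# Void elements never take a closing tag, so don't expect one.
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "source", "track", "wbr"}

class TagBalance(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []      # currently open tags
        self.problems = []   # closers that didn't match the open tag

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.problems.append(f"unexpected </{tag}>")

    def unclosed(self):
        return list(self.stack)

checker = TagBalance()
# One unclosed <span> in otherwise-fine markup:
checker.feed("<html><body><div><p>hello<span>world</p></body></html>")
print(checker.unclosed())   # tags left open
print(checker.problems)     # mismatched closers
```

Notice how the single unclosed `<span>` makes every later closing tag look wrong, which is exactly why one stray tag can snowball into dozens of validator errors and swallow chunks of content (and schema markup) along the way.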
-
When did it happen? Any chance it happened around the 18th?
http://searchengineland.com/google-panda-3-2-update-confirmed-109321
-
Those errors are just for the homepage, though there may be far fewer in reality (once a tag is left open, it tends to really confuse the validator). I'd clean up the whole site for good measure; I'm a big fan of SEO PowerSuite's on-page tools when doing this sort of thing.
The line breaks don't all need to be totally replaced; the big gaps at the top just seemed a bit excessive. That particular recommendation is based on my own superstitions, and those of others, but the reasoning is this: the first-third rule comes into play so much in SEO (weighting content placed high on a page, early in a tag, etc.) that condensing the header section to a saner level seems sensible. Some SEO auditors, such as WebCEO, will also yell at you if your TITLE tag doesn't immediately follow HEAD, presumably for a similar reason; although again, that's not as scientific a claim, to my knowledge, as valid code (which absolutely matters).
-
It's been a while since I did code validation, remind me - is that 79 errors just for the home page?
And will the line breaks confuse crawlers?
And remind me: what's the cleanest thing to replace the line breaks with?
-
Not necessarily your one path to salvation (and keep your money on this if it does help gain some ground), but I'd personally start with cleaning up the source:
http://validator.w3.org/check?uri=http%3A%2F%2Fnlpca.com%2F&charset=(detect+automatically)&doctype=Inline&group=0&user-agent=W3C_Validator%2F1.2
79 validation errors could definitely confuse crawlers about how things are organized, and imply usability issues. I'd also do something about the extreme number of unnecessary line breaks. I recently pushed a legal niche site up from page 5 to page 1 on a very competitive, short-tail phrase with not much more than cleaning up ugly code.
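If you want to track that error count as you clean up, the W3C checker can be queried programmatically. A hedged sketch using the Nu HTML Checker's JSON interface (the endpoint and parameters below are my assumptions about the public service; check its documentation before relying on it, and mind its rate limits):

```python
# Query the W3C Nu HTML Checker for a machine-readable error list.
import json
import urllib.parse
import urllib.request

API_BASE = "https://validator.w3.org/nu/?"

def validator_api_url(page_url):
    """Build the checker request URL for a page, asking for JSON output."""
    return API_BASE + urllib.parse.urlencode({"doc": page_url, "out": "json"})

def validate_url(page_url):
    """Fetch the checker's messages and keep only hard errors."""
    req = urllib.request.Request(
        validator_api_url(page_url),
        headers={"User-Agent": "error-audit-sketch"})
    with urllib.request.urlopen(req) as resp:
        messages = json.load(resp)["messages"]
    return [m for m in messages if m.get("type") == "error"]

# Network call, so commented out here:
# errors = validate_url("http://nlpca.com/")
# print(len(errors))  # the count the thread is discussing
```

Re-running this after each fix gives you a concrete number to drive to zero, instead of re-pasting the URL into the web form.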
-
One thing I noticed is your linking structure. This has, I assume, been like this all along, so it wouldn't be the reason for the drop. But your menu is on every page (I'm assuming), meaning that all pages are linked from all pages. This pattern leads to all pages sharing the rank, but what you want is for your landing pages to have most of the PageRank.
You should link to as many pages as you can from the home page, but only link to the home page and landing pages from every other page (where possible, of course). This will shift the PR to those pages. See the link below for a simple explanation.
http://thatsit.com.au/seo/tutorials/a-simple-explanation-of-pagerank
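The PR-flow argument above can be demonstrated with a toy power-iteration sketch. Five hypothetical pages and the classic 0.85 damping factor; this is the textbook PageRank model, not whatever Google actually computes today:

```python
# Compare PageRank under a sitewide menu (every page links to every page)
# vs. a "funnel" (other pages link only to home + landing pages).
def pagerank(links, pages, damping=0.85, iters=100):
    """Power iteration over a {page: [outlinks]} graph."""
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        nxt = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            share = pr[page] / len(outs)
            for target in outs:
                nxt[target] += damping * share
        pr = nxt
    return pr

pages = ["home", "landing1", "landing2", "blog1", "blog2"]

# Sitewide menu: every page links to every other page.
mesh = {p: [q for q in pages if q != p] for p in pages}

# Funnel: home links to everything; other pages link only to home + landings.
funnel = {"home": ["landing1", "landing2", "blog1", "blog2"]}
for p in pages[1:]:
    funnel[p] = [q for q in ["home", "landing1", "landing2"] if q != p]

pr_mesh = pagerank(mesh, pages)
pr_funnel = pagerank(funnel, pages)
print({p: round(pr_mesh[p], 3) for p in pages})
print({p: round(pr_funnel[p], 3) for p in pages})
```

In the mesh every page ends up with an identical 0.2 share; in the funnel the home and landing pages pull well ahead of the blog pages, which is the effect the advice above is after.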