Showing strong for the last 2 years for search terms, NOW GONE! What happened?
-
Hi all! I have a 9-1-1: my website www.popscrap.com has been showing strong (top 2) for about 2 years now for many of the search terms we are targeting (scrap software; scrap metal software; recycling software; etc.), and I just noticed today that we are nowhere. What do you suggest for troubleshooting this to find the cause and the fix?
Thanks!
-
Well, I removed the suspect content, and after 2 weeks, nothing. Then I added Google Authorship to each page, and the NEXT DAY the site is back in the top positions for our target terms, and the leads are pouring in. Was it the Google Authorship? It certainly felt like it. But I thought that was not a ranking factor.
Anyway, thanks for all the support! BB
-
On a quick look, my gut instinct is that this is OK. However, on a site: search I'm seeing that you have over 19,000 pages indexed in Google. That's a bit of a Panda flag for me, as most likely there are not 19,000 unique pages that add value on your site.
-
Thanks for the response, Marie
I asked the question as I was wondering whether I'd need to add "boilerplate" text to each description to fill it out. I'd rather not, as a) it's not very scalable and b) I'm not sure it would add value to our users per se, as in the main people want to see pictures. Here's an example of one of the shorter descriptions we run.
-Is the content the same MLS description that is on multiple sites? If so, then I'd noindex it.
Of the 4,500 pages, 95-98% are content that's unique to our site (the other ~2-5% are managed by individual realtors who, I'm guessing, probably copy and paste descriptions from their own sites; we're not in the US, so we aren't part of the MLS network).
-Do users engage with your content? Mos' def.
-
It's hard to say what Google views as thin. Here are some factors I would consider when making that decision:
-Is the content the same MLS description that is on multiple sites? If so, then I'd noindex it.
-Do users engage with your content? Short content can be useful. If Google sees that people are actually engaging with your site then they will have no problem with thin content.
It sounds to me like these pages are probably ok. But I can't say for certain.
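If you want to triage which of the 4,500 pages sit at the thin end before deciding, a plain word count across the descriptions is a reasonable first pass. A minimal sketch; the 50-word cutoff and the `pages` dict are illustrative assumptions, not thresholds Google publishes:

```python
import re

def flag_thin_pages(pages, min_words=50):
    """Return IDs of pages whose description falls under a word-count threshold.

    `pages` maps a page identifier to its description text. The default
    50-word cutoff is an assumption for triage, not a published Google number.
    """
    thin = []
    for page_id, text in pages.items():
        word_count = len(re.findall(r"[A-Za-z0-9']+", text))
        if word_count < min_words:
            thin.append(page_id)
    return thin
```

Pages it flags aren't necessarily in trouble (photos and engagement count for something), but it tells you where to look first.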
-
"Thin content" question:
I run a real estate website and carry about 4,500 property pages (each page consisting of between 5 and 13 photos and about 50-300 words of property description). Might the pages of ~50 words run the risk of being deemed "thin content" even though they have photos on them?
I also have around 200-250 article pages that are far more text-heavy.
FWIW, I don't think I've been hit by Panda 4.0. (I've slid from about #8 to #12 over the past 2 weeks, but I suspect that's more to do with sluggish content marketing/link building.)
-
If unpublishing causes the pages to either be removed from your site or noindexed then yes, that's the same thing.
-
Thank you! But what about unpublishing? Is that the same thing as removing, in the eyes of Google? I want to remove ALL pages under the "Scrap Laws" menu, because I think that is where the issue is. But I don't want to delete them totally and have to recreate them all later. Thanks again!
-
While you can test this over time, it would be difficult because you will never know if you've done enough to satisfy Panda. And really, you don't even know for sure if Panda is the culprit. (I think it is, but no one can say for sure.)
So, let's say you took out some of the low quality content and a month later nothing has changed. That could mean that you didn't take out enough to make the Panda algorithm see your site as high quality. But, it could be that you just need more time. While some sites recover within one Panda refresh (and that usually happens approximately monthly), others seem to need several refreshes.
In regards to unpublishing vs deleting the content, you can either delete the pages or you can use a noindex tag to tell Google not to include the pages in the index. Having low quality pages on your site that are noindexed will not hurt you in the eyes of Panda.
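For reference, the noindex Marie describes is a robots meta tag in each page's head (`<meta name="robots" content="noindex,follow">`). Here's a small stdlib sketch to confirm the tag actually made it onto a page once you've fetched the HTML; this is an illustrative check, not a crawler:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content values of any <meta name="robots"> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "robots":
            self.directives.append((attrs.get("content") or "").lower())

def is_noindexed(html):
    """True if a robots meta tag on the page asks engines not to index it."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in directive for directive in parser.directives)
```

Note that Google also honors an `X-Robots-Tag: noindex` HTTP response header, which a page-level check like this won't see.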
-
Thanks Marie! I'm getting the feeling it's the content. Quick question: Could I just unpublish the content and then test over time, OR do I need to completely delete the questionable content from the site? Does Google see it if it is unpublished and still penalize?
-
There were two major algorithm updates last week - Panda and the Payday loans algorithm. Payday loans affects sites that had done really spammy link building and it is very unlikely that this affected you. But, Panda is certainly possible.
I haven't had a good look at the site, but I see that you have 263 pages indexed in Google. Are all of these pages high quality pages that Google would be proud to show to searchers? If you've got duplication amongst the pages or if you've got "unhelpful" pages that are indexed then you need to remove or noindex them. On a quick look here are some examples of pages that should be removed or noindexed:
http://www.popscrap.com/component/content/category/11-demo-articles
http://www.popscrap.com/component/users/?view=remind
http://www.popscrap.com/24-products/120-scrapshield - It looks like a good amount of the text on this page is on multiple pages of your site.
Of course, there could be other issues. If you've made any changes to the site recently then I'd look at those changes first, but otherwise I'd go on a thorough cleanup so that only the pages that are the best are shown to Google.
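On the duplicated-text point (the scrapshield page), one cheap way to quantify how much two pages share is the overlap of their word shingles. A rough sketch of the idea; this is not how Panda scores anything, just a triage metric:

```python
def shingle_similarity(text_a, text_b, n=5):
    """Jaccard similarity of two texts' n-word shingles (0 = distinct, 1 = identical)."""
    def shingles(text):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(max(len(words) - n + 1, 1))}
    a, b = shingles(text_a), shingles(text_b)
    return len(a & b) / len(a | b)
```

Running it over pairs of indexed pages and sorting by score surfaces the worst offenders to rewrite or noindex first.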
-
To help figure out what is causing the 404 errors, do the following in Webmaster Tools:
- Log in to your website's profile, then in the left-hand navigation go to Crawl > Crawl Errors > Not Found. Review the list of URLs there for clues (you can also click an individual link to see where the 404 page was linked from). Depending on how large your site is, if the 747 not-found URLs are a large percentage of your total page count, you could be experiencing a temporary rankings drop that will disappear once you fix your error pages. If you could add a link to a few of the 404 error pages, we could help you figure out what is wrong with your site code or server setup.
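Once the Not Found list is exported, grouping the URLs by leading path segment usually surfaces a pattern faster than reading 747 rows one by one. A hedged sketch (the example URLs below are made up for illustration):

```python
from collections import Counter
from urllib.parse import urlparse

def summarize_404s(urls, depth=1):
    """Count not-found URLs by their leading path segment(s).

    One big cluster under a single prefix usually means one broken
    template, menu, or rewrite rule rather than hundreds of separate problems.
    """
    buckets = Counter()
    for url in urls:
        segments = [s for s in urlparse(url).path.split("/") if s]
        buckets["/" + "/".join(segments[:depth])] += 1
    return buckets.most_common()
```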
-
Just my two cents friend..
4 days back, Google released Panda 4.0. You can check if that caused the drop.
Here is a tool that can help you find if any of the Google penalties are behind the drop:
http://www.barracuda-digital.co.uk/panguin-tool/
Once on the page, click the 'Log-in to Analytics' button, allow the tool to access your Google Analytics account, and check if the recent Panda update caused the drop. Hope this helps.
Good luck. By the way, thin content is of no use these days; you should be investing your time in producing quality content.
Best,
Devanur Rafi
-
I looked at some of your content, and some of it seems quite thin, such as the regulations for each state. There's really only a couple of sentences (in the instances that I saw) that deal with the individual state, and then there's a lot of boilerplate content, navigation, and other site elements that are the same from page to page. Just one more thing to think about.
-
It looks like Google penalized you. It happened to one of my websites in January; I was going nuts because I didn't see any message until 2 weeks later in my Google Webmaster Tools. I would recommend waiting a couple of days to see if anything shows up; if not, then check your links. If a couple of websites you are linking to got penalized, you can get in trouble too.
-
Kevin, any insight into where to start with respect to the 747 missing URLs? What causes that? How do I fix it? Thanks!!
-
haha! Ok! Thanks Kevin!
-
No, no. My bad. You mentioned above that you've been ranking strong for two years, and then when I peeked at your site I saw the RT template. I wrongly assumed the Joomla template was released at the same time as the Magento template (I actually use the exact same template for Magento at www.88k.com.tw, although heavily modified). I was just thinking that if you had done a site revamp with a new template, that might be a factor in your recent bump off the SERPs. Sorry to worry you about that. But it looks like you found an issue with the 404 errors. Good job.
-
Also, I just noticed this (see image). 747 missing URLs!?
-
What do you mean by "it's not 2 years old"? Is being under 2 years old a factor?
-
Thanks! Yes, it's Google. We actually are ranking better on Bing and Yahoo now!
Looked at Google Webmaster and it shows a steep drop on 5-21. (image attached)
-
A couple of things I'd do right away:
Look in Google Webmaster Tools to see if there are any notices there (I'm going to assume that it's Google where you are no longer ranking).
Look in your analytics to see if there was a particular day that you dropped off. You can then look to see if that coincided with any known algorithm update.
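If the analytics export gives you daily visit counts, finding the drop-off day is one line of arithmetic; you can then compare the date against published update dates (Panda 4.0 was announced around May 20, 2014, which lines up with the 5-21 drop reported above). A minimal sketch with made-up numbers:

```python
from datetime import date

def steepest_drop(daily_visits):
    """Return the date with the largest day-over-day decline in visits."""
    days = sorted(daily_visits)
    drops = {
        day: daily_visits[prev] - daily_visits[day]
        for prev, day in zip(days, days[1:])
    }
    return max(drops, key=drops.get)
```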
-
My bad. Looks like it is. It was released for Magento only late last year.
-
Always great to help out a fellow Rocketeer! Did you recently update your website? That template is not 2 years old, and this could certainly be a factor.
-
Thanks, Kevin. I haven't made any changes in months, and do not do any crazy linking schemes. Competitors seem to be at the same places on the page. We are the only one hit by this.
-
That's a tough one without more to go on. Google releases updates to its ranking algorithm every so often, and some sites get hit hard. If your content hasn't changed and you haven't engaged in any unusual activity in terms of link building or advertising, then I'd say wait it out. Give it a week or two, which is how long it's taken many other quality sites to bounce back from a Google update. It's unlikely you'll have issues here, but you still might want to check your Webmaster Tools to see if any manual actions have been applied.
This might be a good time to go over your site, again, for the first time;-) See what could be done to answer visitor questions and lead them to the right pages.