No longer to be found for "certain" keywords.
-
I'd like to see if anyone could shed some light on this rather strange scenario:
Yesterday I noticed that we can no longer be found for 'certain' keywords for which we had page 2-3 rankings. Yet for other keywords we still appear on pages 2-3. These keywords are very competitive, and our rankings had steadily improved over the course of 5-6 months.
My question is: what could have contributed to the fact that we can no longer be found for only some keywords? And can Google remove you from its SERPs for certain keywords 'only'? Thank you,
Maximilian. -
Thank you once again. I will get in touch with you; please PM me your contact details. We are located in downtown Manhattan.
-
Strategies for dealing with external duplicate content are overviewed here.
A much bigger concern is duplicating your OWN content. I know it sounds kind of silly, but it is a very real issue, especially since Google's most recent update (Panda). Is there more than one URL for any page on your website? Does http://website.com redirect to http://www.website.com? Do you have canonicalization problems related to pagination or something similar?
These issues are discussed in-depth here.
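To make the duplicate-URL point concrete, here is a minimal sketch (not Google's or any SEO tool's actual logic) of collapsing common URL variants onto one canonical form so you can spot pages reachable at more than one address. The preferred hostname `www.website.com` is an assumption for illustration.

```python
from urllib.parse import urlsplit, urlunsplit

PREFERRED_HOST = "www.website.com"  # hypothetical preferred hostname

def canonical_url(url: str) -> str:
    """Collapse common duplicate-URL variants onto one canonical form."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host == PREFERRED_HOST.removeprefix("www."):
        host = PREFERRED_HOST  # force the www variant (non-www duplicates)
    path = parts.path.rstrip("/") or "/"  # trailing-slash duplicates
    # Drop query strings and fragments, which often spawn extra URLs
    # for the same content (tracking parameters, pagination, etc.).
    return urlunsplit((parts.scheme or "http", host, path, "", ""))
```

If two crawled URLs normalize to the same string under rules like these, they are candidates for a 301 redirect or a rel="canonical" tag.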
If you think that your issues may be related to on-site or site architecture factors, the best thing you can do is hire a qualified SEO consultant who can assess these issues and make actionable recommendations for correcting them.
-
Anthony, our website has been online since 2002, and we have been earning quality organic backlinks for a very long time. Directory submission was added literally four weeks ago in an effort to 'add more value'. It was never intended to be our only source of links, just an effort to improve rankings. Who knew it could potentially backfire?
On another note, you mentioned that duplicate content may cause this issue. Our content is very rich and well-written, and a number of sites have copied and pasted it. Despite all of our efforts, including contacting their hosting companies to have their websites shut down, nothing has changed. My question is: are we affected if "others" have copied our content onto their websites? Does Google have an algorithm to determine which site copied which site's content?
Thank you once again for your insightful information.
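Google does not publish how it attributes original authorship, but near-duplicate detection is commonly described in terms of word "shingles" and Jaccard similarity. The following is a toy sketch of that idea, not Google's implementation:

```python
def shingles(text: str, w: int = 4) -> set:
    """Split text into its set of overlapping word w-grams ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a: str, b: str, w: int = 4) -> float:
    """Jaccard similarity of two texts' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a, w), shingles(b, w)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)
```

A high score flags two pages as near-duplicates; which page is treated as the original would then depend on other signals (such as which URL was crawled first), which this sketch does not model.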
-
Maximilian -
If Google has devalued some of your links, there's no way to "recover" these links. That's the bad news.
The good news is that this happens to websites every day, and it's not by any means a permanent penalization or anything of that sort. All it means is that the time and effort spent to build these links was squandered.
What can you do to speed up your recovery? Switch the way you think about link building. Directory submissions are useful, but they shouldn't by any means comprise the majority of your link building strategy.
I would suggest reading this article cover to cover. It will teach you the proper mindset for approaching link building, and it will give you dozens and dozens of ideas to get started with building high-quality links from relevant, authoritative domains.
Replace directory submissions with strategies like writing guest blog posts, creating amazing content (linkbait) and subsequently promoting it on social media and social bookmarking websites, and/or creating a widget that would be an invaluable addition to any website in your industry. All of these are covered under the "Content-Based Link Building Strategies" section of the SEOmoz Professional Guide to Linkbuilding that is linked to above.
-
"You have been building low quality links and they have been devalued by Google, causing you to lose hundreds (thousands?) of keyword-rich backlinks."
Anthony, thank you for your response. The above statement could well be the source of the issue, considering one of our staff members has been doing directory submissions (30 per day).
Now, you mentioned that Google may have devalued the high-quality backlinks we had because we gained low-quality backlinks. In your experience, what can be done to resolve this issue? Is this devaluation permanent or temporary? What can be done to speed up the recovery?
Thank you in advance.
-
There are tons of reasons why your website might see a drop in rankings for certain keywords. Here are a few:
-
Increased competition in your niche pushed your website down as new websites started to rank above you.
-
You have been building low quality links and they have been devalued by Google, causing you to lose hundreds (thousands?) of keyword-rich backlinks.
-
Your website has been affected by a Google algorithm update. For one reason or another (duplicate content? shallow or nonexistent content on some pages? too many ads / too big of a "footprint"?), your website is being seen as less trustworthy, and this has affected rankings sitewide. (Note, some rankings would be maintained, particularly for keywords with low competition.)
-
You were logged into your Google account when you saw these rankings, and your website was appearing higher than it typically does because of your personalized search data. When you logged out and checked your rankings, you noticed a "drop" in ranking that was not actually real. (Okay, this one is unlikely, but we've all seen it before.)