Homepage alone dropped for one "keyword"
-
Hi Moz community,
Our website has dropped almost 50 positions for its main keyword, but it's okay for other keywords. Other pages are performing consistently for their keywords. We haven't made any changes to the website. What could be the reason for this strange scenario of only the homepage dropping for the main keyword? And could the recent unconfirmed algorithm update have anything to do with it?
Thanks
-
Google has left us in a world of confusion while making money. First, can anybody guess what counts as a good or a bad link according to Google? No. Maybe a very few SEO experts can, and only after using expensive tools. Google admits that they "try hard" to stop negative SEO, but cannot guarantee that no links will hurt you. So the algorithm is never going to be fully accurate, even with updates like Penguin (or Peacock). And it will be even less accurate for websites like ours, with thousands of backlinks, where hundreds or even thousands of new, unsolicited backlinks appear every month, making it hard to find the culprits.
A Wikipedia page is surely a strong signal. A Wikipedia page isn't feasible for every company; only those that hold some genuine reputation can get a page and a backlink. So even though it's technically nofollow, Google gives it some weight.
One of our sub-domains has been hit with backlinks from the same domain for the last few months, all adding up to 5k links, mostly from comments. Do we need to worry about this hurting our domain and website rankings?
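As a side note, one way to get a feel for how concentrated those 5k links are is to group a backlink export by referring domain. Here is a minimal sketch, assuming a CSV export (e.g. from a link tool) with the source URL in the first column; the file name and column layout are assumptions, so adjust them for your tool's format:

```python
import csv
from collections import Counter
from urllib.parse import urlparse

def domains_by_link_count(csv_path):
    """Count backlinks per referring domain from a CSV export.

    Assumes the source URL is in the first column of each row;
    adjust the index for your tool's export format.
    """
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.reader(f):
            if row:
                host = urlparse(row[0]).netloc.lower()
                if host:
                    counts[host] += 1
    return counts.most_common()

# Example usage (hypothetical file):
# domains_by_link_count("backlinks.csv")
```

A domain that accounts for thousands of comment links will show up at the top of the list immediately, which makes it easier to decide whether it belongs in a disavow file.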
-
It's not that Google will penalize you for using the disavow tool, but rather, that if you disavow a good link you'll potentially end up doing harm.
I don't think Matt's statements were confusing. He said that if you reavow a link it may not be given the weight that it once had. This is a measure Google takes to make it harder for people to experiment with the disavow tool. I wrote a lot more about this here:
https://searchenginewatch.com/sew/how-to/2409081/can-you-reavow-links-you-have-peviously-disavowed
Losing a link from a Wikipedia page should not cause a drop, IMO. Links from Wikipedia are nofollowed and do not pass PageRank. Now, in some cases you can get followed links from sites that scrape Wikipedia, but I would be surprised if losing those links hurt you.
In regards to Google ignoring bad backlinks, that's what they say they do now. Still, if I see a site that has a lot of self made SEO links then I'll disavow just to be sure. Also, there are other algorithms that use links and there is the potential for manual actions, so it's not like we can completely ignore unnatural links.
-
Hi Marie...Thanks for sharing your views and guidance.
I agree that disavowing some useful links might push us down in the rankings. But I have only disavowed links with a high Moz spam score, after checking them manually. These links don't look useful, and I don't find any value in them. I don't think removing 10 links from a backlink profile with thousands of backlinks will affect us.
I disagree that Google will penalise a site just for using the disavow tool. Matt Cutts's statement is a deliberately confusing one, meant to keep users from misusing the tool and putting a processing load on Google. As far as I know, disavow is a completely automated tool with no human intervention, and Google will not punish you simply for using it. And if Google is against experimenting, it would first have to be able to judge whether a request is an experiment or a genuine attempt, which is next to impossible; it would be like trying to read users' minds.
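For anyone following along who hasn't used the tool: the disavow file itself is just a plain-text list uploaded through Search Console, with one domain or URL per line and `#` for comments. A minimal sketch (the domains and URLs here are made up):

```text
# Spammy comment links found March 2017
domain:spammy-blog.example.com
domain:link-farm.example.net
# A single URL rather than a whole domain
http://forum.example.org/thread/123#comment-456
```

A `domain:` line covers every link from that domain, which is usually safer than chasing individual URLs when a site is linking thousands of times.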
Something interesting happened in our case: we actually lost the backlink from the Wikipedia page. So we are presuming this might be the obvious cause of the drop. Do you think so?
And do you believe that Google completely ignores bad backlinks, and that only good backlinks are a ranking factor (besides on-site factors)?
-
The potential harm in using the disavow tool is that you could be disavowing links that are actually helping you. If a link is truly an unnatural link, then yes, it should be disavowed. But if you are disavowing, then re-avowing, then trying different links to disavow, that could be dangerous.
Matt Cutts a few years ago said that Google had built in some features to the disavow tool to prevent people from trying to experiment with it. He hinted that a reavowed link may not carry the same power that it once did. Also, Cyrus Shepard from Moz did an experiment where he disavowed every link to his site and rankings plummeted. He later removed his entire disavow file and his rankings did not recover at all.
Regarding discounting links vs penalizing for links, Gary Illyes from Google made statements saying that the new Penguin algorithm no longer penalizes sites. With that said, if you have a lot of unnatural links I still recommend disavowing as you could get a manual penalty.
Also, there are other algorithms that use links, so yes, I still do disavow. My reasoning for advising that you don't disavow is that it sounds like you are experimenting with the tool, disavowing and re-avowing. Again, if a link needs to be disavowed, then disavow it and leave it at that.
-
Thanks for your thoughts Marie.
I don't understand what's wrong with using the disavow tool on any number of links, or how it could harm us. It's automation: link juice simply stops passing from the links we disavow, and nothing more. Moreover, if the disavow tool caused our drop, why didn't we recover even weeks after removing it? Also, we only kept the disavow file in place for a few days.
I also don't agree that Google is just going to ignore unnatural links and consider only good links in its algorithm. After all, even humans these days can't reach a conclusion on some links, so Google doing a smarter job here is impossible, and it's never going to be fully accurate. Some links will definitely drag down a backlink profile, and that's how most penalties have been removed by SEO experts these days, using the disavow tool.
Page title: in my niche, I can see that most of the top-ranking pages start with "brand + keyword", like the "vertigo tiles" example I mentioned. I see this mostly on homepages, which might be contributing to better rankings, as this phrase has been mentioned more times across the internet. I mean, if "vertigo tiles" has more visibility, starting the homepage title with the same phrase should boost rankings.
Thanks
-
It's tough to comment without seeing the actual page, but here are my thoughts.
You should not try to experiment with the disavow tool. If you've got links that you yourself made for SEO purposes and they serve no other purpose then yes, disavow them. But, if you're not sure, it's best to just leave the disavow tool alone as it's possible to do more harm than good. Google's new version of Penguin just ignores unnatural links and doesn't penalize sites for having them. If you have lots of spammy links I still advise disavowing, but disavowing just a few links is not a good idea.
Regarding page titles, it's generally best practice to have your most important keywords at the beginning of the title tag, so with that in mind the old title tags look better to me.
-
We dropped on Jan 20th and dropped further in the last week of January. We hadn't changed anything around that time. We disavowed a few links and then removed them from the file in case they were what dropped us, but removing the disavow didn't help either. I think it's more about on-page factors. I can see that page titles with an exact match are performing more effectively in the top results. Below is my hypothesis from monitoring the new results and comparing them:
Let's say, for the keyword "tiles", here is how the old and new page title styles are faring at the top:
Old page titles: "tiles for kitchen, hall and bedroom - vertigo tiles"
New page titles: "vertigo tiles - tiles for kitchen, hall and bedroom"
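If you want to check this pattern across many pages rather than eyeballing it, where the keyword sits in each title (and whether the title runs past typical display length) is easy to script. A minimal sketch using the two example titles above; the ~60-character cutoff is a common rule of thumb for how much of a title Google displays, not an official limit:

```python
def audit_title(title, keyword, max_length=60):
    """Report where a keyword appears in a title tag and flag long titles."""
    pos = title.lower().find(keyword.lower())
    return {
        "contains_keyword": pos != -1,
        "keyword_at_start": pos == 0,
        "too_long": len(title) > max_length,
    }

old = audit_title("tiles for kitchen, hall and bedroom - vertigo tiles", "vertigo tiles")
new = audit_title("vertigo tiles - tiles for kitchen, hall and bedroom", "vertigo tiles")
# old['keyword_at_start'] is False; new['keyword_at_start'] is True
```

Running this over a crawl export would show how consistently the top-ranking pages lead with the brand + keyword phrase, which is the pattern being hypothesized here.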
Please share your thoughts.
Thanks
-
When did the drop happen? If it was within the last few days, I'd say not to change anything and just wait. I've had a number of clients recently that have noticed a huge drop and then within a week, they popped back up higher than they were before. I personally think this could be a part of how Google tests a site's worth. I think they may remove a page from the first page temporarily to see if it affects where people click.
One other thing to check is keyword stuffing. That can sometimes cause a page to drop for one keyword. But again, I wouldn't change anything just yet.
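One rough way to sanity-check for stuffing is to look at how often the keyword appears relative to the total word count of the page copy. A minimal sketch; there is no official density threshold, so any cutoff you apply is a judgment call:

```python
import re

def keyword_density(text, keyword):
    """Return the fraction of the text's words taken up by occurrences of keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw_words = keyword.lower().split()
    n = len(kw_words)
    # Count non-overlapping-agnostic matches of the keyword's word sequence.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw_words)
    return hits * n / len(words)

# e.g. keyword_density("tiles tiles tiles for the bathroom", "tiles") -> 0.5
```

If one keyword accounts for a noticeably larger share of the copy than on competing pages, that is the kind of repetition worth toning down, but as noted above, don't change anything during the first days after a drop.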
-
A few days ago I faced the same problem.
But I recovered by optimizing my on-page SEO: I used more keywords in the title and extended it to more than 70 characters.
After that, I added some internal links to the homepage and linked 1-2 outbound links to high-authority websites.
That's it. I got my website rankings back.
I'm sure this will help you get your rankings back soon.