Dropped in Google rankings - tips on discovering why, please
-
Hi guys
Originally my website ranked poorly on Google. After signing up for Moz and following their tips, I reached the 4th position for one of my keywords (amazing!).
But a few days ago my page dropped below the top 50 results for that same keyword, even though I hadn't made any changes to it.
Does anybody have tips on how I can discover and repair what happened?
Thank you all in advance.
Best regards
Paulo
-
I'm not sure if I've found the root cause of my problem, so let me ask whether any of you have seen this before:
After changing my website to apply the Moz improvement suggestions, I logged in to Google Webmaster Tools, ran the "Fetch as Google" option, and submitted the page to the index.
But 5 or 10 minutes later I discovered a small problem on my page and, after fixing it, ran "Fetch as Google" again and resubmitted. Could Google have interpreted these two submissions as black-hat SEO and punished my website?
By the way, according to Moz Rank Tracker, I'm the 5th result for the keyword I want.
Thank you all in advance.
Best regards
-
Hi Martijn
thanks for your reply!
My website is http://www.vetwork.com.br. The keyword I want to rank for is "sistema veterinário".
No, according to Moz the keyword is not competitive, and there isn't much news around it either.
Basically, the other sites kept the same position order.
I'm not working with any other keywords for now.
What do you mean by "manual actions against my site"?
Best regards
-
Hi Andriy
thanks for your reply!
I added the keyword to the title and body, and marked some occurrences with bold and H1/H2 tags.
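Roughly, the placement looks like this (a simplified sketch, not my actual page - the heading and paragraph wording here is just illustrative):

```html
<head>
  <!-- keyword in the page title -->
  <title>Sistema Veterinário | Vetwork</title>
</head>
<body>
  <!-- keyword in the main heading -->
  <h1>Sistema Veterinário</h1>
  <!-- keyword in a subheading -->
  <h2>Por que usar um sistema veterinário?</h2>
  <!-- keyword bolded in the body copy -->
  <p>Um <strong>sistema veterinário</strong> ajuda a gerenciar sua clínica.</p>
</body>
```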
I can't say how long it stayed in 4th place before the drop... maybe one or two days.
What do you mean by "backlink profile"?
Best regards
-
Hi Paulo,
We need some more information for this one. Could you share the URL of the page you would like the keyword to rank for? Is the keyword competitive? Is there much news around the keyword? Did any other sites jump up since your rankings dropped? Have other keywords also suffered? Do you have any manual actions against your site according to Google Webmaster Tools?
-
Paulo, be more concrete. What exact changes did you make? How much time passed after the changes? Did you change the backlink profile?