50% of my keywords just had a significant drop for no reason and with no changes?
-
I just looked at a report for one of my sites in Moz Pro, and 35 of 75 keywords just dropped on Google. I didn't make any changes recently.
Ranking Changes:
- 12 improved
- 35 declined
Many dropped 5 to 22 places!
Any ideas on what's up? Thankfully it didn't affect traffic (it's pretty low anyway, and it has stayed the same), but I still dropped to page 5 for lots of terms. Strange. Did Google just update or something?
Thanks for any ideas.
-
Hello Willem,
Yes, in my opinion you should submit the languages. I'm not sure, but I think the main site on .com should be set as the basic NL site.
Each sub-site (language version) needs a sitemap in its own language. I have no idea how your sitemaps are created; if it's automatic (a plugin or some other way) it may be a bit complicated, but it's better to have a sitemap than not, and it's worth your effort.
Use rel="alternate" link tags in the <head> section of each site.
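For a three-language setup like yours, that would look roughly like this. This is a hypothetical sketch using the subfolder URLs you mentioned; adjust the paths and hreflang codes to match your actual structure:

```html
<!-- Hypothetical sketch: goes in the <head> of every language version.
     Each page lists all language versions of itself, including its own. -->
<link rel="alternate" hreflang="nl" href="http://www.hometextileshop.com/NL/" />
<link rel="alternate" hreflang="de" href="http://www.hometextileshop.com/DE/" />
<link rel="alternate" hreflang="en" href="http://www.hometextileshop.com/EN/" />
```

The annotations must be reciprocal: the /NL/, /DE/ and /EN/ pages all need to carry the same set of tags, otherwise crawlers may ignore them.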
And ask other people; maybe there is something new. There's a lot of material on the GWT blog and forum.
Marek
-
I have 3 different subdirectories on the main domain, /NL, /DE and /EN, one for each language.
So I should add a GWT 'site' for each language, set the geographic target to the right region, and delete the hometextileshop.com GWT 'site' (the main domain covering all 3 languages)?
That would mean removing the sitemap too, since it contains all 3 languages.
As for the sitemap: would it be better in my case to remove the sitemap altogether and let the bots just crawl and index the site?
Or should I have the sitemap module reprogrammed to use the "alternate" tag?
And how would the sitemap work if I had 3 GWT sites (one for each language)? A sitemap per language, with the target language and the other two as "alternate"? Or just one sitemap for all 3 GWT sites, with one main language and the other languages as "alternate"?
-
Hi Willem,
I'm not a PrestaShop specialist, but I know that on one licence you can do something like shop-en, shop-cs; this comes from the Polish PrestaShop forum, forum.prestashop.pl.
My "use rel="alternate" for language versions" refers to the situation where you have language versions and want to prevent duplicate content.
As for your situation and your rankings, in my opinion it is better to have 3 different subdomains or subfolders for the different languages.
As I understand it, your site uses a language switcher, and the versions live in subfolders of the main domain. For GWT it doesn't matter which method you use, subdomain or folder.
The key thing is to submit the 3 language versions (subfolders) to GWT as separate sites and verify each one, either with a different GWT meta tag (if that's your verification method), with a code file, or with three different GA accounts.
So: 3 language versions -> 3 GWT "sites" -> 3 GWT verifications.
And last but not least, you should still use the rel="alternate" tag to assure crawl bots that each version is a distinct site version, not duplicate content. Remember that not only text can be duplicate content; images and other assets can be too.
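If you keep a single sitemap, the language versions can also be cross-referenced inside it. A hypothetical sketch, assuming your /NL, /DE and /EN subfolders; URLs are illustrative:

```xml
<!-- Hypothetical sitemap sketch: one <url> entry per language version,
     each cross-referencing the other versions via xhtml:link annotations. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.hometextileshop.com/NL/</loc>
    <xhtml:link rel="alternate" hreflang="de"
                href="http://www.hometextileshop.com/DE/" />
    <xhtml:link rel="alternate" hreflang="en"
                href="http://www.hometextileshop.com/EN/" />
  </url>
  <!-- repeat equivalent entries for the /DE/ and /EN/ pages -->
</urlset>
```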
It should work...
Cheers,
Marek
-
Hi Marek,
Your comment on rel="alternate" got me thinking.
I run a PrestaShop in 3 languages and use a sitemap module, and it doesn't use the alternate tag (www.hometextileshop.com/sitemap.xml).
I've always had trouble ranking in more than one language. The shop's default language seems to affect the language priority, and since I use a multilingual sitemap I see English pages ranking for Dutch keywords (even though the default language is Dutch); the Dutch keyword isn't even on those pages.
Somehow the multi-language part of the site causes trouble, and search engines seem to struggle with it, even though each page is different in each language (one URL per language, its own title and description, and translated content).
Any suggestions?
-
Hi Syndicate,
As you wrote, you changed nothing. It's possible that one keyword still has good support from content, links, etc., but the others don't.
Your competition probably made changes. A few tips:
- analyze bounce rate in correlation with time spent on page
- pages with a high bounce rate, short time on page, and low traffic are better set to noindex/nofollow; they decrease site quality
- implement microformats; rel="author" has a big impact, and "author" links should point to the author's site, not the publisher's site
- social media is the key to modern SEO positioning
- Google +1 is not a social network as such, but it helps a lot
- make your site's content responsive (mobile); more than 20% of internet traffic is on mobile devices, so CSS3 and HTML5 are a must
- use rel="alternate" for language versions
- use Google Map Maker
- and many, many more; I advise you to read the conference materials mentioned
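A few of the tips above boil down to one-line snippets. A hedged sketch, with placeholder names and URLs:

```html
<!-- Hypothetical snippets illustrating three of the tips above. -->

<!-- noindex/nofollow a weak page: goes in that page's <head> -->
<meta name="robots" content="noindex, nofollow">

<!-- rel="author": the link should point to the author's own page,
     not the publisher's site (URL and name are placeholders) -->
<a href="http://example.com/jane-author" rel="author">Jane Author</a>

<!-- a minimal responsive breakpoint (CSS3 media query) -->
<style>
  @media (max-width: 480px) {
    .sidebar { display: none; } /* hide secondary content on small screens */
  }
</style>
```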
You have to change your site; it's a must.
Just imagine starting in Formula 1 with a Ford Model T...
The distance between the Ford Model T and the McLaren MP4-27 (2012) is the same as between the internet of five years ago and the internet of today.
Marek
-
Anything is possible, but I have a site I haven't touched AT ALL for about 5 years now, and it still holds #1 for its main keywords. So I'm not sure, but it seems not all sites are punished for old content; that content is nearly 10 years old, yet it still ranks #1, and top 5, for almost all of its original keywords. It makes things really hard to figure out!
-
Hi Bron,
Maybe it's because you did nothing. Remember the words "change or die", said in the 4th century by Claudian (Claudius Claudianus) and, in recent times, by the Harvard marketing guru John Porter.
In 2011 Google made 525 changes to its algorithm, and in 2012 they are still making changes. Google's new trend is to promote sites that are friendly to users, not to robots. Backlinks still have an impact, but not as big as in the past. We are now stepping into a new world: semantic search. For Google, a good article is now 1,000 to 2,000 words, of good quality, published at least once a week on your site/blog. There are also lots of other changes and news; you should take a look at the SMX conference materials.
...so maybe you do have to change something: do a site revision, then make the changes...
Take care,
Marek
-
Yes, something happened; I'm not sure what. I have 3 competitors in my report.
Most of their keywords dropped by only a normal amount, 1 spot, not 25. None had more than 1 or 2 spots of change, while the vast majority of mine went off the page.
I didn't do any link building; all I've done in the past month or so is some minor content changes to the homepage only. Really strange. Maybe some offsite links changed or something; I can't figure out what happened.
-
I noticed the same thing on my websites (all of them).
What stood out was that the keywords with more competition, and with more link building over the past year, dropped more than keywords with no link building at all.
Around me, I've heard more people say that sales in their shops have dropped since the middle of this week.
The key question is what to do about this. How do we get rankings back up?
If there was a Google update, it did not seem to affect my competition. The people mentioned above all have websites around the same size as mine; maybe that has something to do with it (a guess).
Please ask any question that arises; I'll be glad to answer if that leads to a solution.