What's happening with Google UK?
-
Within the last week a handful of our rankings have dropped dramatically down the SERPs. About 15% of them, though this is an estimate and has not been fully investigated yet.
Whilst looking into possible scenarios that could be causing this, I wanted to check what the SERPs looked like for the terms we are still holding position on.
Typing "extending dining tables" into Google UK today, I was amazed at what I found...
Ranking in position 1 and 2 is a massive UK furniture store.
But isn't that the same landing page being returned for both positions? It appears to be a navigation problem within the site's category tags causing duplicate content. However, they have been rewarded with the top two positions, subsequently pushing our website onto page two.
I find it so frustrating that we follow Google's best practices when it comes to pagination issues, yet this is how our hard work is rewarded!
Anyone else have any thoughts about this?
-
Pleasure. Shout if I can help!
-
Fantastic. Thank you very much. Interestingly, this website is hosted on a different platform to our others, so I wonder whether this has something to do with the config. We'll set up 301s for w. and ww. as a short-term fix and look at the config going forward.
Many thanks again.
-
Hey, I think I have spotted something:
Google this:
portland clic-clac sofa bed
& closely check the result:
http://ww.franceshunt.co.uk/live/sofa-beds/portland-clic-clac-sofa-bed.html
ww not www
Also, we have another version of that page indexed:
1. info:ww.franceshunt.co.uk/live/sofa-beds/portland-clic-clac-sofa-bed.html
2. info:www.franceshunt.co.uk/live/sofa-beds/portland-clic-clac-sofa-bed.html
So, you have something whack going on with your subdomains.
Digging a bit deeper:
site:franceshunt.co.uk/live/sofa-beds/portland-clic-clac-sofa-bed.html
This shows that we have not only some ww. & www. results, we also have pages being returned on:
- w.
- ww.
- www.
- www.w.
These are all the clic-clac sofa bed pages, so that most likely explains that one away and could well be at the root of your other problems.
I quickly checked the obvious, and you do 301 franceshunt.co.uk to www.franceshunt.co.uk. But if we do a general indexation query:
site:franceshunt.co.uk
We see all kinds of weirdness; for the homepage alone (again, checking very quickly) we have versions indexed and resolving on several of these subdomain variations.
So... it's not too hard to assume you may have lost a little bit of trust here through duplicate versions of the page.
It obviously needs a bit more digging around, but this should be easily fixed with a 301 for all these variations to www., plus a double-check across the board and of your internal linking, to figure out just how this has happened and why it resolves on those wacky subdomains.
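The logic the fix needs to implement is simple; here's an illustrative sketch in Python (the host name is from this thread, everything else is hypothetical, not your actual server code):

```python
CANONICAL_HOST = "www.franceshunt.co.uk"

def redirect_for(host: str, path: str = "/"):
    """Return the 301 target for a request, or None if the host is already canonical.

    Any non-canonical host (w., ww., www.w., anything-else.) gets sent
    to the same path on the canonical www. host.
    """
    if host.lower() == CANONICAL_HOST:
        return None
    return f"http://{CANONICAL_HOST}{path}"
```

In other words: one rule, keyed on the Host header, preserving the requested path.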
I didn't find an indexed copy of:
if-we-create-duplicate-versions-of-the-site-do-we-get-more-serp-share.franceshunt.co.uk
...but it resolves, so it seems the site will resolve on any subdomain. So we have two main issues:
1. The virtual host is wrongly configured, allowing the site to resolve (and potentially rank) on anything.franceshunt.co.uk - a competitor could use this to harm you!
2. There are variations already indexed that you need to take care of. A wildcard (*.) rule that 301s anything other than www. to the www. version of the page should, given a bit of time for reindexation etc., do the job (or at least help - who's to say we don't have multiple issues here).
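I can't see your server config, but if you happen to be on Apache with mod_rewrite (an assumption - adjust for your actual stack), a sketch of that catch-all rule might look like:

```apache
# Hypothetical .htaccess sketch: 301 any non-www host
# (w., ww., www.w., anything.*) to the canonical www. version,
# preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.franceshunt\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.franceshunt.co.uk/$1 [R=301,L]
```

You'd still want to stop the wildcard virtual host / DNS from answering for arbitrary subdomains in the first place, but this at least consolidates anything already indexed.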
Hope it helps and please let me know how it works out!
Marcus
-
First of all, thanks very much for taking the time to have a look for us and offer your opinions Marcus, much appreciated.
We are certainly going to be experimenting with the canonical tag in this way moving forward. We've never experienced problems with user interaction on the site since Google decided to start ranking the "show all" version of the pages instead, so we've never really worried too much about it until now.
The worst hit was another non-competitive term, "clic clac sofa bed" - we grew it steadily from 10th position back in Feb, it was 3rd last week (!), and it is no longer ranking at all! The page that was ranking is: http://www.franceshunt.co.uk/live/sofa-beds/
When this campaign began back in the old days of yore we were still using free directories for optimisation of deep pages. I've read a lot about these being slowly de-indexed by Google, so I was wondering if this was having an adverse impact on some of the "weaker" pages. As you can see, though, there has been no off-site optimisation towards this page - it's a pretty new term (only added to the campaign in Feb) - so I'm discounting that theory, for now!
-
Hey
First up, you have rel=next & prev on the paginated pages, so that's good, but I would also use rel=canonical to the view-all page as described here:
http://googlewebmastercentral.blogspot.co.uk/2011/09/view-all-in-search-results.html
The view-all page in this category is not huge and loads nice and quickly, so I can't see any reason not to 'help' Google and give them the indication that this is where you want all rankings for those pages to be concentrated.
As always, experimentation is needed, but I see things like this:
- You have a view-all page, that is the desired page to display, and Google prefers it all by itself.
- You have rel=next & rel=prev set up, which is really for when you want to display individual component pages rather than the main page.
- The search query you are referencing has no intent that makes it more specific to one of the paginated pages, so the ideal landing page is the view-all page.
So, remove the rel=next & rel=prev, canonical it to the view-all page, and see how you get on. Allow it to reindex, record the results, and make a decision based on that information.
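To sketch what that head markup might end up as (using the extending dining tables URLs quoted elsewhere in this thread - purely illustrative, not copied from your source):

```html
<!-- Hypothetical sketch for a paginated page such as
     /dine/extending-dining-tables/?p=2 -->
<head>
  <!-- Every component page points at the view-all version -->
  <link rel="canonical" href="http://www.franceshunt.co.uk/dine/extending-dining-tables/?p=all" />
  <!-- The previous rel="next" / rel="prev" links are removed -->
</head>
```

The view-all page itself would carry a self-referencing canonical (or none), so all the paginated variants consolidate to it.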
As a disclaimer, this may not make any difference to the ranking, as it seems they are not indexing your paginated pages, AND if we do an info: query on the main category page it shows details for the show-all page. That said, this is the correct way to do it unless you would rather show the individual pages, so I would still make the change.
I think when it comes down to it, Harveys just have like 5x as many linking domains as you, and you both have fairly natural-looking anchor text (at the most cursory of views), so they are just outranking you here. I have not dug into the other results between you and them, and a drop from 3 to 11 is a bit more than the usual flutters - is there anything else that has had a similar drop?
-
Thanks Marcus!
Our site is http://www.franceshunt.co.uk/
We have asked a couple of questions before on Moz as to how best to solve the pagination issues within our site.
Google seems to prefer to rank the "show all" version of the targeted landing pages.
So whilst we are optimising http://www.franceshunt.co.uk/dine/extending-dining-tables/
Google prefers to rank http://www.franceshunt.co.uk/dine/extending-dining-tables/?p=all
Which hasn't caused us any problems before, yet now I'm wondering if this could be part of the issue too. Please let us know what you think!
-
We were ranking third before the update for this term.
Surely brand exposure and social signals are related to their number one positioning, but what's with the second result?
This is the same landing page reached through a different navigational path. That is what I'm questioning here.
-
Hmmm, yeah, that kind of sucks. That is the same page, and like you say, it just seems to be tagged as either living room or dining room. Looking at them closely, they are vaguely different - not a lot in it; both are just weak category pages.
Whilst this is an obvious example of something amiss - they should not have the top two spots - I would not waste too much time worrying about it. I imagine this will be a short-lived deal for them.
Can you drop a link to your site? Maybe we can better advise you on what you can control so you can try to win back some footing here?
-
The update went in favour of companies with good brand exposure, so it is possible that Harvey's link profile is a mix of brand and keyword anchor text.
You'll also notice they have 9,000+ Facebook fans; to obtain that they must actively work on social media, so you're also looking at social signals being built - another thing Google is now focusing on.
But I don't really see that keyword being that competitive; you should be able to push through the SERPs.