Concerned
-
Hi SEOmoz fans,
Hang on a minute, I sound like Rand; I've been watching too many WBFs.
OK, let me start. I currently do all the marketing for a website that, for legal reasons, exists on both a .co.uk and a .ie domain. Both sites have exactly the same content, and whenever I write new pages (which is on a regular basis) I make sure both are updated. From all my research there shouldn't be a duplicate content issue with this, as Google recognises that it's effectively the same site. However, I was ranking really well (no. 5) for a specific, very competitive keyword, and now the page ranking for that keyword is a .pdf on the site, sitting at no. 68.
Now, I thought this was very strange, as you can imagine. I never do any black hat link building or anything like that; that's a no-no for me. Anyway, I pasted the URL that had been ranking well into the Google search box, and yes, it appeared, so there's no sign the URL has been banned. However, when I paste the first few paragraphs of that page into Google.co.uk, it's the .ie website that appears, not the .co.uk.
Can anyone help me out with any advice?
Kind Regards
-
Sorry Peter, my previous reply looks strange. I was using the iPad, not sure what happened. Anyway, what I meant to say was:
So just to clarify, the homepage of the website would show:
And product page would be:
-
So just to clarify, the homepage of the website would show: And product page would be:
-
Argh - I'm sorry, yes. The hreflang="" code is the same, but the URL is the cross-language version of that URL. As long as the URL structure stays the same, this shouldn't be too hard, but if you use different structures, it could be a pain. I'm editing my previous reply.
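To illustrate (a rough sketch only; example.co.uk and example.ie and the product path are placeholder names, not the real site): a product page on either site would carry the same pair of hreflang tags, with each href pointing at that page's counterpart URL in the other region.

```html
<!-- In the <head> of http://www.example.co.uk/products/widget
     AND of http://www.example.ie/products/widget -
     the same block works on both, as long as the URL structure matches -->
<link rel="alternate" hreflang="en-GB" href="http://www.example.co.uk/products/widget" />
<link rel="alternate" hreflang="en-IE" href="http://www.example.ie/products/widget" />
```

If the two sites used different URL structures, each page would need its own hand-matched pair, which is where it becomes a pain.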
-
Just one more question
Example
If this is on a particular product page does it have to be :
-
Thanks Peter, you have been a great help so far.
I will make these changes and let you know how I get on.
-
No - I'll be perfectly honest: I don't do a ton of international. The international SEOs I trust seem to think positively about the new tags, but we don't have a ton of data. The upside is that they're relatively easy to implement and they don't carry any real risk. The worst that happens is that it doesn't work.
My gut reaction is that there's regional confusion and Google is having a tough time reconciling duplicates. That's more in line with the inconsistent ranking you describe than a full-blown penalty would be.
-
Ah! fantastic.
Have you tried this before? Do you recommend putting this across the whole site?
Another thing I noticed: when I paste the first paragraph from one .co.uk webpage into Google.co.uk, it's the .ie webpage that appears; however, for another webpage on the .co.uk site, it's the .co.uk version that appears in Google.co.uk. Hope that makes sense? What I would say is that if I paste the URL of the page in question (the one that's no longer ranked) into Google.co.uk, it still appears.
-
The two sites should point at each other and use the region codes, so...
(1) The English site should have this tag:
(2) The Irish site should have this tag:
That way, whichever site Google hits, they're aware of the other site(s).
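As a sketch of what those homepage tags might look like (again, example.co.uk and example.ie are placeholders for the actual domains), both sites would carry both tags, each with the appropriate region code:

```html
<!-- Placed in the <head> of BOTH homepages -->
<link rel="alternate" hreflang="en-GB" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="en-IE" href="http://www.example.ie/" />
```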
-
Hi Peter,
The .ie website is not shown in Google.co.uk for the target keyphrase. However, when I pasted the first paragraph of the page that was ranked on page 1 for that keyphrase into Google.co.uk, it's the .ie website that appears; the .co.uk website is nowhere to be seen.
I have been doing some link building, but nothing excessive, and only on authoritative, industry-specific websites. I just don't feel it could be this, so the only thing left is that this webpage has been penalised for duplicate content, even though the .co.uk page was indexed before the .ie webpage.
The strange thing is, I am still ranking really well, top 5, for about 30 or so keywords, very competitive keywords at that. So why would Google penalise just that specific webpage and not the others? Argh, this is really getting to me.
Do you recommend that I place this code on the .ie webpage:
Pointing to the .co.uk website?
-
This can get tricky. Rel-canonical passes link juice, but it could also prevent the .ie pages from ranking. Google is a bit inconsistent with this internationally; sometimes a non-canonical version will still rank if it's more relevant to the country/language of the query, but I'd hate to rely on that.
-
No. You use robots.txt to restrict pages from being crawled; rel-canonical passes link juice. However, I would also look into what Dr. Peter is suggesting.
-
Unfortunately, while you should be able to theoretically target .co.uk and .ie separately, Google can screw it up on occasion and treat them as duplicates. If you're seeing the copy bring up the .ie site on Google.co.uk, that's definitely a possibility. You could try the new hreflang approach - see this Google resource:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
It's basically for regional content where the language is the same (there are other variants, but that's a big one), since Google knows they don't always get it right.
It is also possible that the .co.uk page has been penalized and other content is just being brought in to fill the spot - since the PDF is at #68, that's also possible. Have you done any recent link-building pushes to this particular page?
-
Hi Donnie,
If I use a rel="canonical" on the .ie webpage, isn't that telling Google that I don't want this page to rank?
-
Hmm, your link juice will flow through with a canonical tag. However, I don't think this is the problem in your case. I would experiment: try adding the tag and see if your results are back up in a few days. If not, take the tag down.
-
I don't want to use a rel="canonical" as I want the .ie website to rank well for all keywords in Google.ie, and at the moment that seems to be the case.
-
Yes, maybe the .pdf was always there.
All optimization tests were done before the pages went live.
Changes were made first and foremost for the user, and the results I gave you clearly show that was a success.
It was the main body text and structure that changed; header tags etc. all remained the same.
I checked Bing, and the URL in question is still on page 1 for the keyword.
-
Maybe the .pdf was always there, just unnoticed?
Perhaps it's something you changed on the page: did you run an SEOmoz on-page optimization test?
What did you change on the page? Also, did you change any internal links pointing at that page? If it's none of these factors, it could also be an external linking factor.
-
Hi Donnie,
Thanks for your reply.
Yes, it was ranked no. 5, and now it's gone, replaced by the .pdf at position 68 for that phrase.
I have rewritten this page and the pages related to it. What I have seen in Analytics since making these changes is that bounce rate has improved from 70% to 30%, average time on site has increased by 2 minutes, and page views have also increased. So from a user-experience standpoint it has worked as I imagined; with Google, it has not.
I have checked Google webmaster tools, no messages.
-
Hi Gary,
Do you have a rel="canonical" on the .ie version of the site that points to the .co.uk pages? It basically tells the bot that this site is a direct copy and the .co.uk is the one to crawl. It may be that because the content is duplicated across two different domains, you are getting penalised for it.
More about rel="canonical" here: http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
Also a WBF about cross-domain canonical links: http://www.seomoz.org/blog/cross-domain-canonical-the-new-301-whiteboard-friday
Hope this helps.
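For what it's worth, a cross-domain canonical of that kind would look something like this (a sketch only; the domains and path are hypothetical placeholders, and as discussed elsewhere in this thread it may well keep the .ie page out of the index):

```html
<!-- In the <head> of http://www.example.ie/products/widget,
     declaring the .co.uk page as the version to index -->
<link rel="canonical" href="http://www.example.co.uk/products/widget" />
```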
-
I am a little confused: what made you lose your spot-5 ranking? Did you move your page to a .pdf? How is the .pdf relevant to your keyword? Or was one page ranking at no. 5 and now it's gone, but you found your .pdf in spot 68 for that phrase?
Did you change anything on the page that was ranking, or elsewhere on your site? Usually something causes a loss in rankings, especially when you go from spot 5 to nowhere to be found. Have you checked Google Webmaster Tools? There may be a message there.