Two identical websites need ranking locally
-
Hi
Wondering if someone can advise.
We have two websites, one on a .ie domain and one on a .co.uk domain (e-commerce stores).
The websites are identical, so we need to address the duplicate content issue. The complication is that we are targeting both local versions of Google: google.ie and google.co.uk.
Obviously, to handle the duplicate content we are going to have to rel=canonical one of the websites, which will probably be the .ie domain.
The question is: will this affect the ranking of the .ie domain on google.ie?
Any advice on how anyone else handles this situation would be greatly appreciated. We have had no issue ranking before with one domain on a local search engine, but this is the first time we have come across needing to rank two domains, with identical content, on each local search engine.
Thanks in advance
John
-
Hi Andy
Yes, I hear you loud and clear. I think treating everything on the two websites as different entities is actually the way to go: different URL structures, subtle changes to headings, etc., and trying our best to mix the content up, even if we need to add to it (although I will be breaking my golden rule of not creating content for clients' websites, my real pet hate at the web development stage).
Thanks again for your help and advice, really appreciated, and for your kind offer of help. I may take you up on it, so expect a PM soon offering copious amounts of Guinness in return for advice.
Kind Regards
John
-
Yeah! Sleep, what's that?
In that case, try to optimise things like the page titles to the country, and the description too if you can amend that somehow. Maybe slightly change the way URLs are output (one with the brand name, one without). The other thing you could do is add microformats on the site you are preferring over the other.
Something I did find a while back is that if you have two very similar sites and you optimise them in different ways (e.g. with the microformats above), then you can in some ways get an advantage over competitors by blocking out listings.
Hope that helps. Feel free to message me more detail, such as URLs, and I can take a look and see what else I can advise.
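As a concrete illustration of the structured-markup idea above: the "microformats" suggestion usually means product rich-snippet markup on the preferred site. A minimal sketch, expressed here as schema.org JSON-LD built in JavaScript (a newer alternative to classic microformats; all product values below are invented for illustration):

```javascript
// Sketch of schema.org Product markup for the preferred site,
// serialised as JSON-LD. All values below are invented examples.
var productMarkup = {
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "EUR",
    "availability": "https://schema.org/InStock"
  }
};

// This string would go inside a <script type="application/ld+json"> tag
// in the page's <head>.
var jsonLd = JSON.stringify(productMarkup, null, 2);
console.log(jsonLd);
```

The point of marking up only one of the two sites is exactly what Andy describes: the two otherwise-similar sites end up optimised in visibly different ways.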
-
Hi Andy
Thanks for the reply.
Yes, we did suggest using a .com. The client has the .com as well, but prefers to use the local domains, as there will be a small number of bespoke pages relating to each country, though by and large the same content on each website.
We even suggested subdomains of the .com with a JS redirect for each country, but we have to work with .co.uk and .ie.
We had also considered mixing up the content descriptions for each product so they wouldn't be exactly identical. Unfortunately, after receiving the product descriptions, there is little or nothing in them that would allow us to make each one unique to each website.
But yes, I do take your point regarding large manufacturers who have a local site for each country with basically the same description for each product; they don't have a duplicate content issue affecting their rankings, but lots of other factors contribute to that, I imagine.
I love this quote: "if you can't, don't lose sleep and optimise elsewhere like hell". If only!
Regards
John
-
It's a little tricky to answer the question, for a few reasons. But first, there is the question of whether you could buy the .com and target both countries from the one site, and avoid this issue altogether?
Other than that, it may be worth considering some mixing and twisting of the content as well as using rel=canonical. Google doesn't always see duplicate product descriptions across sites as duplicate content; if it did, no page with a manufacturer's description would rank. This translates to: "if you can write your own, great! If you can rewrite a bit or twist it, OK... if you can't, don't lose sleep and optimise elsewhere like hell!"
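To make the rel=canonical part concrete: a cross-domain canonical is just a link element in the head of each duplicate page pointing at its twin on the preferred domain. A minimal sketch in JavaScript (the helper and domain names are invented for illustration; in practice you would emit this server-side in each page template):

```javascript
// Build a cross-domain canonical tag for a duplicate page.
// If the .co.uk site is chosen as the preferred version, every page
// on the .ie site would emit a canonical pointing at its .co.uk twin.
function canonicalTag(path, preferredDomain) {
  return '<link rel="canonical" href="https://' + preferredDomain + path + '" />';
}

console.log(canonicalTag('/products/widget-123', 'www.example.co.uk'));
// <link rel="canonical" href="https://www.example.co.uk/products/widget-123" />
```

Note this is exactly the trade-off raised in the question: the canonical asks Google to consolidate ranking signals onto the .co.uk URLs, so the canonicalised .ie pages would tend to drop out of google.ie results.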
Hope that helps, though it could just throw up more questions.
Related Questions
-
Lazy Loading of products on an E-Commerce Website - Options Needed
Hi Moz fans. We are in the process of re-designing our product pages and we need to improve the page load speed. Our developers have suggested that we load the associated products on the page using lazy loading. While I understand this will certainly have a positive impact on the page load speed, I am concerned about the SEO impact. We can have upwards of 50 associated products on a page, so we need a solution. So far I have found the following solution online, which uses lazy loading and escaped fragments. The concern here is that we would be serving an alternate version to search engines. The solution was developed by Google not only for lazy loading, but for indexing AJAX content in general.
Intermediate & Advanced SEO | JBGlobalSEO
Here's the official page: Making AJAX Applications Crawlable. The documentation is simple and clear, but in a few words the solution is to use slightly modified URL fragments.
A fragment is the last part of the URL, prefixed by #. Fragments are not propagated to the server; they are used only on the client side to tell the browser to show something, usually to move to an in-page bookmark.
If instead of using # as the prefix you use #!, this instructs Google to ask the server for a special version of your page using an "ugly" URL. When the server receives this ugly request, it's your responsibility to send back a static version of the page that renders an HTML snapshot (the not-indexed image, in our case). It seems complicated, but it is not. Let's use our gallery as an example. Every gallery thumbnail has to have a hyperlink like:
http://www.idea-r.it/...#!blogimage=<image-number>
When the crawler finds this markup, it will change it to:
http://www.idea-r.it/...?_escaped_fragment_=blogimage=<image-number>
Let's take a look at what you have to answer on the server side to provide a valid HTML snapshot.
My implementation uses ASP.NET, but any server technology will be good.

```csharp
var fragment = Request.QueryString["_escaped_fragment_"];
if (!String.IsNullOrEmpty(fragment))
{
    var escapedParams = fragment.Split(new[] { '=' });
    if (escapedParams.Length == 2)
    {
        var imageToDisplay = escapedParams[1];
        // Render the page with the gallery showing
        // the requested image (statically!)
        // ...
    }
}
```

What's rendered is an HTML snapshot, that is, a static version of the gallery already positioned on the requested image (server side).
To make it perfect we have to give the user a chance to bookmark the current gallery image.
90% comes for free; we only have to parse the fragment on the client side and show the requested image:

```javascript
if (window.location.hash)
{
    // NOTE: remove the initial #
    var fragmentParams = window.location.hash.substring(1).split('=');
    var imageToDisplay = fragmentParams[1];
    // Render the page with the gallery showing the requested image (dynamically!)
    // ...
}
```

The other option would be to look at a recommendation engine to show a small selection of related products instead. This would cut the total number of related products down. The concern with this one is that we are removing a massive chunk of content from the existing pages; some of it is not the most relevant, but it's content. Any advice and discussion welcome 🙂
-
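A side note on the lazy-loading question above: one pattern that avoids serving an alternate version to search engines altogether is to ship the full product markup in the HTML and defer only the image loading. A generic sketch, not from the original post (assumes `<img data-src="...">` placeholders and IntersectionObserver support in the browser):

```javascript
// Ship the full product markup in the HTML so crawlers see it,
// and defer only the image bytes: <img data-src="..."> placeholders
// are activated when the tile scrolls into view.
function activateImage(img) {
  if (img.dataset && img.dataset.src) {
    img.src = img.dataset.src;
    delete img.dataset.src;
  }
  return img;
}

// Browser wiring (guarded so it is skipped outside a browser):
if (typeof IntersectionObserver !== 'undefined') {
  var observer = new IntersectionObserver(function (entries) {
    entries.forEach(function (entry) {
      if (entry.isIntersecting) {
        activateImage(entry.target);
        observer.unobserve(entry.target);
      }
    });
  });
  document.querySelectorAll('img[data-src]').forEach(function (img) {
    observer.observe(img);
  });
}
```

Because the product names, links, and descriptions are in the initial HTML, nothing content-wise is hidden from crawlers; only the image requests are deferred.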
Need to put static text on every page (600 words), need advice
I need to put static text (a brief about our company, 600 words) in the content section of every page of our website. I know it's bad for SEO (duplicate content), but I need to tell Google that this is my static content and it should NOT crawl it, or something like that. Canonical applies to the whole page, but I need to set it up for certain portions of every page. Is that possible?
Intermediate & Advanced SEO | nopsts
-
Homepage not ranking in Google AU, but ranking in Google UK?
Hey everyone, My homepage has not been ranking for its primary keyword in Google Australia for many months now. Yesterday, when I was using a UK proxy and searching via Google UK, I found my homepage/primary keyword ranked on page 8 in the UK. In Australia my website ranks on page 6, but for other pages on my website (and it always changes from page to page). Previously my page was popping up at the bottom of page 1 and on page 2. I've been trying many things and waiting weeks to see if they had any impact for over 4 months, but I'm pretty lost for ideas now, especially after what I saw yesterday in Google UK. I'd be very grateful if someone has had the same experience or has suggestions on what I should try. I did a small audit on my page, and because the site is focused on one product and features the primary keyword, I took the following steps to try and fix the issue:
- I noticed the developer had added H1 tags in many places on the homepage, so I removed them all to make sure I wasn't getting an over-optimization penalty.
- Cleaned up some of my links because I was not sure if this was the issue (I've never had a warning within Google Webmaster Tools).
- Changed the title tags/H tags on secondary pages to not feature the primary keyword as much.
- Made some pages 'noindex' to try and see if this would take away the emphasis on the secondary pages.
- Resubmitted my XML sitemaps to Google.
- Just recently claimed a local listing in Google (still need to verify) and fixed up citations of my address/phone numbers etc. (however, it's not a local business; it sells Australia-wide).
- Added some new backlinks from AU sites (only a handful though).

The only other option I can think of is to replace the name of the product on secondary pages with a different abbreviation, to make sure that the keyword isn't featured there.
Some other notes on the site: When I do a 'site:url' search, my homepage comes up at the top. The site sometimes ranked for a secondary keyword on the front page in specific locations in Australia (but it goes to a localised city page); I've noindexed these as a test to see if something with localisation is messing it around. I do have links from AU sites, but also links from .com sites and elsewhere. Any tips or advice would be fantastic. Thanks
Intermediate & Advanced SEO | AdaptDigital
-
Can changing G+ authorship on a well-ranking article drop its search ranking?
We have an article that ranks #1 in Google SERP for the keyword we want it to rank for. We decided to revise the article because although it's performing well, we knew it could be better and more informative for the user. Now that we've revised the content, we're wondering: Should we update the article author (and the G+ authorship markup) to reflect that the revisor authored the content, or keep the original author listed? Can changing G+ authorship on an article impact its search ranking, or is that an issue that's a few Google algorithm updates down the road?
Intermediate & Advanced SEO | pasware
-
Losing Ranking
Hi, I've been working on a real estate website for the last 2 years. I had top positions in both local and organic search results, but over the last 2 months I've been losing rankings for some keywords in both local and organic search. I'm doing bookmarking, guest posting related to real estate, and social media promotion, but getting no results; that's why I'm looking for an SEO person for my website. Thanks, and waiting for your feedback
Intermediate & Advanced SEO | KLLC
-
Rankings dropped - Why?
At the end of November a client site dropped significantly in the rankings. The drop affected almost all the keyphrases we monitor. Historically the homepage has always ranked higher than the sub-pages; however, now it seems Google is no longer ranking the home page, and is instead ranking the sub-pages, just far, far lower down. Any ideas what could cause this?
Intermediate & Advanced SEO | cottamg
-
Purpose of a Blog in a website
How is an internal or external blog helpful for SEO? Why is it good to have a site with a blog?
Intermediate & Advanced SEO | Alick300
-
How does a competing website with clearly black hat style SEO tactics, have a far higher domain authority than our website that only uses legitimate link building tactics?
Through Moz's link analysis tools, we looked at a competing website's external followed links and discovered a large number of links going to blog pages on domains with authorities in the 90s (the blog pages' own authorities were between 40 and 60); however, the single blog post written by this website was exactly the same in every instance and had been posted in August 2011. Some of these blog sites had 160 or so links linking back to this competing website, whose domain authority is 49 while ours is 28; their MozTrust is 5.43 while ours is 5.18. Examples of the blogs that link to the competing website: http://advocacy.mit.edu/coulter/blog/?p=13 http://pest-control-termite-inspection.posterous.com/ However, many of these links are "nofollow" and yet still show up in Open Site Explorer as some of this competing website's top linking pages. Admittedly, they have 584 linking root domains while we have only 35, but if most of them are the kind of websites posted above, we don't understand how Google is rewarding them with a higher domain authority. Our website is www.anteater.com.au. Are these tactics now the only way to get ahead?
Intermediate & Advanced SEO | Peter.Huxley59