How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
-
We have www.mysite.net, www.mysite.se, www.mysite.fi, and so on. All of these domains point to the same physical location on our web server, and we swap out the text returned to the client depending on which domain was requested.
My problem is this: how do I configure sitemaps in robots.txt when the same robots.txt file is served for multiple domains? If I, for instance, put the lines
Sitemap: http://www.mysite.net/sitemapNet.xml
Sitemap: http://www.mysite.net/sitemapSe.xml
in robots.txt, would that result in some cross-submission error?
-
Thanks for your help, René!
-
yup
-
Yes, I meant GWT of course :).
A folder for each site would definitely make some things easier, but it would also mean more work every time we need to republish the site or change configurations.
Did I understand that Google link correctly, in that cross-site submission in robots.txt is okay as long as we have verified ownership of all the involved domains in GWT? I guess Google will consider it okay anyway.
-
Actually, Google has the answer right here: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=75712
I always try to do what Google recommends, even though something else might work just as well, just to be on the safe side.
-
You can't submit a sitemap in GA, so I'm guessing you mean GWT.
Whether or not you put it in robots.txt shouldn't be a problem, since in each sitemap the URLs would look something like this:
Sitemap 1: <url><loc>http://yoursite.com/somepage.html</loc></url>
Sitemap 2: <url><loc>http://yoursite.dk/somepage.html</loc></url>
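For reference, a complete minimal sitemap file built around one of those entries might look like this (the domain and page are illustrative, not real):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is the only required child. -->
  <url>
    <loc>http://yoursite.dk/somepage.html</loc>
  </url>
</urlset>
```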
I see no need to filter which sitemap is shown to the crawler. If your .htaccess is set up to redirect traffic from each TLD (top-level domain, e.g. .dk, .com, etc.) to the correct pages, then the sitemaps shouldn't be a problem.
The best solution would be a web within a web: a folder for each site on the server, with .htaccess redirecting to the right folder. In each folder you have a robots.txt and a sitemap for that specific site. That way all your problems will be gone in a jiffy. It will be just like managing 3 different sites, even though it isn't.
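A rough .htaccess sketch of that folder-per-site routing could look like the following (mod_rewrite assumed; the folder and domain names are made up for illustration):

```apache
RewriteEngine On

# Send requests for the Swedish domain into its own folder,
# which holds its own robots.txt and sitemap.
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.se$ [NC]
RewriteCond %{REQUEST_URI} !^/se/
RewriteRule ^(.*)$ /se/$1 [L]

# Same idea for the Finnish domain.
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.fi$ [NC]
RewriteCond %{REQUEST_URI} !^/fi/
RewriteRule ^(.*)$ /fi/$1 [L]
```

The `!^/se/` condition keeps the rule from rewriting its own output in a loop; each domain quietly serves files out of its own folder while the URL the visitor (or crawler) sees stays the same.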
I am no ninja with .htaccess files, but I understand the technology behind them and know what you can do with them. For a how-to guide, ask Google; that's what I always do when I need to goof around in .htaccess. I hope that made sense.
-
Thanks for your response René!
Thing is, we already submit the sitemaps in Google Analytics, but this SEO company we hired wants us to put the sitemaps in robots.txt as well.
The .htaccess idea sounds good, as long as Google or anyone else doesn't think we are committing some cross-site submission error (as described here: http://www.sitemaps.org/protocol.php#submit_robots).
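The cross-submission rule at that link essentially comes down to host matching: a robots.txt may only reference a sitemap on another host if ownership of both hosts is verified. As a quick self-check (just a sketch, not an official tool; the function name and sample data are invented), one could parse a sitemap and flag any `<loc>` hosts that differ from the domain whose robots.txt references it:

```python
from urllib.parse import urlparse
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def foreign_hosts(robots_host, sitemap_xml):
    """Return the set of <loc> hosts that differ from the host serving
    robots.txt -- each one is a potential cross-submission to verify."""
    root = ET.fromstring(sitemap_xml)
    hosts = {urlparse(loc.text.strip()).netloc
             for loc in root.iter(SITEMAP_NS + "loc")}
    return {h for h in hosts if h != robots_host}

# Sample sitemap using the (hypothetical) .se domain.
sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.mysite.se/start.html</loc></url>
</urlset>"""

print(foreign_hosts("www.mysite.net", sitemap))  # {'www.mysite.se'}
print(foreign_hosts("www.mysite.se", sitemap))   # set()
```

Any host this prints for your real robots.txt is one you would want verified in GWT before listing the sitemap there.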
-
I see no need to use robots.txt for that. Use Google's and Bing's webmaster tools: there you have each domain registered and can submit a sitemap for each one.
If you want to make sure that your sitemaps are not crawled by a bot for the wrong language, I would set up the .htaccess to test for the entrance domain and redirect to the right file. A bot enters a site just like a browser does, so it has to obey the server; if the server tells it to go somewhere, it will.
robots.txt can't, by itself, do what you want; the server can. But in my opinion, using the Bing and Google webmaster tools should do the trick.
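A sketch of what that entrance-domain test could look like in .htaccess (mod_rewrite assumed; the robots-*.txt file names are hypothetical):

```apache
RewriteEngine On

# Hand each crawler the robots.txt for the domain it actually requested.
RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.se$ [NC]
RewriteRule ^robots\.txt$ /robots-se.txt [L]

RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.fi$ [NC]
RewriteRule ^robots\.txt$ /robots-fi.txt [L]
```

With rules like these, a request for http://www.mysite.se/robots.txt is answered internally with robots-se.txt, so each domain can declare only its own sitemap.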