Domain name with separated/non-separated keywords
-
I'm launching a new webshop for spices and coffee within a month, and I'm thinking about which domain name to take. I would like to get visitors from coffee and spice keyword searches.
How much does it matter (in terms of SEO) if I use spiceandcoffee instead of spice-and-coffee? (The site will be Hungarian, and it seems easy to remember without the hyphen: fuszer-es-kave or fuszereskave.)
Does Google give more weight to separated keywords in a domain than to non-separated ones?
-
Thank you for the comments. I was about to go with the non-separated version, and now I'm sure about it.
Thanks once more
-
Thank you, Alan, glad you agree.
-
Simon has said it all: hyphens are hard to communicate and look spammy.
-
Hi Zoltan
A good question.
The general consensus is that Google no longer places much weight on keywords in domains (some, sure, but not much). So bearing that in mind, I'd suggest going for a domain that is user-friendly and describes your business.
'spiceandcoffee' sounds ideal; it does what it says on the tin!
Domains without hyphens are better in my opinion: they are easier to communicate, remember (as you said) and type. So avoid hyphens if you can and stick with e.g. 'spiceandcoffee'.
(Hyphens are treated the same as spaces by search engines, but search engines can also easily identify individual words that are run together, as in your suggestion of 'spiceandcoffee', so there is no need for hyphens.)
In short: no difference for search, just better usability without hyphens.
Hope that helps,
Regards
Simon
Related Questions
-
When to use mod rewrite / canonical / 301 redirect
Hello, I have taken over the management of a site which has a big problem with duplicate content. The duplicate content is caused by two things. The first is upper- and lower-case URLs, e.g. www.mysite.com/blog and www.mysite.com/Blog. The other is the use of product filters / pagination, which means you can reach the same 'page' via different filters; the filters generate separate URLs:
http://www.mysite.com/casestudy
http://www.mysite.com/casestudy/filter?page=1
http://www.mysite.com/casestudy/filter?solution=0&page=1
http://www.mysite.com/casestudy?page=1
http://www.cpio.co.uk/casestudy/filter?solution=0
Am I right to assume that for the case-sensitive URLs I should use a 301 redirect, because I only want the lower-case page to be shown? For the issue with dynamic URLs, should we implement a mod_rewrite and 301 to one page? Any advice would be greatly appreciated.
Technical SEO | Barques-Design
Mat
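For the case-sensitivity half, a minimal sketch of the usual Apache approach, assuming the site runs Apache with mod_rewrite (note the RewriteMap line must live in the server or vhost config, not in .htaccess; "lc" is just a placeholder map name):
# Server/vhost config: define a lower-casing map
RewriteMap lc int:tolower
# Vhost: 301 any URL containing upper-case letters to its lower-case form
RewriteEngine On
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^/?(.*)$ /${lc:$1} [R=301,L]
For the filter URLs, the usual fix is a rel="canonical" on each filtered variant pointing at the base page, e.g. <link rel="canonical" href="http://www.mysite.com/casestudy" /> in the page head, so the variants consolidate rather than compete.
-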
Blog separate from Website
One of my clients has a well-established website and a well-established blog, each with its own domain. Is there any way to move the blog to his website domain without losing the SEO and links that he has built up over time?
Technical SEO | EchelonSEO
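A minimal sketch of the standard approach (301 redirects mapping each old blog URL to its new home under the main domain), assuming the old blog domain runs Apache; the hostnames below are placeholders:
# .htaccess on the old blog domain: map every path into /blog on the main site
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?oldblog\.example$ [NC]
RewriteRule ^(.*)$ http://www.clientsite.example/blog/$1 [R=301,L]
Keeping the URL slugs identical on the new site means each old post 301s to its exact counterpart, which is what preserves the accumulated links.
-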
Forwarding kw rich domains to main domain
Hi, my client has a clutch of keyword-rich domains that they want to point at the main domain. Apart from being good for promotional reasons, is there any SEO benefit to doing so? (I know there used to be, years ago, but I'm under the impression there hasn't been any benefit for a long while.) Most importantly, though, can any harm come from doing this? Best regards, Dan
Technical SEO | Dan-Lawrence
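For reference, the forwarding itself is usually a whole-domain 301; a minimal Apache sketch with a placeholder hostname:
# Vhost for each keyword domain: permanently forward everything, path included
Redirect 301 / http://www.mainsite.example/
A 301 (rather than a 302 or domain masking) is the variant that consolidates the domains cleanly and avoids creating duplicate-content mirrors.
-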
Best way to create a shareable dynamic infographic - Embed / Iframe / other?
Hi all, After searching around, there doesn't seem to be any clear agreement in the SEO community on the best way to implement a shareable dynamic infographic for other people to put into their sites, i.e. one that will pass credit for the links to the original site. Consider the following example for the web application that we are putting the finishing touches on: the underlying site has a number of content pages that we want to rank for. We have created a number of infographics showing data overlaid on top of a Google map. The data continuously changes, and there are JavaScript files that have to load in order to achieve the interactivity. There is one infographic per page on our site, and there is a link at the bottom of the infographic that deep-links back to each specific page on our site. What is the ideal way to implement this infographic so that the maximum SEO value is passed back to our site through the links? In our development version we have copied the YouTube approach and implemented this as an iframe, e.g. <iframe height="360" width="640" src="http://www.tbd.com/embed/golf" frameborder="0"></iframe>. The link at the bottom of that then links to http://www.tbd.com/golf. This is the same approach that YouTube uses; however, I'm nervous that the value of the link won't pass from the sites that are using the infographic. Should we do this as an embed object instead, or some other method? Thanks in advance for your help. James
Technical SEO | jtriggs
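One common pattern (a sketch, not a definitive answer): the link at the bottom of the infographic lives inside the iframe's own document, so it credits the iframe src rather than sitting on the embedding page; pairing the iframe with a plain anchor in the distributed embed code puts a crawlable link directly in the host page's HTML. Using the example URLs from the question:
<!-- Embed code handed to other sites: iframe for interactivity, anchor for the crawlable link -->
<iframe height="360" width="640" src="http://www.tbd.com/embed/golf" frameborder="0"></iframe>
<p>Source: <a href="http://www.tbd.com/golf">Golf data infographic</a></p>
-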
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:
User-agent: *
Disallow: /
for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | fthead9
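Worth noting as an aside: robots.txt is fetched per hostname, so a file served only at staging.domain.com/robots.txt cannot block www.domain.com. An alternative that avoids editing every page is a site-wide noindex response header on the staging host; a minimal sketch, assuming Apache with mod_headers enabled:
# .htaccess (or vhost) on the staging host only: noindex everything via a header
Header set X-Robots-Tag "noindex, nofollow"
-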
Redirecting blog.mydomain.com to www.mydomain.com/blog
This is more of a technical question than pure SEO per se, but I am guessing that some folks here may have covered this, so I would appreciate any suggestions. I am moving from a WordPress.com-hosted blog to a WordPress installation on my own server (as suggested by folks in another thread here). As part of this I want to move from the format blog.mydomain.com to www.mydomain.com/blog. I have installed WordPress on my server and have imported posts from the hosted site. How should I manage the transition from the first format to the second? I have a bunch of links on Facebook, etc. that refer to URLs in the blog.mydomain.com format, so it's important that I redirect. I am running DotNetNuke/WordPress on my own IIS/ASP.NET servers. Thanks. Mark
Technical SEO | MarkWill
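Since the poster mentions IIS, a minimal sketch of a host-based 301 using the IIS URL Rewrite module (web.config on the site that answers for blog.mydomain.com; the rule name and hostnames are placeholders matching the question's own):
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301 every blog.mydomain.com URL to the same path under www.mydomain.com/blog -->
        <rule name="BlogSubdomainToPath" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^blog\.mydomain\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.mydomain.com/blog/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
If the WordPress permalinks on the new install keep the old slugs, each post then 301s straight to its counterpart.
-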
External Links from own domain
Hi all, I have a very weird question about external links to our site from our own domain. According to GWMT we have 603,404,378 links from our own domain to our domain (see screen 1). We noticed when we drilled down that these come from disabled sub-domains like m.jump.co.za. In the past we used to redirect all traffic from sub-domains to our primary www domain, but it seems that for some time Google had access to crawl some of our sub-domains; in December 2010 we fixed this so that all sub-domain traffic redirects (301) to our primary domain. Example: http://m.jump.co.za/search/ipod/ redirected to http://www.jump.co.za/search/ipod/. The weird part is that the number of external links kept on growing and is now sitting at a massive number. On 8 April 2011 we took a different approach: we created a landing page for m.jump.co.za, and all other requests generated 404 errors. We added all the directories to the robots.txt and we also manually removed all the directories from GWMT. Now, 3 weeks later, the number of external links just keeps on growing. Here are some stats:
11-Apr-11 - 543 747 534
12-Apr-11 - 554 066 716
13-Apr-11 - 554 066 716
14-Apr-11 - 554 066 716
15-Apr-11 - 521 528 014
16-Apr-11 - 515 098 895
17-Apr-11 - 515 098 895
18-Apr-11 - 515 098 895
19-Apr-11 - 520 404 181
20-Apr-11 - 520 404 181
21-Apr-11 - 520 404 181
26-Apr-11 - 520 404 181
27-Apr-11 - 520 404 181
28-Apr-11 - 603 404 378
I am now thinking of cleaning the robots.txt, re-including all the excluded directories in GWMT, and seeing if Google will be able to get rid of all these links. What do you think is the best solution to get rid of all these invalid pages? (Screenshots: moz1.PNG, moz2.PNG, moz3.PNG)
Technical SEO | JacoRoux
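An aside on the mechanics: blocking the old sub-domain directories in robots.txt prevents Google from recrawling them, so it never re-sees the 301s or 404s, which is one reason stale link counts linger. The catch-all redirect the poster describes is typically a host-based rule; a minimal sketch, assuming Apache with mod_rewrite on a vhost that answers for all the sub-domains:
# 301 anything that isn't the primary www host to the primary host, path intact
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.jump\.co\.za$ [NC]
RewriteRule ^(.*)$ http://www.jump.co.za/$1 [R=301,L]
-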
Outranking a competitor when their domain name is the keyword
Hi, I'd just like to ask the opinion of my fellow members here. We are currently ranking second for a very important keyword and would obviously like the top spot on the SERP. The site that is ranking first has the domain name as the keyword phrase (along with a good amount of quality links from a variety of domains). Now, I know it is possible to outrank them; I do remember reading about this in one of Rand's posts (I think it was the white hat / black hat one he posted recently). Basically, we have more domain authority, slightly fewer links but from double the number of root domains, and a higher page authority too! Does having the keyword as your domain make THAT much of a difference when we are (imo) quite close in terms of great content and link profiles (and all the on-page factors)? Thanks!
Technical SEO | DanHill