URL Structure "-" vs "/"? Are there any advantages to one over the other?
-
An example would be domain.com/keyword/keyword2 vs domain.com/keyword-keyword2
Are there any advantages / disadvantages to one over the other?
-
Lots of great feedback has been offered. In short, it's up to your personal preference.
I can't help but add a link because I have watched too many Matt Cutts videos (they are starting to auto-play in my head) and he answered your exact question.
-
The /keyword/ format tells the search engine that you have a folder; the hyphenated format does not. If your keyword1 has a lot of sub/related keywords, then creating a folder would be helpful: Google crawls the index files of your folders without being prompted to.
If you don't have a legitimate subcategory, then I would stick with the hyphen.
-
The Beginner's Guide to SEO is a great place to start in terms of how to structure your URLs. I personally avoid the "domain.com/keyword/keyword2" type of URL structure. I'd stick with the latter format, but would make sure not to keyword-stuff. Just keep it simple with one or two keywords, as you mentioned. It's really ugly when you see a URL that looks like http://www.domain.com/keyword-keyword2-keyword3-keyword4-keyword5-keyword6-keyword7; I hate it when those URLs show up in the SERPs. I believe Rand touched on this in The Future of Link Building webinar from a while back.
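If slugs are generated by a CMS, the "one or two keywords" rule can be enforced programmatically. A rough sketch; the helper name and the word cutoff are my own for illustration, not from the guide:

```python
import re

def slugify(title: str, max_words: int = 3) -> str:
    """Build a short, hyphen-separated URL slug from a page title.

    Keeps only the first few words, so a long title never turns into
    the keyword-stuffed URL described above. The max_words cutoff is
    an arbitrary choice for this example.
    """
    # Lowercase, drop anything that isn't a letter, digit, space, or hyphen
    cleaned = re.sub(r"[^a-z0-9\s-]", "", title.lower())
    return "-".join(cleaned.split()[:max_words])

print(slugify("The Beginner's Guide to SEO"))  # the-beginners-guide
```

Truncating at the slug level keeps URLs readable without anyone having to police titles by hand.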
-
Well, your first example would reference two separate directories, whereas the second example keeps everything in a single path segment.
Here's a quick snippet from an article by Rand: http://www.seomoz.org/blog/11-best-practices-for-urls
These last examples have done nearly everything right:
- http://www.discoverohio.com/visitors/map.asp
  Brilliant: it's short, descriptive, static, and obvious.
- http://web.mit.edu/is/usability/usability-guidelines.html
  Despite the subdomain, everything else is near perfect.
- http://www.whitehouse.gov/history/presidents/jk35.html
  I'm letting the White House off the hook for not using "john-kennedy" as the page title, because they've wisely also provided his number (the US's 35th President).
Related Questions
-
A/B testing an entire website vs. SEO issues
I'm familiar with A/B testing variations of a page, but I'd like to A/B test a new design version of an e-commerce site. I'm wondering about the best way to test with SEO concerns in mind. This is what I have in mind right now; any suggestions?

- Use parameters to make version B different from version A.
- Redirect 50% of the users with a 302 (or would JavaScript be a better way?).
- Use noindex on the B pages.
- Use rel="canonical" on the B pages pointing to the A version.
- At the end, 301 redirect all B pages to the A URLs.

PS: We can't use a subdomain, and I don't want to use the robots.txt file to hide the new design from competitors. I'd love any suggestions and tips about it. Thanks, folks 🙂
Technical SEO | SeoMartin10
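The 50% split described in the question above can be done deterministically, so a returning visitor always lands in the same variant. A minimal sketch; the function name and hashing choice are my own, not from any A/B framework:

```python
import hashlib

def assign_variant(user_id: str) -> str:
    """Bucket a visitor into design A or B by hashing a stable ID.

    Hash-based bucketing keeps a returning visitor in the same
    variant without storing any state; the server can then issue
    the temporary 302 to the B URLs only for the "B" bucket.
    """
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return "B" if int(digest, 16) % 2 else "A"
```

The noindex, rel="canonical", and final 301 steps from the question would then apply only to the B pages that this routing exposes.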
HTTP vs. HTTPS: which is better for SEO/usability?
Hi all, firstly thank you for taking the time to look. My dilemma is as follows: I have a WordPress site to which I have added an SSL certificate, and the entire domain is secure. The site has a mix of content, including a blog area and product pages. My question is: what does Google prefer, HTTP or HTTPS, or does it not matter?

As I see it, my options are to keep the entire site as HTTPS and enforce this site-wide, so all non-secure content redirects to the HTTPS version, or to enforce HTTPS just on the cart and/or product pages, with all other content (homepage, blog, about us, contact us, etc.) on HTTP. From an SEO perspective, i.e. the Google search engine, is there a best way to proceed?

Finally, as I currently have both HTTP and HTTPS displaying (i.e. duplicates), what would be the way to fix this? I have the Yoast plugin, so I can set the canonical there, and I can also edit my robots.txt. I have come across this resource (http://www.creare.co.uk/http-vs-https-duplicate-content) and am wondering if this guideline is still correct, or if there is a more current way; if so, I would be grateful if you could point me in the right direction. Thanks in advance.
Technical SEO | Renford_Nelson0
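For the site-wide option, the usual fix for the HTTP/HTTPS duplicate is a single permanent redirect rule. A sketch assuming Apache with mod_rewrite enabled; the question doesn't say which server is in use, so treat this as illustrative:

```apache
# Redirect every non-HTTPS request to its HTTPS counterpart with a
# 301, so the duplicate HTTP URLs consolidate onto the secure version.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

A site-wide 301 like this also makes the canonical question mostly moot, since only the HTTPS URLs remain reachable.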
New "Static" Site with 302s
Hey all, I came across a bit of an interesting challenge recently, one that I was hoping some of you might have had experience with! We're currently in the process of a website rebuild, for which I'm really excited. The new site uses Markdown to create an entirely static site. Load times are fantastic and the code is clean. Life is good, apart from the 302s.

One of the weird quirks I've realized with oldschool, non-server-generated page content is that every page of the site is an index.html file in a directory. The result is that www.website.com/page-title will 302 to www.website.com/page-title/. My solution off the bat has been to be super diligent, try to stay on top of the link profile, and send lots of helpful emails to the staff reminding them how to build links, but I know that even the best-laid plans often fail. Has anyone had a similar challenge with a static site and found a way to overcome it?
Technical SEO | danny.wood1
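If the host allows server configuration, the trailing-slash redirect described above can usually be made a permanent 301 rather than a 302. A sketch assuming Apache with mod_rewrite; static hosts vary, so the directives are illustrative rather than a drop-in fix:

```apache
# When a request maps to a directory but lacks the trailing slash,
# add it with a cacheable 301 instead of a temporary 302, so link
# equity consolidates on the slashed URL.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^(.*[^/])$ /$1/ [L,R=301]
```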
Forum website rel="nofollow": is this good?
Hi, is rel="nofollow" good for a forum website? We have a Q&A site and have set all links to nofollow. Would this be a good approach? Thanks.
Technical SEO | mtthompsons0
Is the " meta content tag" important?
I am currently trying to optimize my company's website, and I noticed that the meta content is exactly the same for all of the pages on our website. Isn't this problematic? The actual content on the pages is not the same, and a lot of the pages don't have these keywords in their content.
Technical SEO | AubbiefromAubenRealty0
Updating content on URL or new URL
Hi Mozzers, we are an event organisation. Every year we produce around 350 events, and all of them are on our website. A lot of these events are held every year, so I have a URL like www.domainname.nl/eventname. What would you do? This URL has some inbound links, some social mentions, and so on. So if the event will be held again in 2013, would it be better to update the content on this URL or create a new one? I would keep this URL and update it, because of the link value and because it is already indexed and ranking for the desired keyword for that event. Cheers, Ruud
Technical SEO | RuudHeijnen0
I have a ton of "duplicate content" and "duplicate titles" on my website; solutions?
Hi, and thanks in advance. I have a JomSocial site with 1000 users. It is highly customized, and as a result of the customization, some of the pages have 5 or more different URLs pointing to the same page. Google has indexed 16,000 links already, and the crawl report shows a lot of duplicate content. These links are important for some of the functionality, are dynamically created, and will continue growing. My developers offered to create rules in the robots file so that a big part of these links don't get indexed, but a Google Webmaster Tools post says the following:

"Google no longer recommends blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can't crawl pages with duplicate content, they can't automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel="canonical" link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to us crawling too much of your website, you can also adjust the crawl rate setting in Webmaster Tools."

Here is an example of the links:

http://anxietysocialnet.com/profile/edit-profile/salocharly
http://anxietysocialnet.com/salocharly/profile
http://anxietysocialnet.com/profile/preferences/salocharly
http://anxietysocialnet.com/profile/salocharly
http://anxietysocialnet.com/profile/privacy/salocharly
http://anxietysocialnet.com/profile/edit-details/salocharly
http://anxietysocialnet.com/profile/change-profile-picture/salocharly

So the question is: is this really that bad? What are my options? Is it really a good solution to set rules in robots so big chunks of the site don't get indexed? Is there any other way I can resolve this? Thanks again! Salo
Technical SEO | Salocharly0
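Following the Webmaster Tools guidance quoted in the question above, each URL variant can declare a single canonical page in its head. A sketch using the question's own example URLs; which variant should be the canonical one is an assumption made here for illustration:

```html
<!-- Placed in the <head> of every profile URL variant, e.g.
     /profile/privacy/salocharly or /profile/edit-profile/salocharly.
     The canonical target below is an assumed choice, not confirmed
     by the question. -->
<link rel="canonical" href="http://anxietysocialnet.com/salocharly/profile" />
```

With this in place the variant URLs can stay crawlable, and search engines fold them into the one canonical page instead of treating them as duplicates.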
How "Optimised" is my home page content
Good afternoon from an overcast, frozen, 1°C Wetherby, UK. I've made a number of on-page HTML markup changes to optimise the page for "steel suppliers" and "steel stockholders", but I'd like to know if there are any other on-page improvements I could make for this page: http://www.barrettsteel.com/. I'm particularly concerned that content is in li tags and not p; could this be an issue? And finally, on the home page a third-party developer has slapped in a header banner pointing to an external site known as Woodberry Tools; that can't be good, can it? Any insights welcome 🙂
Technical SEO | Nightwing0