Is placing content in subdirectories better for SERPs?
-
Hi
For small websites with fewer than 6 pages, is there a benefit to structuring URL paths using keyword-rich subdirectories compared to keeping pages in the root of the site?
For example:
domainname.co.uk/keywordpagename.html
or
www.domainname.co.uk/keyword/keywordpagename.html
Which tends to rank better?
Thanks
-
I would also base it on the site content. Does the content (e.g. a product listing) warrant an extra subdirectory? For example, if you are selling cars, you could have a directory for Honda and then a subdirectory for the specific model, with the keyword "Honda" in the path.
In your case, since the site only has 6 pages, it would not make sense to create subdirectories.
-
Given the size of the proposed site, in my view it's very unlikely that placing an extra keyword in a sub-directory would be of any benefit to rankings.
If the site is going to grow, then it would make sense to nest related content within a number of relevant 'parent' sub-directories; but if it's going to stay at that size, a flat structure is probably going to be equally effective.
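To make the two structures under discussion concrete, here is a minimal sketch (the page filenames and the "insurance" directory are hypothetical examples, not from the original question):

```python
from urllib.parse import urljoin

BASE = "https://www.domainname.co.uk/"

# Hypothetical page filenames for illustration.
pages = ["car-insurance.html", "home-insurance.html"]

# Flat structure: every page lives in the site root.
flat = [urljoin(BASE, page) for page in pages]

# Nested structure: pages grouped under a keyword-rich parent directory.
nested = [urljoin(BASE, f"insurance/{page}") for page in pages]

for flat_url, nested_url in zip(flat, nested):
    print(flat_url, "vs", nested_url)
```

Either form resolves to a perfectly crawlable URL; on a 6-page site the difference is organisational, not a ranking lever.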
Related Questions
-
Z-indexed content
I have some content on a page that I am not hiding with any CSS hiding techniques, but I am using an image with a higher z-index to prevent the text from being seen until a user clicks a link that scrolls the content into view. Are there any negative repercussions for doing this in regards to SEO?
Technical SEO | cokergroup
-
No Java, No Content..?
Hello Mozers! 🙂 I have a question for you: I am working on a site, and while doing an audit I disabled JavaScript via the Web Developer plugin for Chrome. The result is that instead of seeing the page content, I see the typical "loading circle" but nothing else. I imagine this is not a good thing, but what does it imply technically from a crawler's perspective? Thanks
Technical SEO | Pipistrella
-
Duplicate Content Issues - Where to start???
Dear All, I have recently joined a new company, Just Go Holidays - www.justgoholidays.com. I used the SEO Moz tools (yesterday) to review the site and see that I have lots of duplicate content/pages and also lots of duplicate titles, all of which I am looking to deal with. Many of the duplicate pages appear to stem from additional parameters that are used on our site to refine and/or track various marketing campaigns. I have therefore been into Google Webmaster Tools and defined each of these parameters. I have also built a new XML sitemap and submitted that too. It looks as if we have two versions of the site, one at www.justgoholidays.com and the other without the www, and there are no redirects from the latter to the former. Do I need to use 301s here, or is it OK to use canonicalisation instead? Any thoughts on an action plan to address these issues in the right order and the right way would be very gratefully received, as I am feeling a little overwhelmed at the moment. (We also use a CMS that is not particularly friendly, and I think I will have to go directly to the developers to make many of the required changes, which is sure to cost - so I really don't want to get this wrong.) All the best, Matt
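One common way to tackle parameter-driven duplicates is to normalise URLs before emitting canonical tags: strip the known tracking parameters and force a single host. A minimal sketch (the parameter names and example URL here are hypothetical illustrations, not Just Go Holidays' actual setup):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical tracking/refinement parameters that spawn duplicate URLs.
TRACKING_PARAMS = {"utm_source", "utm_medium", "campaign", "sort"}

def canonical_url(url: str) -> str:
    """Strip known tracking parameters and force the www host."""
    parts = urlsplit(url)
    host = parts.netloc if parts.netloc.startswith("www.") else "www." + parts.netloc
    # Keep only the query parameters that actually change the content.
    query = [(k, v) for k, v in parse_qsl(parts.query) if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, host, parts.path, urlencode(query), ""))

print(canonical_url("http://justgoholidays.com/tours?utm_source=email&page=2"))
```

For the www vs non-www split itself, a server-side 301 from the bare domain to the www version is the usual fix; a canonical tag alone consolidates signals but does not redirect visitors.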
Technical SEO | MattByrne
-
Is a shorter subdomain better?
For example, consider the two subdomains below: learn.bluelinkerp.com learnmore.bluelinkerp.com Would there likely be an appreciable difference between the two based on length alone? Also, would it be better to use a term that relates to our product in general such as "software.bluelinkerp.com"? Thanks!
Technical SEO | BlueLinkERP
-
Lots of duplicate content warnings
I have a site whose report says I have 2,500 warnings. It is a real estate website, and of course we use feeds. It says I have a lot of duplicate content. One culprit is a page called "Request an Appointment": there is a dedicated URL for it under each listing, and there are 800 listings on my site. How could I solve this so these pages don't show up as duplicate content, given that I use the same "Request an Appointment" verbiage on each of them? I guess my developer, who built it in PHP, created a dedicated URL for each. Any help would be greatly appreciated.
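If each listing really needs its own "Request an Appointment" URL, one common fix is to have every one of those near-duplicate pages declare a single canonical URL so search engines consolidate them. A sketch with hypothetical listing IDs and a hypothetical domain:

```python
# Hypothetical listing IDs; each has its own /appointment/<id> URL.
listing_ids = [101, 102, 103]

# Single page the 800 near-duplicates should consolidate to.
CANONICAL = "https://www.example-realty.com/request-an-appointment"

def canonical_tag(listing_id: int) -> str:
    # Every per-listing appointment page emits the same canonical tag,
    # so the duplicates collapse into one indexed URL.
    return f'<link rel="canonical" href="{CANONICAL}">'

for lid in listing_ids:
    print(f"/appointment/{lid}: {canonical_tag(lid)}")
```

Marking the per-listing appointment pages `noindex` is the other common option, if they carry no search value of their own.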
Technical SEO | SeaC
-
Why do some serps have a + map symbol?
Hi from sunny but freezing Wetherby, UK. I've noticed that when you enter "York solicitors", some listings have a "+ Show map" symbol. Here is a screenshot to illustrate: http://i216.photobucket.com/albums/cc53/zymurgy_bucket/plus-map-serps-langleyscopy.jpg I'd like to know, please, what I would have to do to emulate this. Thanks in advance, David
Technical SEO | Nightwing
-
How does Google determine freshness of content?
With the changes in the Google algorithm emphasizing freshness of content, I was wondering how Google determines freshness and what constitutes new content. For instance, if I write a major update to a story I published last July, does the amended story count as fresh? Is there anything I can do, in addition to publishing brand-new content, to make sure Google sees all my new content?
Technical SEO | KnutDSvendsen
-
Content Delivery Network
Anyone have a good reference for implementing a content delivery network? Are there any SEO pitfalls with using a CDN? (Brief research seems to indicate no problems.) I seem to recall that SEOmoz was using Amazon Web Services (AWS) for its CDN. Is that still the case? All CDN and AWS experiences, advice, and references welcomed!
Technical SEO | Gyi