URL Structure & SEO - Should we be using sub-folders?
-
Hi all,
As part of our content marketing efforts we have run a number of initiatives in the past and created dedicated pages on the website for them; these pages are also where the links earned by each project point.
However, the URL structure isn't actually a reflection of where the pages sit on the site. Unfortunately I'm unable to provide a URL for reasons I won't bore you with, but here's an example:
We recently ran a competition that was very successful in generating links. The URL for this is www.domain.co.uk/competition.
However, the page actually sits within the About Us section - which is where all of our news and content marketing pages go - and uses a URL override.
How much of an issue is this with regard to a) our SEO in general, and b) ensuring we receive as much equity as possible from the links we earn? A brief explanation of what URL overrides actually are would also be useful! (We have a digital marketing agency who handle most of our SEO.)
Thanks in advance guys!
John
-
Hi Jimmy,
Thanks very much for the reply!
John
-
Hi Tom,
Yes that's exactly right. The link from the About Us section goes directly to www.example.com/competition - it doesn't quickly redirect.
Thanks for putting my mind at ease!
John
-
Hi John
Just to make sure I'm reading this correctly.
You have your homepage. You can then click to the "about us" page. And from there, you can click to the "competition" page.
However, instead of it looking like: www.example.com/about-us/competition, it looks like: www.example.com/competition
Is that correct?
If it is, then I don't think there is anything to worry about. Here's why:
Think of your URL as a largely cosmetic thing. Your URL structure does not have to reflect how a search engine or a user reaches that page. What matters most for site structure and SEO is how a search engine crawler (and a user) reaches the page - i.e., how many clicks away from your root domain it is.
As a rule of thumb, you should try to keep all of your important pages no more than 3 clicks away from your homepage - and ideally as few as possible. How the URL is presented when the user/search engine reaches that point is not relevant**
**This isn't entirely the case - there is some correlation between including your keyword in the URL and higher rankings, albeit a very slight one. For example, some sites link to their main pages from their homepage with a URL like http://www.example.com/keyword/product-a. That doesn't mean the search engine has to reach the page by going through that subfolder (which may not even exist), but the display URL contains the keyword, which might be slightly better for SEO (although if this has any effect at all, it is very, very slim). Conversely, if your main page's URL were http://www.example.com/keyword but it took four clicks to reach it, that would not be good for SEO, because the page is so far from the homepage.
The main guidelines here are to:
- Keep your main pages as close to the homepage as possible (via internal links)
- Try to include the keyword in your URL if possible (or at least use friendly, readable URLs)
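To illustrate the click-depth point, here's a minimal Python sketch (with made-up page names) that computes each page's click depth from the homepage by breadth-first search over internal links - note that it never looks at URL paths at all:

```python
from collections import deque

def click_depth(links, start="home"):
    """Breadth-first search over internal links: the depth of each
    page is the minimum number of clicks needed to reach it from
    the homepage, regardless of what its URL path looks like."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site: "competition" sits under About Us in the CMS,
# but only its click depth matters - not its URL path.
site = {
    "home":        ["about-us", "products"],
    "about-us":    ["competition", "news"],
    "competition": [],
    "news":        [],
    "products":    [],
}
print(click_depth(site))  # e.g. competition is 2 clicks from home
```

On this hypothetical graph the competition page is two clicks from the homepage, which is well within the three-click rule of thumb, whatever its display URL says.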
Now, on the "about us" page, the link that goes to the "competition" page - does it go directly to www.example.com/competition, or is it linked to www.example.com/about-us/competition and then quickly redirects?
If it's the second scenario, that would be an unnecessary redirect and might involve some of the link equity being diluted. I'd get the agency to look at linking to the www.example.com/competition page directly if that is the case (although I doubt it will be).
I hope that helps to explain some things!
-
Hi John,
Your URL override sounds like a 301 redirect. You can test this by running the URL as it actually exists in your system - e.g. www.domain.co.uk/about-us/pages/competition - through a URL redirect checker.
If it reports as '301' then that is good (Moz has a section on redirects in their learn section).
If it reports as another type of redirect (e.g. a 302), that would not be as efficient as a 301.
Pages that come back as '200' are served directly (this is the best case). A '301' means 'Moved Permanently'; it also prevents the page from being indexable via both URLs, which removes any duplicate content possibilities.
301 redirects are 'meant' to carry all of the equity across, so nothing should be lost from an SEO perspective. In terms of SEO, having the simplest URLs possible is always a bonus: not only does Google prefer shorter URLs, but they also encourage direct traffic and visitor referrals (you're more likely to tell a friend to go to a particular URL if it is easy enough to type).
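To tie the status codes together, here's a rough Python sketch of how a redirect checker's output could be interpreted. The helper and its wording are made up for illustration, but the status-code meanings follow standard HTTP (the same codes you'd see in `requests.get(url).history` plus the final response):

```python
def describe_chain(codes):
    """Interpret the sequence of HTTP status codes seen when
    requesting a URL: any redirect hops first, final response last."""
    final = codes[-1]
    hops = codes[:-1]
    if not hops and final == 200:
        return "direct link (200) - no redirect, nothing to fix"
    if hops == [301] and final == 200:
        return "single 301 - permanent redirect, equity should carry across"
    if 302 in hops or 307 in hops:
        return "temporary redirect in chain - consider switching to a 301"
    if len(hops) > 1:
        return "redirect chain - collapse to a single 301 if possible"
    return f"unexpected response ({final}) - worth investigating"

print(describe_chain([200]))        # page served directly
print(describe_chain([301, 200]))   # one permanent redirect
print(describe_chain([302, 200]))   # temporary redirect - less ideal
```

A URL override that resolves as a single 301 would fall into the second case, which is the benign outcome Jimmy describes.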
Kind Regards
Jimmy