The impact of using directories without target keywords on our rankings
-
Hello all,
I have a question regarding a website I am working on. I've read a lot of Q&As but couldn't really find a definitive answer.
For one of our new websites we are thinking about the site structure and the corresponding URL structure. Basically, we have a main product (and a few main keywords) that should drive the most traffic to our website, and for which we want to optimize our homepage.
Besides those main keywords, we have an enormous base of long-tail keywords from which we would like to generate traffic. This means we want to create a lot of specific, optimized pages.
My main question is the following:
We are thinking of two options:
- Option 1: www.example.com/example-keyword-one
- Option 2: www.example.com/directory/example-keyword-one
With option 1 we will link directly from our homepage to the most important pages (which represent our most important keywords). All the pages with long-tail content will be linked from another section of our website, one click away from the homepage (specifically a /solutions page linked from the footer). These long-tail pages will use the structure www.example.com/example-keyword-one, so the URLs will not contain the /solutions directory.
With option 2 we will use more subdirectories in our URLs. Specifically, all the long-tail content would use URLs like this: www.example.com/solutions/example-keyword-one
The directories we want to use wouldn't really add value in terms of SEO, since they don't represent important keywords. So what is the best way to go? Option 1: straightforward, short URLs that don't reflect the linking structure of our website but contain only important keywords. Or option 2: more directories in our URLs that reflect the linking structure of our website but include directory names that aren't important keywords.
- Would the keyword ‘solutions’ in the directory (which doesn’t really relate to the content on the page) have a negative impact on our rankings for that URL?
-
Hi Rob,
Thanks for the helpful answer! I did a lot of research and also concluded that both options can work. I just haven't found any supporting case studies that clearly show which of the two alternatives works best. So if anyone knows a good article about URL structure that relates to my question specifically, that would be very welcome!
Thanks!
Regards,
Jorg
-
It all depends on what you want (or are going to do):
1. Short URLs usually work best with regard to indexing and product correlation (if a URL is too long, characters get cut off by Google when indexing). Keeping URLs short also helps Google index the full URL and capture its full value, using your keywords to reinforce the URL's relevance.
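To illustrate, here's a minimal Python sketch of one way to build short, keyword-focused slugs. The length budget is a hypothetical in-house figure, not an official Google limit:

```python
import re

MAX_SLUG_LENGTH = 75  # hypothetical in-house budget, not an official Google limit

def make_slug(phrase: str) -> str:
    """Build a short, keyword-focused slug from a keyword phrase."""
    # Lowercase, collapse anything that isn't a-z or 0-9 into single hyphens
    slug = re.sub(r"[^a-z0-9]+", "-", phrase.lower()).strip("-")
    # Trim to budget without leaving a dangling hyphen
    return slug[:MAX_SLUG_LENGTH].rstrip("-")

print(make_slug("Example Keyword One!"))  # -> example-keyword-one
```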
-
Also - having these URLs linked to from the main page will help flow 'link juice' through the site, provided you keep the number of links on the homepage to a minimum and mix in links that are nofollow. Links beyond roughly 100 on a page often won't be crawled by Googlebot.
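If you want to sanity-check a homepage against that rough 100-link figure, here's a quick sketch (assuming the requests and beautifulsoup4 packages; the threshold is a rule of thumb, not a documented limit):

```python
# pip install requests beautifulsoup4  (both assumed available)
import requests
from bs4 import BeautifulSoup

LINK_BUDGET = 100  # rough rule of thumb mentioned above, not a documented limit

def audit_homepage_links(url: str) -> None:
    """Count total and followed <a href> links on a page."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    links = soup.find_all("a", href=True)
    # rel is a multi-valued attribute in BeautifulSoup, so it comes back as a list
    followed = [a for a in links if "nofollow" not in (a.get("rel") or [])]
    print(f"{len(links)} total links, {len(followed)} followed")
    if len(followed) > LINK_BUDGET:
        print("Homepage exceeds the link budget; consider trimming or nofollow-ing some.")

audit_homepage_links("https://www.example.com/")
```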
-
Also - if your URLs are query strings, make sure you have 301s set up for URLs that include any kind of string (?=question123456 or something along those lines), redirecting the string version to www.domains.com/keyword-rich-content. This might be a non-issue for the site/domain you are working on, or it might be a step that needs to be included in the site's overhaul project work.
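As a sketch of that kind of 301, here's a minimal example using Flask (the framework choice, route, and ID-to-slug lookup are all assumptions for illustration; in practice this often lives in the web server's rewrite rules instead):

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical lookup from legacy query-string IDs to keyword-rich slugs
SLUGS = {"123456": "example-keyword-one"}

@app.route("/question")
def legacy_question():
    """301-redirect e.g. /question?id=123456 to /example-keyword-one."""
    slug = SLUGS.get(request.args.get("id", ""))
    if slug:
        return redirect(f"/{slug}", code=301)  # permanent: passes link equity
    return redirect("/", code=302)  # unknown ID: temporary fallback to home

if __name__ == "__main__":
    app.run()
```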
2. Longer URLs (like adding directories or sub-folders) can be good too, depending on the product breakdown in your site architecture, though it might not be needed. If you have hundreds of thousands of products, directories will most likely be needed to sort the data and organize the database working alongside the CMS. In that case you would want to go this route, rather than having an unorganized root directory with thousands of pages in it (even if dynamically generated).
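As a toy sketch of that option 2 grouping (the catalog and category names here are hypothetical):

```python
from collections import defaultdict

# Hypothetical catalog pulled from a CMS: (directory, slug) pairs
PAGES = [
    ("solutions", "example-keyword-one"),
    ("solutions", "example-keyword-two"),
    ("guides", "another-long-tail-phrase"),
]

def build_urls(domain: str, pages) -> dict:
    """Group page URLs under their directory instead of a flat root."""
    tree = defaultdict(list)
    for directory, slug in pages:
        tree[directory].append(f"https://{domain}/{directory}/{slug}")
    return tree

for directory, urls in build_urls("www.example.com", PAGES).items():
    print(directory, "->", len(urls), "pages")
```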
Each option works in its own way, each with supporting documentation and methods. Just something to consider as you steer the SEO seas.
Cheers!