What are the advantages and disadvantages of having multiple folders in a URL?
-
Example:
http://www.domain.com.ph/property-for-sale/city/area/
(3 folders)
Would it be better to just use http://www.domain.com.ph/property-for-sale-area-city/ instead (all pages under one folder)?
Thanks in advance!
-
Thanks Vadim!
-
Similar questions asked before with great responses:
http://moz.com/community/q/url-length
http://moz.com/community/q/maximum-length-of-the-url-for-seo-75-115
Also, too many pages in one directory using the http://www.domain.com.ph/property-for-sale-area-city/ method could lead to server issues, depending on your type of server. But this only applies if you are creating thousands of URLs in one directory, so it might not be relevant here.
Hope this helps
-
As the length of a URL is a relevant ranking factor (shorter is better), you just have to check which one is the shorter one... see this link for current ranking factors (0.16 Spearman correlation).
This statement does not cover technical issues, of course...
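For what it's worth, the two structures from the question can be compared directly. Here is a minimal Python sketch (standard library only) that reports character length and folder depth for each candidate URL:

```python
from urllib.parse import urlparse

def url_stats(url):
    """Return (total character length, folder depth) for a URL."""
    path = urlparse(url).path.strip("/")
    depth = len(path.split("/")) if path else 0
    return len(url), depth

nested = "http://www.domain.com.ph/property-for-sale/city/area/"
flat = "http://www.domain.com.ph/property-for-sale-area-city/"

for url in (nested, flat):
    length, depth = url_stats(url)
    print(f"{url} -> length={length}, depth={depth}")
```

Note that in this particular example both options come out exactly the same length, since the hyphens simply replace the slashes; the measurable difference is folder depth, not URL length.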
Related Questions
-
Anything wrong with multiple meta descriptions and multiple title tags? We have 2 by mistake
Hi, as stated in the title, we have 2 meta descriptions and 2 title tags. Will this hurt? How does Google handle this? Thanks
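As a quick sanity check before worrying about how Google resolves the duplicates, the head tags can be counted with a short script. A minimal sketch using Python's built-in html.parser (the sample HTML below is invented for illustration):

```python
from html.parser import HTMLParser

class HeadTagCounter(HTMLParser):
    """Count <title> tags and <meta name="description"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.titles = 0
        self.meta_descriptions = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.titles += 1
        elif tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.meta_descriptions += 1

sample_html = """<head>
<title>First title</title>
<meta name="description" content="First description">
<title>Second title</title>
<meta name="description" content="Second description">
</head>"""

counter = HeadTagCounter()
counter.feed(sample_html)
print(counter.titles, counter.meta_descriptions)  # both counts are 2 here
```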
Algorithm Updates | vtmoz
-
Adding non-important folders to disallow in robots.txt file
Hi all, If we have many non-important folders like /category/ in a blog, these will multiply the links. They are strictly for users, who access them very rarely, not for bots. Can we add such folders to the disallow list in robots.txt to stop link juice passing through them, so that internal linking is minimised to an extent? Can we add such paths or pages to the disallow list? Will this work purely on a technical level, or is there any penalty? Thanks, Satish
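If you do decide to disallow such folders, the rules can be verified before deployment. A minimal sketch using Python's standard urllib.robotparser, with hypothetical /category/ and /tag/ paths and a made-up blog.example.com host. Note that robots.txt only blocks crawling; it does not by itself remove pages from the index or stop link equity flowing to links that point at them:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt disallowing two low-value folders.
robots_txt = """User-agent: *
Disallow: /category/
Disallow: /tag/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "http://blog.example.com/category/seo/"))  # False
print(parser.can_fetch("*", "http://blog.example.com/property-for-sale/"))  # True
```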
Algorithm Updates | vtmoz
-
Link reclamation and many 301 redirect to one URL
We have many incoming links to non-existent pages of a sub-domain, which we are planning to take down or redirect to a sub-directory. But we are not ready to lose PageRank or link juice, as many links to this sub-domain are referred from different external sources. It's obviously going to be a double redirect. What is the best thing we can do to reclaim these links without losing link juice or PageRank? Can we redirect all these links to the same sub-domain and then redirect that sub-domain to the sub-directory? Will this double redirect work? Or can we redirect all these links to the same sub-domain and ask visitors to visit the sub-directory, i.e. a manual redirect? How fair is it to redirect visitors manually? Any other options? Thanks, Satish
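One common way to avoid the double redirect is to compute the final sub-directory URL for every old sub-domain URL and issue a single 301 straight to it, rather than chaining hops. A minimal sketch of that mapping, with hypothetical hostnames (old.example.com moving under www.example.com/old):

```python
from urllib.parse import urlparse

# Hypothetical hosts: the old sub-domain and the new sub-directory base.
OLD_HOST = "old.example.com"
NEW_BASE = "https://www.example.com/old"

def redirect_target(url):
    """Return the final one-hop destination for an old sub-domain URL,
    or None if the URL is not on the old sub-domain."""
    parts = urlparse(url)
    if parts.netloc != OLD_HOST:
        return None  # not ours; no redirect needed
    return NEW_BASE + parts.path

print(redirect_target("http://old.example.com/listing/123"))
# -> https://www.example.com/old/listing/123
```

The server-side 301 rules would then point each old URL directly at its computed target, so external links pass through a single redirect instead of two.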
Algorithm Updates | vtmoz
-
Duplicate pages in language versions, noindex in sitemap and canonical URLs in sitemap?
Hi SEO experts! We are currently in the midst of reducing our number of duplicate titles in order to optimize our SEO efforts. A lot of the "duplicate titles" come from having several language versions of our site. Therefore, I am wondering:
1. If we start using "" to make Google (and others) aware of alternative language versions of a given site/URL, how big a problem will "duplicate titles" then be across our domains/site versions?
2. Is it a problem that our sitemap includes (many) URLs to pages that are marked noindex?
3. Are there any problems with having a sitemap that includes pages whose canonical URLs point to other pages?
Thanks in advance!
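The stripped tag in question 1 appears to be an hreflang annotation. For reference, alternate language versions can also be declared in the sitemap itself; below is a minimal sketch with hypothetical example.com URLs, following Google's sitemap hreflang format, in which each URL entry lists all of its language alternates, including itself:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/en/trading/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/trading/"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://example.com/de/trading/"/>
  </url>
  <url>
    <loc>https://example.com/de/trading/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/trading/"/>
    <xhtml:link rel="alternate" hreflang="de" href="https://example.com/de/trading/"/>
  </url>
</urlset>
```

As a general rule (regarding questions 2 and 3), a sitemap should list only the canonical, indexable version of each URL; listing noindexed or canonicalized-away pages sends Google mixed signals, though it is not a penalty.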
Algorithm Updates | TradingFloor.com
-
What would be the right website hosting solution for a global brand with multiple TLDs?
Hello everyone, we are preparing a global web strategy for our customer, who wants to focus on many local markets. They are present with their products in over 60 countries on all continents, and they can easily cover 7 languages. They also own the TLDs in each main country. Our priority is to be found not only in Google but also in all the major local search engines that are popular in each country. Our main concerns with this strategy are: How do we host the website locally for each country, so local search engines recognize the website as a local website? And is that even important? Is cloud hosting the right solution for this? We would have different server locations for each TLD/language, so surrounding countries would at least get a fast connection and download rate if the server is located nearby. If yes, can you recommend any companies doing cloud hosting? Is it also wise to additionally redirect the user to the local server based on his/her location? Is there a risk of duplicate content, or of being recognized as a link-spamming site (for having links between the different domains of the same site; the content will be in different languages, though), just by using this solution? We would, of course, keep the content separated by domain to avoid duplicate content. Any comments and ideas would be highly appreciated. Thanks, Wojtek
Algorithm Updates | webeeline
-
Blog in Sub Domain or Sub Folder. Pros and Cons
My client wants to show more links on Google when someone searches for his company name. I'm wondering what's the best way to set up a blog to achieve this. Should I create it as blog.website.com.au or website.com.au/blog? What are the pros and cons of each? Thanks heaps guys.
Algorithm Updates | Uds
-
Multiple Listings in Results fading Local SEO
Lately I am noticing that multiple listings in results seem to be fading away. An example is one domain being listed twice for a search phrase: the home page, say, and an internal page. Is anyone else seeing this? Is it safe to say Google wants to see 10+ individual domains per results page?
Algorithm Updates | bozzie311
-
The related: query for one of my urls makes no sense
I'm trying to compete on keyword X. Currently, I'm on the first page, in 7th-8th position. If, for each of the URLs listed on the first page for that keyword, I search for related:[url], I get similar results for all of them except mine. Mine shows inconsistent results, none of which are related to the same topic as the other 9 in the top 10. Looking at them, the only hypothesis I can formulate is that, somehow, Google is linking the URL to its paid banners in big media outlets. However, such banners go through an ad server and/or are declared nofollow. Is there any obvious reason that could be causing this? I wonder if we are on page 1 even though we're considered pretty much 'off-topic' for the keyword.
Algorithm Updates | jleanv24