Best URL structure
-
I am making a new site for a company that services many cities.
I was thinking of a URL structure like this:
website.com/keyword1-keyword2-keyword3/cityname1-cityname2-cityname3-cityname4-cityname5
Would this be the best approach to optimize the site for the keyword plus five different cities, as long as I keep the total URL under the SEOmoz-recommended 115 characters?
Or would it be better to build a separate page for each city, rewording the main service descriptions to avoid duplicate content?
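A minimal sketch of the character-limit check described above; the domain, service keyword, and city name are hypothetical placeholders, not real pages:

```python
def slugify(text):
    """Lowercase the text and join its words with hyphens."""
    return "-".join(text.lower().split())

def city_page_url(domain, keyword, city, max_len=115):
    """Build a per-city URL and check it against the 115-character guideline."""
    url = f"{domain}/{slugify(keyword)}/{slugify(city)}/"
    if len(url) > max_len:
        raise ValueError(f"URL is {len(url)} characters, over the {max_len} limit")
    return url

print(city_page_url("website.com", "shiny blue widget repair", "Springfield"))
# website.com/shiny-blue-widget-repair/springfield/
```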
-
Joseph
I'd make each page totally unique to the city. Don't focus on whether or not you'll get a penalty; make the pages so useful for the user that a sentence or two of similarity won't matter. Useful, to me, means information that's entirely specific to what the page is about. A page with some testimonials, an in-depth case study, photos, and useful information about that location should really deliver what people need to know about the services there!
-Dan
-
Hello Handcrafter,
Thank you for your reply.
We have one long-tail keyword per page, but we service about 15 cities from only 3 offices.
My dilemma is that we need to attract business from multiple cities.
I'm just trying to get advice on the best way to do this:
have one main service page and list all the cities served in the description and content,
or make a page for each city with case studies and testimonials unique to that city but the same basic description of the service?
In your experience, how much similar content would make it look like duplicate content?
-
Hello dan,
Thank you.
We do have many testimonials and case studies for each city, which would be unique content for that city.
So we could describe the service in general, which would have to be basically the same, but add 2 or 3 testimonials and 5 or 6 case studies for that service from that city's customers.
Is that shared description going to trigger a duplicate content issue?
Would that be based on the percentage of total content, or would having one paragraph that's basically the same be enough to get penalized?
-
Thank you for all the input, some great help for sure.
After reading some answers, I was thinking to myself, what was I thinking...lol
I should have been a little clearer about the keywords.
Keywords 1, 2, and 3 are one long-tail keyword, not three separate keywords.
Example: shiny blue widget.
My client has a company that travels across a large area to do service calls for this shiny blue widget.
So we want the company to rank in multiple cities. There are 3 offices servicing 15 cities.
@Matt - I totally agree about the URL.
-
Right on the money Matt, thank you!
So now the challenge is somehow creating separate pages about each location that are not duplicates and are actually useful.
Not sure what the niche is, but let's say you're targeting a city where the business is not actually located. How could you make a page about that? You could make a page all about that town, with case studies of clients from that town and testimonials from customers there. The company could hold or sponsor an event in that town, or publish a piece like "The State of [Company Type] in [Town]". Perhaps that will get some ideas going.
-Dan
-
Hi Joseph- What will people be searching for? Does each keyword have its own unique focus? It seems you may want to create unique, stand-alone pages covering each keyword or group of keywords. Within that content you might list locations or services unique to each city. If the services are all the same, could you make one unique page describing those services? That way you could simply list the cities and use text links to the services page. Good luck- Handcrafter
-
If you saw that address in a search result, would you actually click it?
I'd say cramming that many keywords into a URL sends a bad signal anyway. More to the point, it's going to look like someone jumped into Marty McFly's DeLorean and came back with a boot full of spam from the late 1990s.
Ranking is NOT the most important thing (even if this helped, which I doubt). If the listing looks poor quality, that ranking will bring less traffic, and less traffic means less money.
I would much rather see a short URL without the keywords, and use the keywords in the title. Better still, break the content up into every page that makes logical sense, with an appropriate URL and matching content for each. There is no "trying" to avoid duplicate content, though: you have to avoid it.
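The "one page per logical unit" idea above can be sketched like this: one short main service page plus one page per city, each with its own clean URL and its own unique content. The domain, service, and city names here are hypothetical placeholders:

```python
def slugify(text):
    """Lowercase the text and join its words with hyphens."""
    return "-".join(text.lower().split())

def site_structure(domain, service, cities):
    """Map each page URL to a note on the unique content it should carry."""
    pages = {f"{domain}/{slugify(service)}/": "main service description"}
    for city in cities:
        pages[f"{domain}/{slugify(service)}/{slugify(city)}/"] = (
            f"case studies and testimonials from {city}"
        )
    return pages

for url in site_structure("website.com", "shiny blue widget repair",
                          ["Springfield", "Shelbyville", "Capital City"]):
    print(url)
```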