URL Structure Question
-
I'm starting to work with a new site whose domain name was contrived to help it with a certain kind of long-tail search.
Just for the sake of a fictional example, let's call it WhatAreTheBestRestaurantsIn.com. The idea is that people might search for "what are the best restaurants in seattle" and over time the site would make some organic search progress. Again, that domain is a fictional example, but the real one is just like it and is designed to cover cities in all states.
Here's the question: if you were targeting searches like the above and had that domain to work with, would you go with...
whatarethebestrestaurantsin.com/seattle-washington
whatarethebestrestaurantsin.com/washington/seattle
whatarethebestrestaurantsin.com/wa/seattle
whatarethebestrestaurantsin.com/what-are-the-best-restaurants-in-seattle-wa
... or what and why?
Separate question (I still need the above answered): would you rather go with a super-short (four-letter) but meaningless domain name and stick the long-tail part after that?
I doubt I can win the argument for a new domain name, so I still need the first question answered.
The good news is it's pretty good content.
Thanks... Darcy
-
Take the new four-letter domain name you can market and brand. Redirect the old domain as best and as logically as you can to the specific pages on the four-letter domain.
Four letters are much easier to market: usernames on Twitter, Facebook, etc. And you can make xyxy Seattle, xyxy New York as branding or social handles for local markets and such.
#marketing #branding #worlddomination
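Richard's "redirect the old domain to the specific pages on the new one" could be sketched as a simple page-to-page mapping. This is a hypothetical illustration only: the new domain xyxy.com, the path map, and the `redirect_target` helper are all made up for the example, not anyone's actual setup.

```python
# Hypothetical sketch of mapping old long-tail URLs to pages on a new,
# short branded domain. All names here are invented for illustration.

OLD_DOMAIN = "whatarethebestrestaurantsin.com"
NEW_DOMAIN = "xyxy.com"  # placeholder four-letter brand

# Each known page on the old domain gets a specific target, so the
# 301s are page-to-page rather than one blanket domain redirect.
CITY_MAP = {
    "/seattle-washington": "/seattle/",
    "/portland-oregon": "/portland/",
}

def redirect_target(old_path: str) -> str:
    """Return the URL on the new domain for a page on the old one.

    Unknown pages fall back to the new home page.
    """
    return f"https://{NEW_DOMAIN}{CITY_MAP.get(old_path, '/')}"
```

The point of the per-page map is that link equity from a well-linked old city page lands on the matching new city page instead of being diluted onto the home page.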
-
Thanks for the answers Richard, Tobey & Lesley. Good points all.
Another option is to repurpose a domain name / one-page site (it used to be 1,000 pages) that has been up for a long time and gained a bunch of authority/links for a totally unrelated subject, then lost its old content in a tragic developer experience, and could be used for this project. Currently it's a one-page placeholder. That old domain name is equally meaningless for the new subject matter and could be anything.
So, if the choice were a new four-letter meaningless .com or an old meaningless 13-letter domain name with links for its old purpose and lots of old pages gone, which would you prefer? Is it hard to get Google to see an old domain name as being about a new subject, any harder than establishing relevance through content etc. for a brand-new domain name?
Thanks... Darcy
-
If it's a new domain then I definitely wouldn't go with anything like WhatAreTheBestRestaurantsIn.com. I would rather go with besteat.com or bestin.com, and I could rank those domains much more easily too. Don't start with a long, spammy domain; build a brand instead. Keywords in a new domain help very little these days.
Most of the words in your domain examples are 'stop words' and shouldn't even be in domain names (words like 'are', 'best', 'in'). Even if you had categories for states, they still don't belong in the final URL. For example, whatarethebestrestaurantsin.com/wa/seattle should still resolve to whatarethebestrestaurantsin.com/seattle. Although you could still visit whatarethebestrestaurantsin.com/wa/, when you click on Seattle the URL should rewrite to whatarethebestrestaurantsin.com/seattle.
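The rewrite described above (the /wa/ state index stays browsable, but final city URLs drop the state segment) could be sketched like this. It's a minimal illustration under made-up assumptions: the state list, the `canonical_path` name, and the path patterns are invented for the example.

```python
import re

# Illustrative subset of two-letter state codes used as browse categories.
STATE_CODES = {"wa", "or", "ca", "ny"}

def canonical_path(path: str) -> str:
    """Rewrite a state-prefixed city path like "/wa/seattle" to the
    short canonical form "/seattle", while leaving the browsable state
    index "/wa/" and already-short paths untouched."""
    m = re.fullmatch(r"/([a-z]{2})/([a-z-]+)/?", path)
    if m and m.group(1) in STATE_CODES:
        return f"/{m.group(2)}"
    return path
```

In practice the same rule would live in the web server or router as a 301, so both forms of the URL never get indexed as separate pages.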
For longevity, quality, branding, trust, and non-spammy purposes, I would build the site using clean short URLs like the made-up examples below. EMDs are all but dead, especially long ones like whatarethebestrestaurantsin.com.
tastyeat.com/seattle/
bestin.com/seattle/
tastytown.com/seattle/
dinein.com/seattle/
-
Personally, I would go for something much shorter. Long domain names can appear spammy, and I believe domain length is one of the metrics Moz uses in its spam score. The other problem with a long domain name is that pages and posts on your site may have titles that are much too long to fit in a search engine's results, although you may be able to tweak this. You may well be better off with a very short domain name, so that as new keywords you want to target come through, you can go after them effectively without ending up with too long a URL.
-
I would prefer this one: whatarethebestrestaurantsin.com/wa/seattle. It keeps the state ISO code in the URL for when you grow large enough that you start running into cities with the same name in different states. Plus, people are lazy; they abbreviate states, and I think that helps that URL structure as well.
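The duplicate-city-name concern above can be made concrete with a small slug sketch. This is a hypothetical illustration (the `city_url` helper and its collision policy are invented, not anyone's actual implementation): keep the short URL until two states claim the same city name, then fall back to the state-prefixed form.

```python
def city_url(city: str, state: str, taken: set) -> str:
    """Prefer the short "/portland/" form, but fall back to a
    state-prefixed "/me/portland/" once another state has already
    claimed the bare city slug. `taken` tracks slugs assigned so far."""
    slug = city.lower().replace(" ", "-")
    if slug not in taken:
        taken.add(slug)
        return f"/{slug}/"
    return f"/{state.lower()}/{slug}/"
```

Keeping the state segment everywhere (as this answer prefers) avoids the policy question entirely, at the cost of one extra path segment on every city page.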