Question about creating friendly URLs
-
I am working on creating new SEO friendly URLs for my company website.
The products are the items with the highest search volume, and each is very geo-specific.
There is not a high search volume for the geo-location associated with each product, but the searches we do get convert well. Do you think it is preferable to leave the location out of the URL or include it?
-
I agree with CleverPhD.
Keywords in URLs are extremely helpful both in terms of generating traffic and tracking your reporting in Analytics.
-
We do this currently and include the geo-specific information in the URL. The benefit goes beyond a ranking boost: it also helps with CTR. When the title, description, and URL all contain the keywords a user searched for, it reinforces that you have the information they need. It has worked really well for us, and it also helps with organizing the site and producing analytics reports, since we can parse the location straight off the URL.
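As a rough illustration of the "parse it off the URL" reporting idea above, here is a minimal sketch. The /products/{location}/{product} path layout and the example URLs are hypothetical, not from the poster's site:

```python
from urllib.parse import urlparse

def location_from_url(url):
    """Extract the geo segment from a product URL.

    Assumes a hypothetical /products/{location}/{product} path layout;
    returns None for URLs that don't match it.
    """
    parts = [p for p in urlparse(url).path.split("/") if p]
    if len(parts) >= 3 and parts[0] == "products":
        return parts[1]
    return None

# Group landing-page visits by location for a simple report.
urls = [
    "https://example.com/products/miami/beach-cruiser",
    "https://example.com/products/denver/mountain-bike",
    "https://example.com/about",
]
by_location = {}
for u in urls:
    loc = location_from_url(u)
    if loc:
        by_location[loc] = by_location.get(loc, 0) + 1
print(by_location)  # {'miami': 1, 'denver': 1}
```

In practice the same split would be applied to the landing-page dimension exported from Analytics rather than a hard-coded list.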
-
My experience is that having the main keyword in the URL is definitely a ranking factor.
If the location keyword phrases convert well and are easier to rank for, there is a definite benefit in including them in the URL, as well as optimising your product pages accordingly.
-
Yep - geo-location info is in both locations
-
Hi Sara
If I understand you correctly, your products are geo-specific. If that IS the case, then including the location in the URL is going to be a benefit right now, as URL keywords still carry weight, at least in Google.
Once the search engines figure out how to rank webpages properly without taking the URL into account, that little trick will stop working.
Are you including the same geo-specific info in the product details too?
Regards
Steve
Related Questions
-
Changing Urls
Hi All, I have a question I hope someone can help me with. I ran a scan on a website and it has a stack of URLs that are far too long, so I am going through and changing them to shorter ones. My question is about redirections: WordPress seems to be automatically redirecting the old URLs to the new ones. Should I be adding a more solid 301 in as well, or is the WordPress redirect enough? I ask because they don't all seem to stay redirected. Thanks in advance for the help
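If the decision is to back up WordPress's automatic redirects with explicit 301 rules, one approach is to generate Apache `.htaccess` `Redirect 301` lines from a mapping of old to new paths. A small sketch; the example paths are made up:

```python
def htaccess_301_rules(mapping):
    """Build Apache .htaccess 'Redirect 301' lines from {old_path: new_path}.

    Both paths are expected to be site-relative (start with '/').
    """
    lines = []
    for old, new in sorted(mapping.items()):
        if not (old.startswith("/") and new.startswith("/")):
            raise ValueError(f"paths must be site-relative: {old} -> {new}")
        lines.append(f"Redirect 301 {old} {new}")
    return "\n".join(lines)

rules = htaccess_301_rules({
    "/2019/06/a-very-long-post-title-that-goes-on": "/long-post",
    "/2019/07/another-overly-long-url-slug-example": "/another-post",
})
print(rules)
```

Hard rules in `.htaccess` (or a redirect plugin that writes real 301s) survive theme and permalink changes, which is one reason WordPress's canonical redirect alone can feel like it "doesn't stay".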
-
Why is the XML generator not detecting all my URLs?
Hi Mozzers, After adding 3 new pages to example.com, when generating the XML sitemap I wasn't able to locate those 3 new URLs. This is the first time this has happened. I have checked the meta tags of these pages and they are fine; no meta robots setup! Any thoughts or ideas on why this is happening, or how to fix it? Thanks!
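One workaround while the generator is misbehaving is to build (or patch) the sitemap by hand and diff it against the pages you expect to see. A minimal sketch using the standard library; the example.com page URLs are placeholders:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal sitemap.xml document for the given absolute URLs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for u in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

new_pages = [
    "https://example.com/new-page-1",
    "https://example.com/new-page-2",
    "https://example.com/new-page-3",
]
xml = build_sitemap(new_pages)

# Diff the sitemap against the pages that should be in it.
missing = [u for u in new_pages if u not in xml]
print(missing)  # [] when every new URL made it into the sitemap
```

The same containment check can be run against the third-party generator's output to confirm exactly which URLs it skipped.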
-
Newbie Duplicate Title Question
We recently updated our website with DNN 6. Once the upgrade was done, I kept receiving login links on my duplicate title and duplicate content error reports. Is anyone familiar with how to stop these links from showing up? Example of a link: http://www.faisongroup.com/Login/tabid/750/Default.aspx?returnurl=%2F Any help would be greatly appreciated! Thank you!
-
How to handle temporary campaign URLs
Hi, We have just run a yearly returning commercial campaign for which we created optimized URLs (e.g. www.domain.tld/campaign, with the category and brand names after the campaign, e.g. www.domain.tld/campaign/womens). This has resulted in 4500+ URLs being indexed in Google that include the campaign name. Now the campaign is over and these URLs no longer exist. How should we handle them? 1) 301 them to the correct category without the campaign name. 2) Create a static page www.domain.tld/campaign to which we 301 all URLs that have the campaign name in them. Do you have any other suggestions on what the best approach would be? This is a yearly commercial campaign, so in a year's time we will have the same URLs again. Thanks, Chris
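Option 1 above (301 each campaign URL to its non-campaign category equivalent) can often be expressed as a single rule: drop the campaign segment from the path. A sketch of the mapping logic, assuming the /campaign/... structure from the example URLs:

```python
from urllib.parse import urlparse

def campaign_redirect_target(url, campaign_slug="campaign"):
    """Map a campaign URL to its non-campaign equivalent by dropping
    the campaign path segment, e.g. /campaign/womens -> /womens.

    '/campaign' itself (no category after it) falls back to the homepage.
    The slug name is an assumption based on the example URLs above.
    """
    parts = [p for p in urlparse(url).path.split("/") if p]
    if parts and parts[0] == campaign_slug:
        rest = "/".join(parts[1:])
        return "/" + rest if rest else "/"
    return None  # not a campaign URL, leave untouched

print(campaign_redirect_target("https://www.domain.tld/campaign/womens"))  # /womens
print(campaign_redirect_target("https://www.domain.tld/campaign"))         # /
```

Because the campaign returns yearly, a rule like this (rather than 4500 hard-coded redirects) also means the same logic can simply be switched off when the campaign URLs come back.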
-
Title Length Question?
So we have a lot of UGC on our site, and page titles are often created by the user; this has left us with about 400 pages whose titles run over 70 characters, and I was just wondering what people think. I know that keeping titles short and sweet is typically best, but what about when it's the user writing them? Should I go ahead and cut the titles off at 70 characters, or keep them? I don't see it hurting traffic, so I'm basically just looking for opinions right now.
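If the 400 long UGC titles were to be trimmed in bulk, cutting at the last word boundary before the limit avoids titles ending mid-word. A small sketch (the 70-character limit is the poster's threshold, not a hard rule):

```python
def truncate_title(title, limit=70):
    """Shorten a title to at most `limit` characters without cutting
    a word in half. No ellipsis is appended; that is a style choice."""
    if len(title) <= limit:
        return title
    cut = title.rfind(" ", 0, limit + 1)  # last space at or before the limit
    return title[:cut].rstrip() if cut > 0 else title[:limit]

long_title = ("A user generated headline that rambles on far past the "
              "length most title tags should really be")
short = truncate_title(long_title)
print(len(short) <= 70, short)
```

Search engines truncate display titles themselves, so this kind of trim mainly tidies reports; whether to rewrite the stored titles is the editorial question the thread is really asking.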
-
Caps in URL creating duplicate content
I'm getting a bunch of duplicate content errors where the crawl is saying www.url.com/abc has a duplicate at www.url.com/ABC. The content is in Magento and the URL settings are lowercase, and I can't figure out why it thinks there is duplicate content. These are pages with a decent number of inbound links.
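To see how widespread the problem is before fixing it (typically with a 301 rewrite to the lowercase form), the crawl export can be grouped by case-folded URL. A minimal sketch using the example URLs from the question:

```python
from collections import defaultdict

def case_duplicates(urls):
    """Group URLs that differ only by letter case.

    Returns only groups with more than one variant, i.e. the pairs a
    crawler would flag as duplicate content.
    """
    groups = defaultdict(set)
    for u in urls:
        groups[u.lower()].add(u)
    return {k: sorted(v) for k, v in groups.items() if len(v) > 1}

crawled = [
    "http://www.url.com/abc",
    "http://www.url.com/ABC",
    "http://www.url.com/def",
]
print(case_duplicates(crawled))
# {'http://www.url.com/abc': ['http://www.url.com/ABC', 'http://www.url.com/abc']}
```

Since the uppercase variants carry inbound links, a 301 to the lowercase version (rather than blocking them) preserves that link value.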
-
GWT, URL Parameters, and Magento
I'm getting into the URL parameters in Google Webmaster Tools and I was just wondering if anyone who uses Magento has used this functionality to make sure filter pages aren't being indexed. Basically, I know what the different parameters (manufacturer, price, etc.) are doing to the content: narrowing. I was just wondering what you choose after you tell Google what the parameter's function is. For narrowing, it gives the following options for "Which URLs with this parameter should Googlebot crawl?": Let Googlebot decide (Default); Every URL (the page content changes for each value); Only URLs with value (may hide content from Googlebot); No URLs. I'm not sure which one I want. Something tells me probably "No URLs", as this content isn't something a user will see unless they filter the results (and, therefore, should not come through on a search to this page). However, the page content does change for each value. I want to make sure I don't exclude the wrong thing and end up with a bunch of pages disappearing from Google. Any help with this is greatly appreciated!
-
Duplicate content question with PDF
Hi, I manage a property listing website which was recently revamped, but which has some on-site optimization weaknesses and issues. For each property listing like http://www.selectcaribbean.com/property/147.html there is an equivalent PDF version spidered by Google. The page looks like this: http://www.selectcaribbean.com/pdf1.php?pid=147. My question is: can this create a duplicate content penalty? If yes, should I ban these pages from being spidered by Google in the robots.txt, or should I make these links nofollow?
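Two common patterns for this situation are blocking the PDF endpoint in robots.txt, or (often better, since it consolidates any link value) serving a canonical HTTP header on the PDF URL pointing at the HTML version. A sketch using the paths from the question; the exact rules would depend on the site's URL layout:

```text
# robots.txt — stop the PDF endpoint from being crawled at all
User-agent: *
Disallow: /pdf1.php

# Alternative: HTTP response header on /pdf1.php?pid=147,
# telling Google which version is canonical
Link: <http://www.selectcaribbean.com/property/147.html>; rel="canonical"
```

Note the two approaches conflict: a URL blocked in robots.txt can't be crawled, so its canonical header would never be seen. Pick one.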