How do you handle different business locations for search?
-
Would like to get people's suggestions on how you handle different business locations in different cities. We have tried multiple tactics for different clients; some have worked better than others.
Example: We have a window company that does great in Austin. Now they want to move into Dallas. In the past, we have created "landing pages" optimized for that locality. However, with Google's new updates I don't think this will work anymore. With others, we have added a "tab" and have similar pages of the entire site optimized for that location. This seems to have been working better.
What are your suggestions for handling different locations from city to city?
Any input is appreciated! Thanks!
Charles
-
Hi Charles,
Do you mean that your client has more than one physical location, or that they have a single physical location and would like to rank for more than one city?
If the former, then standard practice is to have a landing page for each physical office, optimized for that city with the complete NAP (name, address, phone) and unique text. The Google+ Local page and citations can then be tied to this page.
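One common way to reinforce the NAP on each office's landing page is schema.org LocalBusiness structured data. Below is a minimal sketch that builds the JSON-LD snippet in Python; the business name, phone, and address are hypothetical placeholders, not details from this thread:

```python
import json

# Hypothetical example: schema.org LocalBusiness JSON-LD for one
# office's landing page. Every value below is made up for illustration.
location = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Acme Windows - Dallas",        # hypothetical business name
    "telephone": "+1-214-555-0100",         # hypothetical phone (the P in NAP)
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example St",  # hypothetical street address
        "addressLocality": "Dallas",
        "addressRegion": "TX",
        "postalCode": "75201",
    },
}

# Build the <script> tag to embed in that location's landing page.
snippet = ('<script type="application/ld+json">'
           + json.dumps(location)
           + "</script>")
print(snippet)
```

The resulting `<script>` tag would go in the head (or body) of that office's landing page, so the markup matches the NAP shown in the visible text.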
If the latter, then you can't build a Google+ Local page or citations, because these require a physical address within a given city. Instead, common practice is to build city landing pages that showcase your work in that city. You can back this up with link building to these pages, if possible. You can also work on other efforts such as social media outreach to audiences in these cities. Video marketing can be a big help, too.
Basically, your options depend on your business model. Does this answer your question? Please, let me know.
-
Hi Charles,
This may be something you have used in the past as well, but it wasn't mentioned in what you said. Have you created a Google Places/Local page for the business to sit alongside their local website and web pages? Have you checked out https://getlisted.org/ ?
Additionally, Rand's Whiteboard Friday on Local citations may help you. See here...
I hope that helps,
Peter
Related Questions
-
How to best handle search landing pages - that don't exist
I have quite a bit of blog information that can be searched, which results in "pages" that don't actually live anywhere. These are scanned by Moz and appear as poor page quality for speed, etc. How do I get the service to ignore all of these, or is there a way to treat them as real pages with content? As there are quite a few generated over time, I'd like to be able to capture them somehow. Thanks.
On-Page Optimization | amac70 -
Should blog tags be location specific?
I understand the concept of organizing blogs with categories, but how specific should the tags on blog posts be? i.e. "cosmetic dentist" vs. "cosmetic dentist san francisco" I'm specifically using Squarespace if that helps. Thanks!
On-Page Optimization | ReaderSam -
My company's product is referred to by two different names (SVN and Subversion). When cleaning up our Title tags, is it OK to use either name to keep the title tags around 70 characters?
I am cleaning up title tags that are too long or not correct. In our title tag we reference our product (a version of OSS source code). This product is often referred to as both SVN or Subversion. When writing title tags, is it OK to use one or the other depending on the length of the title tag? For instance: Contact Us | Free SVN & Git Hosting | Bug & Issue tracking | CloudForge vs. About CloudForge | Free Subversion & Git Hosting | Bug Tracking
On-Page Optimization | CollabNet -
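For the SVN/Subversion question above, a quick helper can check which product name keeps a given title within the ~70-character guideline the question mentions. This is just an illustrative sketch; `pick_title` and the hard limit are assumptions, not an established tool or rule:

```python
# Sketch: choose whichever product name (Subversion vs. SVN) keeps the
# title tag within the ~70-character guideline from the question.
TITLE_LIMIT = 70

def pick_title(template: str) -> str:
    """Fill {name} with 'Subversion' if the result fits, else use 'SVN'."""
    long_form = template.format(name="Subversion")
    if len(long_form) <= TITLE_LIMIT:
        return long_form
    return template.format(name="SVN")  # shorter fallback
```

So a long template like the "Contact Us" example would fall back to SVN, while a short "About" page title could afford the full Subversion name.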
New Client Wants to Keep Duplicate Content Targeting Different Cities
We've got a new client who has about 300 pages on their website that are the same except the cities that are being targeted. Thus far the website has not been affected by penguin or panda updates, and the client wants to keep the pages because they are bringing in a lot of traffic for those cities. We are concerned about duplicate content penalties; do you think we should get rid of these pages or keep them?
On-Page Optimization | waqid -
Does hosting server location affect my local Google page rank?
Hi All, This is my first (of many, I would say) questions. My client's site is in Ireland but the hosting service is in Canada, so when I use the MozBar it states that the IP is in Canada. Will this affect my page ranking in Dublin, Ireland? The company is a plumbing service, so we would only want local Dublin customers. Thanks so much and I hope someone can help me out. Cheers, Aidan
On-Page Optimization | aidanlawlor -
URL for location pages
Hello all, We would like to create clean, easy URLs for our large list of location pages. If there are a few URLs for each of the pages, am I right in saying we would want one of them to be the canonical? Right now we would like the URL to be, for example:
Domain.com/locations/Columbus
I have found some instances where there might be 2, 3 or more locations in the same city/zip. My conclusion for these would be adding their branch IDs onto the URL:
Domain.com/locations/Columbus/0304
Is this an okay approach? We are unsure if the URL should have city, state, and zip for SEO purposes. The pages will have all of this info in their content, BUT what would be best for SEO and ranking for a given location? Thank you for any info!
On-Page Optimization | Ferguson -
How do we handle sitemaps in robots.txt when multiple domains point to the same physical location?
We have www.mysite.net, www.mysite.se, www.mysite.fi and so on. All of these domains point to the same physical location on our web server, and we replace texts given back to the client depending on which domain he/she requested. My problem is this: how do I configure sitemaps in robots.txt when robots.txt is used by multiple domains? If I, for instance, put the rows Sitemap: http://www.mysite.net/sitemapNet.xml and Sitemap: http://www.mysite.net/sitemapSe.xml in robots.txt, would that result in some cross-submission error?
On-Page Optimization | nordicnetproducts -
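One common approach to the robots.txt question above is to generate robots.txt dynamically per Host header, so each domain advertises only its own sitemap. A minimal sketch, assuming the domain names from the question and that each sitemap should live on its own domain (the .se/.fi filenames are extrapolated from the question's naming pattern):

```python
# Sketch: serve a per-domain robots.txt so each domain's Sitemap line
# points only at that domain's own sitemap file. The mapping below is
# an assumption based on the question's domains; adjust as needed.

SITEMAPS = {
    "www.mysite.net": "http://www.mysite.net/sitemapNet.xml",
    "www.mysite.se": "http://www.mysite.se/sitemapSe.xml",
    "www.mysite.fi": "http://www.mysite.fi/sitemapFi.xml",
}

def robots_txt(host: str) -> str:
    """Build the robots.txt body for the requested Host header."""
    lines = ["User-agent: *", "Disallow:"]
    sitemap = SITEMAPS.get(host)
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"
```

Since the site already swaps text per requested domain, robots.txt can be handled the same way; each crawler then sees only the sitemap belonging to the host it asked for, avoiding the cross-submission concern.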