Multiple Locations in Google Places (URLs)?
-
I'm managing a restaurant chain with 10 locations. Can they all share the homepage URL of the corporate site in Google Places, or is it better to link each location's URL separately?
Meaning, can I use www.company.com for all locations in Google Places, or is it better to go with www.company.com/location.html for each location?
The page authority of the home page is 60, while the individual location pages' authority is in the 20s.
Hope this makes sense.
Thanks
Gary -
Hi Gary,
Ben has provided a good answer. Technically, you are perfectly free to link to the homepage from all of the local listings; that is in no way forbidden. But in many cases the preferred practice is to link to a landing page for each location. Just be sure your location landing pages are strong pages (not duplicates, not thin content, etc.). Landing pages are also a good place to reinforce the NAP (name, address, phone) for each unique location: they will all share the same business name, but will have unique street addresses and phone numbers, and these can be clearly listed on each respective landing page.
Years ago, Eric Enge interviewed Carter Maslan on this topic, and to my mind, Carter's advice about these types of landing pages still stands: http://www.stonetemple.com/articles/interview-carter-maslan-032710.shtml
Scroll down to this part and read from there:
Eric Enge: Let’s say you have more than one location, 100 for example. In your view, is it helpful to have individual pages on the website for all of the locations? Also, is it helpful to have the Google local business center linked to each of those individual pages rather than having 100 locations that point to a single web address?
*Note: this interview was published in 2010, so some of the terminology is outdated, but the main message on this particular topic is still right on. Hope this helps!
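To make the NAP point concrete, here is a minimal Python sketch of generating schema.org LocalBusiness markup for each location's landing page. The business name, addresses, and phone numbers below are made up for illustration; the pattern is the point: same business name on every page, unique street address and phone per location.

```python
import json

# Hypothetical location data; names, addresses, and numbers are invented.
LOCATIONS = [
    {"street": "123 Main St", "city": "Springfield", "phone": "+1-555-0101"},
    {"street": "456 Oak Ave", "city": "Shelbyville", "phone": "+1-555-0102"},
]

def local_business_jsonld(brand, loc):
    """Build schema.org LocalBusiness markup reinforcing one location's NAP."""
    return {
        "@context": "https://schema.org",
        "@type": "Restaurant",
        "name": brand,                       # same business name everywhere
        "address": {
            "@type": "PostalAddress",
            "streetAddress": loc["street"],  # unique per landing page
            "addressLocality": loc["city"],
        },
        "telephone": loc["phone"],           # unique per landing page
    }

for loc in LOCATIONS:
    snippet = json.dumps(local_business_jsonld("Example Restaurant", loc))
    print(f'<script type="application/ld+json">{snippet}</script>')
```

Each landing page would carry its own copy of this markup, matching the visible NAP on the page and the details in the corresponding Google Places listing.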
-
It's best to link them to location pages on the domain. In local, domain authority wins over page authority, and user experience wins over bot experience. This is the current trend in local and it doesn't appear to be changing. Here is a great article by Nifty Marketing that illustrates a good local landing page.
Related Questions
-
Google Only Indexing Canonical Root URL Instead of Specified URL Parameters
We just launched a website about a month ago and noticed that Google was indexing, but not displaying, URLs with "?location=" parameters such as http://www.castlemap.com/local-house-values/?location=great-falls-virginia and http://www.castlemap.com/local-house-values/?location=mclean-virginia. Instead, Google has only been displaying our root URL http://www.castlemap.com/local-house-values/ in its search results, which we don't want, as the URLs with specific locations are more important and each has its own unique list of houses for sale.

We have Yoast set up with all of these ?location values added in our sitemap, which has successfully been submitted to Google's Sitemaps: http://www.castlemap.com/buy-location-sitemap.xml. I also tried going into the old Google Search Console and setting the "location" URL parameter to Crawl Every URL with the Specifies Effect enabled, and I even see the two URLs mentioned above in Google's list of Parameter Samples, but the pages are still not being added to Google.

Even after Requesting Indexing again after making all of these changes a few days ago, these URLs still display as Allowing Indexing, but Not On Google in the Search Console, and they don't show up on Google when I manually search for the entire URL. Why are these pages not showing up on Google, and how can we get them to display? The only solution I can think of would be to set our main /local-house-values/ page to noindex in order to have Google favor all of the other URL parameter versions, but I'm guessing that's probably not a good solution, for multiple reasons.
Intermediate & Advanced SEO | Nitruc0
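One frequent cause of exactly this symptom, worth checking before anything else, is a canonical tag on every ?location= variant that points back at the bare root URL, which tells Google to fold all the variants into the root. A small Python sketch, using one of the URLs from the question, of what a self-referencing canonical per variant looks like:

```python
from urllib.parse import urlsplit, parse_qs

BASE = "http://www.castlemap.com/local-house-values/"

def self_canonical(url):
    """Each parameterized URL should declare ITSELF as canonical.
    If every variant instead points at the bare root, Google has been
    told to collapse them all into that one root URL."""
    return f'<link rel="canonical" href="{url}" />'

url = BASE + "?location=great-falls-virginia"
params = parse_qs(urlsplit(url).query)
print(params["location"])   # the location value parsed from the query string
print(self_canonical(url))  # canonical tag that keeps this variant distinct
```

This is only a sketch of the check, not a diagnosis; the served HTML of each ?location= page is what actually needs inspecting.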
Partial Match or RegEx in Search Console's URL Parameters Tool?
So I currently have approximately 1000 of these URLs indexed, when I only want roughly 100 of them. Let's say the URL is www.example.com/page.php?par1=ABC123=&par2=DEF456=&par3=GHI789= All the indexed URLs follow that same kind of format, but I only want to index the URLs that have a par1 of ABC (but that could be ABC123 or ABC456 or whatever). Using the URL Parameters tool in Search Console, I can ask Googlebot to only crawl URLs with a specific value. But is there any way to get a partial match, using regex maybe? Am I wasting my time with Search Console, and should I just disallow any page.php without par1=ABC in robots.txt?
Intermediate & Advanced SEO | Ria_0
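The URL Parameters tool only matched exact values, with no partial or regex matching, so a filter like "par1 starts with ABC" has to be expressed outside Search Console, for example when deciding which URLs to disallow in robots.txt. As an illustration, here is the desired filter written as a regex in Python, applied to made-up URLs following the format in the question:

```python
import re

# Hypothetical indexed URLs following the question's format.
urls = [
    "www.example.com/page.php?par1=ABC123=&par2=DEF456=&par3=GHI789=",
    "www.example.com/page.php?par1=ABC456=&par2=XYZ111=&par3=GHI789=",
    "www.example.com/page.php?par1=QRS123=&par2=DEF456=&par3=GHI789=",
]

# Keep only URLs whose par1 value starts with "ABC" (ABC123, ABC456, ...).
pattern = re.compile(r"[?&]par1=ABC[^&]*")
keep = [u for u in urls if pattern.search(u)]
print(keep)  # the two ABC* URLs match; the QRS one does not
```

Note that robots.txt itself does not support regex either (only `*` and `$` wildcards in the de facto Googlebot extension), so the disallow rules would still have to be enumerated or approximated with wildcards.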
Does Google Read URLs if they include a # tag? Re: SEO Value of Clean URLs
An ECWID rep stated, in regard to an inquiry about how the ECWID URLs are not customizable, that "an important thing is that it doesn't matter what these URLs look like, because search engines don't read anything after that # in URLs." Example: http://www.runningboards4less.com/general-motors#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891 Basically all of this: #!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891 That is a snippet out of a conversation where ECWID said that dirty URLs don't matter beyond a hash... Is that true? I haven't found any rule that Google or other search engines (Google is really the most important) don't index, read, or place value on the part of the URL after a # tag.
Intermediate & Advanced SEO | Atlanta-SMO0
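The mechanical part of the rep's claim is easy to demonstrate: the fragment (everything after the #) is never sent in the HTTP request, so by default a crawler fetching this URL is asking the server only for the part before the #. (The `#!` "hashbang" form was once special-cased by Google's AJAX crawling scheme, which has since been deprecated.) A quick illustration using the URL from the question:

```python
from urllib.parse import urlsplit

url = ("http://www.runningboards4less.com/general-motors"
       "#!/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891")

parts = urlsplit(url)
# The path is what an HTTP request (and thus a crawler) actually asks for;
# the fragment stays on the client side.
print(parts.path)      # /general-motors
print(parts.fragment)  # !/Classic-Pro-Series-Extruded-2/p/28043025/category=6593891
```

Whether the content behind the fragment gets indexed is a separate question about how Google renders the page's JavaScript, not about the URL itself.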
Removing Dynamic "noindex" URLs from the Index
Six months ago my client's site was overhauled, and the user-generated searches had an index tag on them. I switched that to noindex, but didn't catch it fast enough to avoid having hundreds of pages indexed in Google. It's been months since switching to the noindex tag and the pages are still indexed. What would you recommend? Google crawls my site daily, but never the pages that I want removed from the index. I am trying to avoid submitting hundreds of these dynamic URLs to the removal tool in Webmaster Tools. Suggestions?
Intermediate & Advanced SEO | BeTheBoss0
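Before resorting to the removal tool, it is worth verifying that the noindex directive is actually present in the HTML Google receives, and that the pages are not blocked in robots.txt (a blocked page is never recrawled, so Google never sees the noindex). A small stdlib-only Python sketch of that check; the sample page here is hypothetical:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content of any <meta name="robots"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.directives.append((a.get("content") or "").lower())

def is_noindexed(html):
    """Return True if the page declares noindex in a robots meta tag."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

page = '<html><head><meta name="robots" content="noindex,follow"></head></html>'
print(is_noindexed(page))  # True
```

Running a check like this over a sample of the dynamic URLs, as served (not as stored in a template), confirms whether the directive Google needs to see is actually reaching it.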
Google+ Pages on Google SERP
Do you think that a Google+ Page (not profile) could appear on the Google SERP as a Rich Snippet Author? Thanks
Intermediate & Advanced SEO | overalia0
Can you see the 'indexing rules' that are in place for your own site?
By 'index rules' I mean the stipulations that constitute whether or not a given page will be indexed. If you can see them - how?
Intermediate & Advanced SEO | Visually0
Where do we place the Google +1 button?
Does the Google +1 button have to be placed on each page of the website, or only on the home page?
Intermediate & Advanced SEO | seoug_20050
DCMI and Google's rich snippets
I haven't seen any consistent information regarding DCMI (Dublin Core) tags for organic SEO in a couple of years. Webmaster Tools obviously has a rich set of instructions for microdata. Has there been any updated testing on DCMI, or information above the whisper/rumor stage, on whether engines will be using Dublin Core? As a final point, would it be worth going back to static pages that haven't been touched in a couple of years and updating them with microdata? It seems a natural fit for retail sites and maybe some others, but what about content-heavy pages?
Intermediate & Advanced SEO | jimmyseo0