Google Places & Multiple Listings
-
Our client used to have a listing in each city, but after updating the addresses, the listings were perpetually stuck under review. Google said that businesses serving customers at their locations can only list their primary office.
Back when this client had multiple city listings, all addresses but one were UPS boxes. If they switch the setting back to "No, all customers come to the business location," can they once again submit a listing for each city using these addresses?
Yes, I realize they are UPS boxes, but they insist on being listed for each city.
-
You are so welcome, Zeke!
-
Thank you, Miriam. Sometimes it's good to have a third party confirm what you already know the correct answer should be. Appreciate it.
-
Hi Zeke,
Oh, clients like these are a handful! Explain to the client, very clearly, that their listings went under review because they broke the rules. What they want to do now is still breaking the rules, and it could put their one legitimate location's rankings at risk if Google decides they are spamming the index. Don't be vague. Be totally straightforward on this. Show them the guidelines: http://support.google.com/places/bin/answer.py?hl=en&answer=107528
Especially this part:
Business Location: Use a precise, accurate address to describe your business location.
Do not create a listing or place your pin marker at a location where the business does not physically exist. P.O. Boxes are not considered accurate physical locations.
Do not create more than one listing for each business location, either in a single account or multiple accounts.
Businesses that operate in a service area, as opposed to a single location, should not create a listing for every city they service. Businesses that operate in a service area should create one listing for the central office or location and designate service areas. Learn how to add service areas to your listing.
If you don't conduct face-to-face business at your location, you must select "Yes, this business serves customers at their locations" under the "Service Areas and Location Settings" section of your dashboard, and then select the "Do not show my business address on my Maps listing" option.
If the client cannot see that these rules precisely describe what they want to do as a violation, my advice is to drop them like a hot potato.
Local SEOs strive to help honest business people - not to abet rule breakers. If your client changes his tune after he sees the guidelines, then you can offer him an alternative, legitimate strategy that would work along these lines:
-
The client may go after true local rankings for his city of location by running a well optimized website that incorporates important local hooks, by having a single Places listing/Google+ Local Page that follows all the rules, and by building citations for his single, legit address.
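Before building more citations, it can help to verify that the name, address, and phone number already match everywhere the business is listed, since inconsistent NAP data undercuts the value of new citations. Here is a minimal Python sketch of that idea; the business details and directory names are made-up examples, not real data sources:

```python
# Sketch: check NAP (name, address, phone) consistency across citation
# sources before building more of them. The listings below are invented
# examples; real data would come from the actual directories.
import re

def normalize_phone(phone):
    """Reduce a phone string to digits only so different formats compare equal."""
    return re.sub(r"\D", "", phone)

def normalize_text(value):
    """Lowercase and strip punctuation for loose name/address matching."""
    return re.sub(r"[^a-z0-9 ]", "", value.lower()).strip()

def nap_mismatches(listings, canonical):
    """Return (source, field) pairs where a listing differs from the canonical record."""
    issues = []
    for source, record in listings.items():
        if normalize_text(record["name"]) != normalize_text(canonical["name"]):
            issues.append((source, "name"))
        if normalize_text(record["address"]) != normalize_text(canonical["address"]):
            issues.append((source, "address"))
        if normalize_phone(record["phone"]) != normalize_phone(canonical["phone"]):
            issues.append((source, "phone"))
    return issues

canonical = {"name": "Acme Plumbing", "address": "12 Main St, Springfield", "phone": "(555) 010-2000"}
listings = {
    "yelp":    {"name": "Acme Plumbing",     "address": "12 Main St, Springfield", "phone": "555-010-2000"},
    "citydir": {"name": "Acme Plumbing LLC", "address": "12 Main St, Springfield", "phone": "(555) 010-2000"},
}

print(nap_mismatches(listings, canonical))  # flags the "LLC" name variant
```

Anything the script flags should be corrected at the source directory before new citations are built on top of it.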
-
If he is a service-radius-type business (like a plumber, carpet cleaner, chimney sweep) and serves customers at their locations rather than at his location, then he must comply with the hide address rule on his single Places Listing.
-
All of the above goes toward achieving high local rankings within the pinned, lettered blended/local pack of results.
-
Now, to approach the task of ranking well for his service cities (as a plumber, carpet cleaner, or lawyer would), he can begin to showcase his work in the surrounding cities where he is not physically located by creating a unique, awesome landing page for each city he serves. These pages must feature totally unique, first-class copy (no cut-and-pasted copy, no thin content).
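One way to keep yourself honest about the "no cutting and pasting" rule is to compare the body copy of the city pages programmatically. Here is a rough Python sketch using the standard library's difflib; the page texts and the 0.8 threshold are illustrative assumptions, not anything Google publishes:

```python
# Rough sketch: flag near-duplicate copy across city landing pages.
# The page texts here are placeholders; in practice you'd fetch and
# extract the main copy from each live URL. The 0.8 threshold is an
# arbitrary starting point, not a Google-defined limit.
from difflib import SequenceMatcher
from itertools import combinations

def near_duplicates(pages, threshold=0.8):
    """Return pairs of city pages whose body copy is suspiciously similar."""
    flagged = []
    for (city_a, text_a), (city_b, text_b) in combinations(pages.items(), 2):
        ratio = SequenceMatcher(None, text_a, text_b).ratio()
        if ratio >= threshold:
            flagged.append((city_a, city_b, round(ratio, 2)))
    return flagged

pages = {
    "springfield": "Our plumbers have repaired burst pipes across Springfield since 1998...",
    "shelbyville": "Our plumbers have repaired burst pipes across Shelbyville since 1998...",
    "ogdenville":  "Ogdenville homeowners call us for trenchless sewer line replacement...",
}

for a, b, score in near_duplicates(pages):
    print(f"{a} vs {b}: {score} similarity - rewrite one of these pages")
```

In this made-up example, the Springfield and Shelbyville pages differ only in the city name, which is exactly the kind of swapped-placename copy that reads as thin content and would get flagged.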
-
He can then work on earning links to these pages to improve their chances of rankings.
-
Unlike steps 1, 2, and 3, the goal of steps 4 and 5 for his service cities is organic rankings - not local rankings. Google predominantly views any business as being most relevant to its city of location - not its service cities - so this is vital for the client to understand.
By following the above method, the client will be doing all he can to gain high local rankings for his city-of-location terms and high organic rankings for his service-city terms. This is a completely valid way of working with this type of business model. Lay out clearly for the client what you can do, and then let him make a decision. If he just won't see the light, walk away... he's going to be living in penalty land until he decides to play by the rules.

In my own work as a Local SEO, I have learned to shoot straight with clients like this one, who are spamming either because they don't understand the rules or because they do know the rules and want to bend them for their own perceived benefit. With the first type, I have a wonderful opportunity to educate. With the second type, I can be quite direct in stating that I only offer guidelines-compliant services. Then, let them decide. Good luck and I hope this helps!