Google Places & Multiple Listings
-
Our client used to have a listing in each city, but after updating the addresses, the listings were stuck under review indefinitely. Google said that businesses serving customers at their locations can only list their primary office.
Back when this client had multiple city listings, all addresses but one were UPS boxes. If they change their setting back to "No, all customers come to the business location," can they once again submit a listing for each city using these addresses?
Yes, I realize they are UPS boxes, but they insist on being listed for each city.
-
You are so welcome, Zeke!
-
Thank you, Miriam. Sometimes it's good to have a third party confirm what you already know the correct answer should be. Appreciate it.
-
Hi Zeke,
Oh, clients like these are a handful! Explain, very clearly, that their listings went under review because they broke the rules. What they want to do now is still against the rules and could risk their one legitimate location's rankings if Google decides they are spamming the index. Don't be vague. Be totally straightforward on this. Show them the guidelines: http://support.google.com/places/bin/answer.py?hl=en&answer=107528
Especially this part:
Business Location: Use a precise, accurate address to describe your business location.
Do not create a listing or place your pin marker at a location where the business does not physically exist. P.O. Boxes are not considered accurate physical locations.
Do not create more than one listing for each business location, either in a single account or multiple accounts.
Businesses that operate in a service area, as opposed to a single location, should not create a listing for every city they service. Businesses that operate in a service area should create one listing for the central office or location and designate service areas. Learn how to add service areas to your listing.
If you don't conduct face-to-face business at your location, you must select "Yes, this business serves customers at their locations" under the "Service Areas and Location Settings" section of your dashboard, and then select the "Do not show my business address on my Maps listing" option.
If the client cannot see that these rules are precisely describing that what they want to do is a violation, my advice is to drop them like a hot potato.
Local SEOs strive to help honest business people - not to abet rule breakers. If your client changes his tune after he sees the guidelines, then you can offer him an alternative, legitimate strategy that would work along these lines:
-
The client may go after true local rankings for his city of location by running a well-optimized website that incorporates important local hooks, by having a single Places listing/Google+ Local Page that follows all the rules, and by building citations for his single, legitimate address.
-
If he is a service-radius-type business (like a plumber, carpet cleaner, chimney sweep) and serves customers at their locations rather than at his location, then he must comply with the hide address rule on his single Places Listing.
-
All of the above goes toward achieving high local rankings within the pinned, lettered blended/local pack of results.
-
Now, to approach the task of ranking well for his service cities (as a plumber, carpet cleaner, or lawyer would), he can begin to showcase his work in the surrounding cities where he is not physically located by creating awesome city landing pages for each. These pages must feature totally unique, first-class copy (no cut-and-paste copy, no thin content). He can create a unique page for each city he serves.
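A quick way to sanity-check that those city pages aren't cut-and-paste copies of each other is a rough similarity pass over their body copy. This is purely an illustrative sketch (the function names and the 0.6 threshold are my assumptions, not anything Google publishes), using Python's standard-library difflib:

```python
import difflib

def copy_similarity(a: str, b: str) -> float:
    # Word-level similarity ratio between two blocks of copy (0.0 to 1.0).
    return difflib.SequenceMatcher(None, a.split(), b.split()).ratio()

def flag_near_duplicates(pages: dict, threshold: float = 0.6):
    # Return pairs of city pages whose copy overlaps enough to look
    # cut-and-pasted; anything flagged is a candidate for a full rewrite.
    flagged = []
    cities = sorted(pages)
    for i, first in enumerate(cities):
        for second in cities[i + 1:]:
            if copy_similarity(pages[first], pages[second]) >= threshold:
                flagged.append((first, second))
    return flagged
```

Anything the check flags should be rewritten from scratch; genuinely unique copy (local projects, local landmarks, city-specific testimonials) is what keeps these pages from being thin.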
-
He can then work on earning links to these pages to improve their chances of ranking.
-
Unlike the goal of steps 1, 2, and 3, the goal of steps 4 and 5 for his service cities will be organic rankings - not local rankings. Google predominantly views any business as most relevant to its city of location - not its service cities - so this is vital for the client to understand.
By following the above method, the client will be doing all he can to try to gain high local rankings for his city-of-location terms, and high organic rankings for his service-city terms. This is a completely valid way of working with this type of business model. Lay out clearly for the client what you can do, and then let him make the decision.

If he just won't see the light, walk away... he's going to be living in penalty land until he decides to play by the rules.

In my own work as a Local SEO, I have learned to shoot straight with clients like this one, who are spamming either because they don't understand the rules or because they know the rules and want to bend them for their own perceived benefit. The first type, I have a wonderful opportunity to educate. With the second type, I can be quite direct in stating that I only offer guidelines-compliant services. Then, let them decide. Good luck and I hope this helps!