Local results vs Normal results
-
Hi everyone,
I am currently working on the website of a friend who owns a French spa treatment company. I have been working on it for the past 6 months, mostly on optimizing page titles and building links.
So far the results are great in terms of normal (organic) results: for most of the keywords combined with the city name, the website is very well positioned, if not in the top spot.
My only problem is that in the local results (Google Maps), nothing has improved at all. For most of the keywords where the website ranks 1st in the normal results, it doesn't appear at all in the local results. This is confusing, because the normal results suggest Google considers the website relevant to the subject, yet it doesn't rank locally at all.
The website clearly signals its location in the city (through the page titles, and there is a Google Map embedded on a page dedicated to its location).
The company has a Google Places page, and it has had positive customer reviews on different trusted websites for more than a year now (the website is 2 years old). For the past 2 months I have focused the link building on local websites (directories and specialized sites). The results kept improving in the normal results, but there is still no improvement at all in the local ones.
As far as I know, there are no mistakes such as multiple addresses for the same business, etc. Everything seems to be done by the rules.
I am not sure what more I can do. The competitors do not seem to be working on their SEO much, and in terms of linking (according to the pretty good SEOmoz tools), they have up to 10 times fewer (good) links than we do.
Maybe you have some advice on how I can manage this situation? I'm kind of lost here.
Thanks a lot for your help, appreciate it.
Cheers,
Raphael -
Hi Raphael,
Without actually investigating your client's unique business, my response will have to be somewhat general. The main areas to look at would be:
1. Age of domain
2. Strength/optimization of the website and inclusion of local hooks on the site, such as NAP (name, address, phone); see the markup sketch after this list
3. Correctness of Google local listings/lack of violations/lack of duplicates
4. Number, quality and age of citations (if yours are new, you need to give them time to go into effect)
5. Consistency of NAP across all citations
6. Review count
7. Proximity to city centroid
8. Traditional SEO factors, such as linkbuilding
There are other areas that could be applicable, certainly, but these are the main ones.
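To expand on point 2 above: the on-site NAP can also be exposed as structured data. Below is a minimal, hypothetical sketch using schema.org LocalBusiness markup in JSON-LD (the name, address and phone number are placeholders, and microdata or RDFa would carry the same information). Whatever details you use, keep them identical to what appears in Google Places and in your citations.

```html
<!-- Placeholder NAP details; keep them identical everywhere the business is cited. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Spa",
  "telephone": "+33 1 23 45 67 89",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Rue Exemple",
    "addressLocality": "Paris",
    "postalCode": "75001",
    "addressCountry": "FR"
  }
}
</script>
```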
Here are some resources regarding local search rankings and related factors:
http://www.davidmihm.com/local-search-ranking-factors.shtml
http://blumenthals.com/blog/2012/09/26/infographic-citations-time-to-live/
http://www.localvisibilitysystem.com/2012/10/02/how-long-does-local-search-visibility-take/
These are a few selections I feel will be highly relevant to what you are trying to understand about local search ranking factors and the time it can take for your work to take full effect.
I hope you will find these useful!
-
The first thing I would say is that I've learned not to rely on what I call Google Places "vanity searches", because sometimes your own proximity may prevent you from seeing positive results for your business.
What I would like to know is what the analytics look like from within the Google Places login for the business. Are you getting any impressions at all? If you are, that's a good sign. If you're not getting any impressions (not even for business name + city name searches, for instance), that's usually a sign of something being wrong with your profile, such as a wrong category, not enough content, a user-flagged profile, etc.
If you're getting impressions but not from your most desirable keywords, then really examine the content of your profile and look for opportunities to add those keywords to your Google Places content without being spammy - especially not in the title of the business. Just about everywhere else is fair game, though, so long as it's logical.
Related Questions
-
Local SEO - ranking the same page for multiple locations
Hi everyone, I am aware that the issue of local SEO has been approached numerous times, but the situation I'm dealing with is slightly different, so I'd love to receive your expert advice. I'm running the website of a property management company which services multiple locations (www.homevault.com). From our local offices in the city center, we also service neighboring towns and communities (e.g., we have an office in Charlotte, NC, from which we service Charlotte plus a dozen other towns nearby). We wanted to avoid creating dozens of extra local service pages, particularly since our offers are identical per metropolitan area and we're talking about 20-30 additional local pages for each area. Instead, we decided to create local service pages only for the main locations. Needless to say, we're now ranking for the main locations, but we're missing out on all searches for property management in neighboring towns (we're doing well on searches such as 'charlotte property management', but we're practically invisible for 'davidson property management', although we service that area as well). What we've done so far to try and fix the situation: 1. The current location pages do include descriptions of the areas that we serve. 2. We've included 1-2 keywords for the satellite locations in the main location pages, but we're nowhere near the optimization needed to rank for local searches in neighboring towns (i.e., some main local service pages rank on pages 2-4 for satellite towns, so not good enough). 3. We've included the serviced areas in our local GMBs, directories, social media profiles, etc. None of these solutions appear to work great. Should I go ahead and create the classic local pages for each and every town and optimize them for those particular keywords, even if the offer is practically the same and the number of pages risks going out of control? Any other, better ideas? Many thanks in advance!
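Not knowing the site's actual markup, one small complementary step (it will not by itself rank you in the satellite towns) is to make the service area explicit in structured data on each main location page. A hypothetical sketch with placeholder names, using schema.org's areaServed property:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Property Management - Charlotte",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Charlotte",
    "addressRegion": "NC",
    "addressCountry": "US"
  },
  "areaServed": [
    { "@type": "City", "name": "Charlotte" },
    { "@type": "City", "name": "Davidson" }
  ]
}
</script>
```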
Intermediate & Advanced SEO | | HomeVaultPM0 -
Cannibalization vs long tail keyword dilemma
Hi all. I have a dilemma that I'm trying to work out a solution to and could use some input. We offer a Foreign Qualification (FQ) service for businesses, and thus "foreign qualification" is a strong keyword for which we currently hold a great ranking position with our service page. FQ is different in each state, so we have a series of blog posts focusing on the requirements for each state. "Alabama foreign qualification" is one of many long-tail keywords (50 states x various phrasings) we're targeting here. The problem is that it's impossible to write 50 blog posts that don't have very similar content, since the process is similar, just not identical, in each state. I'm worried about duplicate content penalties here. I'm thinking that I'd want to create a landing page that serves as a hub for each of these blog posts, perhaps with a reference table for the 50 states too, and set the blog post canonicals to this landing page (thereby pushing all state-focused long-tail KWs there). However, I don't want to take away ranking strength of the aforementioned service page for the primary keyword. If I do this, and also link the new landing page to the service page using "foreign qualification" as the anchor text, am I more likely to add to or take away from the strength of the service page? Thanks for any and all insight!
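If you do go the hub-and-spoke route, the canonical itself is just a link element in the head of each state post. A minimal sketch with placeholder URLs; bear in mind that a canonical pointing to a page whose content differs substantially is treated by Google as a hint rather than a directive, so it may be ignored:

```html
<!-- In the <head> of a state post, e.g. /blog/alabama-foreign-qualification (placeholder URL) -->
<link rel="canonical" href="https://www.example.com/foreign-qualification-by-state/">
```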
Intermediate & Advanced SEO | | mkupfer1 -
Rubber Ball Ranking Results
We noticed a few weeks ago that rankings for the phrase
Charity Collection Buckets
were bouncing between this page
http://www.carefundraisingsupplies.co.uk/fundraising-products/Charity-Collection-Buckets
Rank 16
and this page
http://www.carefundraisingsupplies.co.uk/fundraising-products/fundraising-supplies
Rank 85
So we de-SEO'd the second page and added more content to the first page. This seemed to lock Google onto the first page at 16, but it then started to slowly slide downwards. We have made a few more on-page text tweaks and tried to reduce keyword density, all to no avail. Even though overall this site has a better DA and Moz profile than those ranked 1 and 2 for the phrase, we just cannot seem to get it moving in the right direction. We are just about to apply some quality links to see if that helps. But we are wondering if we are missing something at a technical level, like category structure, canonicalisation, 301 redirects or something else. Any thoughts?
Intermediate & Advanced SEO | | Jayblue0 -
Using Meta Header vs Robots.txt
Hey Mozzers, I am working on a site that has search-friendly parameters for its faceted navigation; however, this makes it difficult to identify the parameters in a robots.txt file. I know that using the robots.txt file is highly recommended and powerful, but I am not sure how to do this when facets use common words such as sizes. For example, a filtered URL may look like www.website.com/category/brand/small.html. Brand and size are both facets. Brand is a great filter, and size is very relevant for shoppers, but many products include "small" in the URL, so it is tough to isolate that filter in the robots.txt (I hope that makes sense). I am able to identify the problematic pages and edit the meta head, so I can add a robots meta tag (e.g., noindex) to any page that is causing these duplicate issues. My question is: is this a good idea? I want bots to crawl the facets, but indexing all of the facets causes duplicate issues. Thoughts?
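For reference, the page-level alternative being discussed is a robots meta tag in the head of each faceted URL you want kept out of the index. A minimal sketch using the questioner's example URL:

```html
<!-- On www.website.com/category/brand/small.html -->
<!-- Allows crawling and link discovery, but keeps the page out of the index. -->
<meta name="robots" content="noindex, follow">
```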
Intermediate & Advanced SEO | | evan890 -
Domain authority vs. moz difficulty
What type of relationship do you see between domain authority and Moz difficulty scores? I'm finding a rule of "tens" usually applies... meaning if DA = 45, then difficulty scores of 40-50 are generally within short-term reach (3-6 months of simple on-page optimization and an appropriate number of inbound links to the page). Your thoughts/data? Just trying to get a feel for a consensus 🙂
Intermediate & Advanced SEO | | DonnieCooper0 -
Mission Possible? You have 3 hours to do Local SEO. Which top 5 sites do you go Social Bookmark, Local Search Engine Submit and Directory List.
Mission Possible? Here is a test. Suppose you had 3 hours (okay, 7) to go and submit links, etc., to social bookmarking sites, local search engines and directories: which top 5 or more of each would you do? (Assuming your on-page is already sweetened.) I just got 2 more clients and I need to get started on a few things for each. Thankful for all your advice.
Intermediate & Advanced SEO | | greenhornet770 -
Google bot vs google mobile bot
Hi everyone 🙂 I seriously hope you can come up with an idea for a solution to the problem below, because I am kinda stuck 😕 Situation: A client of mine has a webshop located on a hosted server. The shop is built in a closed CMS, meaning I have very limited options for changing the code: limited access to the page head, and within the CMS I can only use JavaScript and HTML. The only place I have access to a server-side language is in the root, where a Default.asp file redirects the visitor to a specific folder where the webshop is located. The webshop has 2 "languages"/store views: one for normal browsers and Googlebot, and one for mobile browsers and Googlebot-Mobile. In Default.asp (classic ASP) I do a test for the user agent and redirect the user to either the main domain or the mobile subdomain. All good, right? Unfortunately not. Now we arrive at the core of the problem. Since the mobile shop was added at a later date, Google already had most of the pages from the shop in its index, and apparently uses them as entrance pages to crawl the site with the mobile bot. Hence it never sees Default.asp (or outright ignores it), and this causes, as you might have guessed, a huge pile of duplicate content. Normally you would just place some user-agent detection in the page head and either throw Google a 301 or a rel=canonical, but since I only have access to JavaScript and HTML in the page head, this cannot be done. I'm kinda running out of options quickly, so if anyone has an idea as to how the BEEP I get Google to index the right domains for the right devices, please feel free to comment. 🙂 Any and all ideas are more than welcome.
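For illustration only, here is a minimal classic ASP sketch of the user-agent split described above (the domains and folder are placeholders). Note that the default Response.Redirect sends a 302, so even when a crawler does hit Default.asp it isn't told the split is permanent; an explicit 301 looks like this:

```asp
<%
' Default.asp - sketch of the user-agent routing described above.
' m.example.com / www.example.com and /shop/ are placeholders.
Dim ua
ua = LCase(Request.ServerVariables("HTTP_USER_AGENT"))

Response.Status = "301 Moved Permanently"
If InStr(ua, "mobile") > 0 Then
    ' Mobile browsers and Googlebot-Mobile both contain "mobile" in the UA string.
    Response.AddHeader "Location", "http://m.example.com/shop/"
Else
    Response.AddHeader "Location", "http://www.example.com/shop/"
End If
Response.End
%>
```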
Intermediate & Advanced SEO | | ReneReinholdt0 -
How to redirect www vs. non-www in IIS
I have been wanting to set up our site to redirect non-www to www for the SEO benefits so often described here on SEOmoz. I see a lot on Apache, but not so much for IIS. Are there any developers here who can point me to a how-to tutorial for people with little IIS experience?
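As a starting point, here is a minimal web.config sketch for the common case of forcing non-www to www. It assumes IIS 7 or later with the URL Rewrite module installed, and example.com is a placeholder for the real host:

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301 any request for example.com to www.example.com, preserving the path -->
        <rule name="Redirect non-www to www" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^example\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.example.com/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```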
Intermediate & Advanced SEO | | KJ-Rodgers0