What Mystery Local SEO Factors Are At Play Here?
-
Absolutely perplexed by the ranking factors for Google Maps (and hence also the 3-pack in normal search results).
We're seeing search queries that return a 3-pack and organic results like the following, and wondering why these sites are getting 3-pack preference. Note that sites 2 and 3 are no closer to the test user's location than Site 4. All 4 sites have a street address showing.
3-pack result:
#1 - Site 1 - No reviews. Same distance to the user as Site 4.
#2 - Site 2 - 1 review, 1 star. Farther from the user than Site 4.
#3 - Site 3 - 2 reviews, 5-star average. Farther from the user than Sites 1, 2, and 4.
#4 (not shown in 3-pack) - Site 4 - 6 reviews, 5-star rating; closer to the user than Sites 2 and 3.
Organic results below the 3-pack:
#1 - Site 4
#2 - Site 4
#3 - Other site
#4 - Site 1
Sites 2 and 3 are not in the top 10 organic (non-map) results.
So what would be the most likely ranking factors keeping Sites 1-3 above Site 4 in the 3-pack/map results? If on-page and backlink factors were at play, you'd expect to see Sites 1, 2, and 3 rank higher than Site 4, and in the case of Sites 2 and 3, at least appear in the top 10 organic results. All sites were a similar distance from the user.
-
Well, there's another 'mystery listing' in the same search now. Same case: the business is not in close proximity, has no reviews, and has poor organic rank. It's starting to look like Google does indeed rotate in a random listing, sort of like how it gives newer advertisers/ads some exposure in the AdWords auction to build analytics data and see how effective the ad is (to see if they can make some money off it).
This sort of makes sense from the 3-pack standpoint: businesses listed there will obviously get a higher CTR, which would then be self-perpetuating, so to speak. If the 3-pack were based solely on reviews, organic rank, CTR, and other such aspects, the businesses in the 3-pack would almost never change. So Google needs to add some sort of random rotation function to give other businesses a "chance" to demonstrate their relevance. One of the 3-pack spots may be rotating in newer listings despite their having weak local ranking factors such as organic rank and reviews. Just my educated guess based on lots of observations.
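The rotation being guessed at here is essentially the classic explore/exploit trade-off. A minimal epsilon-greedy sketch (the function name and listing format are hypothetical, not anything Google has documented) might look like this:

```python
import random

def pick_three_pack(ranked_listings, new_listings, epsilon=0.33):
    """Hypothetical sketch: fill the 3-pack mostly from the top-ranked
    listings, but with probability `epsilon` rotate one untested newer
    listing into a random slot so it can accumulate CTR data
    (epsilon-greedy explore/exploit)."""
    pack = ranked_listings[:3]
    if new_listings and random.random() < epsilon:
        # Swap one slot for a random newer listing to test it.
        slot = random.randrange(3)
        pack[slot] = random.choice(new_listings)
    return pack
```

With `epsilon=0` this always returns the top three by rank; with `epsilon=1.0` it always rotates in a newcomer, which matches the observation of an unreviewed, poorly ranked business suddenly appearing in one slot.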
-
In addition the schema on the contact page uses the address:
2310 Central Ave, Irwindale, CA 91010 USA
Also not Los Angeles
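For reference, the kind of LocalBusiness structured data under discussion might look like the sketch below (the business name is hypothetical; the address is the one quoted above), generated here with Python's `json` module. The point of the thread is that this schema address conflicts with a Los Angeles-targeted listing:

```python
import json

# Sketch of a LocalBusiness JSON-LD block like the one on the contact
# page being discussed; only the address comes from the thread.
schema = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Drone Company",  # hypothetical placeholder name
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "2310 Central Ave",
        "addressLocality": "Irwindale",
        "addressRegion": "CA",
        "postalCode": "91010",
        "addressCountry": "US",
    },
}
print(json.dumps(schema, indent=2))
```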
-
I found that the Wild Rabbit company at one point had (and may still have) an address in Duarte, about 20 miles E/NE of Los Angeles.
Domain is registered in San Gabriel.
Business license has Woodland Hills and San Gabriel addresses.
If it's a proximity-to-center-point thing, I would guess their verified address is NOT one of these addresses.
-
Another thing I noticed about the original search is that there is heavy filtering going on at the automatic zoom level of the map. Once you zoom in, tons of other companies appear. So, this could point to Google lacking confidence in these results.
I found this pack interesting enough to share with Mike Blumenthal, who smartly pointed out that Google has no category for "drone company". Just a theory, but this could be leading Google to rely on the signal of what is in the business title, and the company ranking #1 has added the keyword "drone" to its title (though it doesn't appear to be part of their legal business name and is, of course, not allowed). So this could have something to do with the mysteriousness of this pack.
To see the centroid of a city, look it up in Google and click on the map. The spot where Google has placed the city name is the centroid. In this case, the centroid of LA is in the extreme east of the city borders. The company we're looking at lists no address on its GMB listing or website. The website just shows a map of LA. The GMB listing describes the business as being in Glendale, which is a bit to the north of the centroid. You could compare this to the revealed locations of the other two companies and see what you think. It's a good question you've raised.
At any rate, there seems to be a lack of Google confidence in these results.
-
Yes, that's an interesting observation.
Try searching: drone companies in los angeles ca
White Rabbit is still #2 but at least you see a more representative set of listings in the maps results.
Maybe the stark difference in map results between two very similar searches gives us a clue as to what's going on, but I've yet to figure it out.
One thought is that for any city search, Google has to use some specific location as the "center point" to determine proximity (for users not physically in Los Angeles). Maybe the actual verified address of White Rabbit is nearest the point Google is using for the center of Los Angeles?
Wonder if there is a way to determine what Google is using as the center point?
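One way to reason about the center-point theory is to compare straight-line distances yourself. A minimal haversine sketch (the coordinates are assumed for illustration, not Google's actual center point):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3958.8 * asin(sqrt(a))  # 3958.8 = Earth radius in miles

# Assumed coordinates for illustration: a commonly cited point for
# central LA vs. an approximate Duarte, CA location.
la_center = (34.0522, -118.2437)
duarte = (34.1395, -117.9773)

print(round(haversine_miles(*la_center, *duarte), 1))
```

Running this for each business's verified address against a candidate center point would show whether proximity to that point tracks the 3-pack order.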
-
Hey, that is a good mystery pack! Something seems odd about it. Do you notice that even when you click through on the map, there are only 3 companies, total, showing in the local finder view? Are there really only 3 drone companies in LA? I find that very hard to believe. For some reason I can't identify, Google is acting like it only knows of 3 such companies that match the query. I was expecting to see dozens of them upon clicking through to the local finder view. So, something is odd there.
-
Okay, for those that want an example, I found a good one.
Search: Los Angeles Drone Companies
Why the heck is Wild Rabbit listed #1 in 3-pack?
They are listed at position 13 in the organic SERPs. They have no reviews. They aren't showing their physical address (so no pin on the map). They are in the HUGE market of Los Angeles. They don't have the words 'drone' or 'company' in their page title or content (only in their meta description). They aren't in any of the major directories (other than Yelp) like yellowpages.com or superpages.com.
Baffling
-
Hi SEO1805,
Without seeing the actual result, this is shooting in the dark, but I'd look at filters (Possum), factors like domain authority, and the possibility of spam either positively or negatively impacting the results.
If you can share the SERP you're looking at, that might help us dig down a bit deeper on this.
I also recommend doing a complete competitive analysis between the site ranking #1 and the one you are marketing. (See: https://mza.seotoolninja.com/blog/basic-local-competitive-audit)
-
Yes, we all realize there are most likely hundreds of ranking factors, although I would guess the 80/20 rule applies: 20% of the ranking factors make up 80% of the "weight" in the ranking algo.
One thing we know for sure is that Google's objective is to provide the most relevant search results given the user's intent. So those of us who are intimately familiar with a specific business or subject-area niche and all the players can compare the results to our human evaluation of the real-world situation. You may know company A is the leader in the category, with the best service and value, a long-standing history, great customer kudos, etc. So the results should steer you toward that company.
In my 17 years of experience, I find it remarkable how on the mark Google's organic results are. It really puts Bing and other search engines to shame. However, I guess the point of this thread, speaking in general terms now, is that I'm not seeing that same AI ability transferred over to the local rankings in the 3-pack and Maps search results.
It's really, in my mind, not rocket science. Their organic algo IS rocket science, in my opinion, but tweaking it for local results is a far simpler task by comparison: (a) take advantage of your existing algo and make that a large part of your local ranking, (b) weight proximity to the user's location intent much more strongly, (c) give more weight to backlinks from authoritative local directories or organizations (BBB, Dun & Bradstreet, Chamber of Commerce, etc.), and (d) add a bit more importance to user reviews.
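The four-factor blend being proposed could be sketched as a simple linear score. This is purely illustrative: the weights and per-site scores below are made up, and Google's real algorithm is certainly nothing this simple.

```python
def local_score(listing, weights=None):
    """Illustrative linear blend of the four proposed factors.
    Weights are arbitrary assumptions; every input is a normalized
    0..1 score where 1 is best."""
    w = weights or {"organic": 0.4, "proximity": 0.3,
                    "citations": 0.2, "reviews": 0.1}
    return sum(w[k] * listing[k] for k in w)

# Hypothetical scores echoing the thread's Site 4 vs. Site 1 scenario.
site4 = {"organic": 0.9, "proximity": 0.8, "citations": 0.7, "reviews": 0.9}
site1 = {"organic": 0.5, "proximity": 0.8, "citations": 0.3, "reviews": 0.0}

print(local_score(site4) > local_score(site1))  # Site 4 wins under this model
```

Under any weighting like this, a site that leads on organic rank, proximity, and reviews would top the pack, which is exactly why the observed results are puzzling.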
What other factors could be as important or more important than those from a local search standpoint? This should be a fairly straight forward exercise in simple logic.
To me it looks like Google has not invested the same brainpower in tweaking its local rankings that it has in its normal organic ranking algo, so going forward I would expect more significant changes to the local search algo by comparison.
-
First, sorry for the typos. I did come up with one difference I know of: citation age. Site #4 is a newer business. But it is in all the aggregators and has proper local schema markup.
No significant pattern regarding page length. It seems to me that would be another factor used in the regular organic results, so it wouldn't make sense for Site #4 to rank so much better organically if it were being demoted in the 3-pack due to page length. Site #4 does beat out 2 of the 3 sites in the 3-pack for many other similar searches, though. So citation and/or domain age can't be that big of a factor.
I was always under the impression that closeness to the user's location was #1, normal organic ranking factors were second most important, and reviews were last.
I guess another explanation could be that they do some random round-robin, similar to the AdWords auction, in order to test the CTR of newer listings.