What Mystery Local SEO Factors Are At Play Here?
-
Absolutely perplexed by the ranking factors for Google Maps (and hence also the 3-pack in normal search results).
We are seeing search queries that return a 3-pack and organic results like the following, and wondering why these sites are getting 3-pack preference. Note that sites 2 and 3 are no closer to the test user's location than Site 4, and all 4 sites have a street address showing.

3-pack result:
#1 - Site 1 - No reviews. Same distance from the user as Site 4.
#2 - Site 2 - 1 review, 1 star. Farther from the user than Site 4.
#3 - Site 3 - 2 reviews, 5-star average. Farther from the user than Sites 1, 2, and 4.
#4 (not shown in 3-pack) - Site 4 - 6 reviews, 5-star average. Closer to the user than Sites 2 and 3.

Organic results below the 3-pack:
#1 - Site 4
#2 - Site 4
#3 - Other site
#4 - Site 1

Sites 2 and 3 are not in the top 10 organic (non-map) results.

So what are the most likely ranking factors making Sites 1-3 rank above Site 4 in the 3-pack/map results? If on-page and backlink factors were at play, you'd expect to see Sites 1, 2, and 3 outrank Site 4, and in the case of Sites 2 and 3, appear at least in the top 10 of the organic results. All sites were a similar distance from the user.
-
Well, there's another 'mystery listing' in the same search now. Same case: the business is not in close proximity, has no reviews, and has poor organic rank. It is starting to look like Google does indeed rotate in a random listing, sort of like it gives newer advertisers/ads some exposure in the AdWords auction to build analytics data on how effective the ad is (to see if they can make some money off it).
This sort of makes sense from the 3-pack standpoint, because businesses listed there will obviously get a higher CTR, which would be self-perpetuating, so to speak: if the 3-pack were based solely on reviews, organic rank, CTR, and other such aspects, the businesses in the 3-pack would almost never change. So they need some sort of random rotation function to give other businesses a "chance" to demonstrate their relevance. One of the 3-pack spots may be rotating in newer listings despite their having little in the way of local ranking factors such as organic rank and reviews. Just my educated guess based on lots of observations.
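The rotation theory above resembles a classic explore/exploit scheme (e.g. epsilon-greedy bandits). Purely as an illustration of the idea, not anything documented about Google's algorithm, here is a minimal sketch where the site names and scores are all hypothetical:

```python
import random

def pick_three_pack(listings, epsilon=0.2):
    """Pick a 3-pack: the top 2 by score always make it (exploit),
    but with probability `epsilon` the third slot rotates in a random
    lesser-known listing (explore) so Google can gather CTR data on it.

    `listings` is a list of (name, score) tuples, where a higher score
    means stronger known local signals (reviews, organic rank, CTR).
    """
    ranked = sorted(listings, key=lambda item: item[1], reverse=True)
    pack = [name for name, _ in ranked[:2]]
    rest = [name for name, _ in ranked[2:]]
    if rest and random.random() < epsilon:
        pack.append(random.choice(rest))   # explore: rotate in a newcomer
    else:
        pack.append(ranked[2][0])          # exploit: third-best by score
    return pack

# Hypothetical listings; "New Site" has weak signals but still gets a shot.
listings = [("Site A", 0.9), ("Site B", 0.8), ("Site C", 0.7), ("New Site", 0.1)]
print(pick_three_pack(listings))
```

Under a scheme like this, a listing with no reviews and poor organic rank would still surface in the pack a fraction of the time, which would fit the observations above.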
-
In addition, the schema on the contact page uses the address:
2310 Central Ave, Irwindale, CA 91010 USA
Also not Los Angeles.
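For reference, the kind of contact-page schema being described would typically be LocalBusiness JSON-LD along these lines. This is a sketch using the Irwindale address quoted above; the business name is a made-up placeholder, not the actual company's markup:

```python
import json

# Hypothetical LocalBusiness JSON-LD using the Irwindale address quoted above.
schema = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Drone Company",      # placeholder name
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "2310 Central Ave",
        "addressLocality": "Irwindale",   # note: not Los Angeles
        "addressRegion": "CA",
        "postalCode": "91010",
        "addressCountry": "US",
    },
}
print(json.dumps(schema, indent=2))
```

The point is that the `addressLocality` the schema declares is a signal Google can read, and here it contradicts the Los Angeles positioning of the listing.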
-
I found that the Wild Rabbit company at one point had (and may still have) an address in Duarte, about 20 miles E/NE of Los Angeles.
Domain is registered in San Gabriel.
Business license has Woodland Hills and San Gabriel addresses.
If it's a proximity-to-center-point thing, I would guess their verified address is NOT one of these addresses.
-
Another thing I noticed about the original search is that there is heavy filtering going on at the automatic zoom level of the map. Once you zoom in, tons of other companies appear. So, this could point to Google lacking confidence in these results.
I found this pack interesting enough to share with Mike Blumenthal, who smartly pointed out that Google has no category for "drone company". Just a theory, but this could be forcing Google to rely on the signal of what is in the business title, and the company ranking #1 has added the keyword "drone" to its title (though it doesn't appear to be part of their legal business name, and is therefore, of course, not allowed). So, this could have something to do with the mysteriousness of this pack.
To see the centroid of a city, look it up in Google and click on the map. The spot where Google has placed the city name is the centroid. In this case, the centroid of LA is in the extreme east of the city borders. The company we're looking at lists no address on its GMB listing or website. The website just shows a map of LA. The GMB listing describes the business as being in Glendale, which is a bit to the north of the centroid. You could compare this to the revealed locations of the other two companies and see what you think. It's a good question you've raised.
At any rate, there seems to be a lack of Google confidence in these results.
-
Yes, that's an interesting observation.
Try searching: drone companies in los angeles ca
White Rabbit is still #2 but at least you see a more representative set of listings in the maps results.
Maybe the stark difference in map results between two very similar searches gives us a clue as to what's going on, but I've yet to figure it out.
One thought is for any city search Google has to use some specific location as the "center point" to determine proximity (for us users not physically in Los Angeles). Maybe the actual verified address of White Rabbit is nearest the point Google is using for the center of Los Angeles?
Wonder if there is a way to determine what Google is using as the center point?
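One way to test the center-point hypothesis would be to compare each business's straight-line distance to a candidate centroid (the spot where Google drops the city label, as described later in this thread). A rough sketch follows; the centroid and business coordinates are illustrative values for the areas mentioned in the thread, not the actual verified addresses:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 3958.8 * asin(sqrt(a))  # Earth radius ~3958.8 miles

# Candidate center point for "Los Angeles" (illustrative coordinates).
centroid = (34.0522, -118.2437)

# Hypothetical coordinates for the areas the businesses are tied to.
businesses = {
    "Business A (Glendale)": (34.1426, -118.2551),
    "Business B (Duarte)": (34.1395, -117.9773),
    "Business C (Irwindale)": (34.1069, -117.9365),
}
for name, (lat, lon) in businesses.items():
    dist = haversine_miles(*centroid, lat, lon)
    print(f"{name}: {dist:.1f} miles from candidate centroid")
```

If the ranking order tracked these distances for several candidate center points, that would lend some support to the proximity-to-centroid guess.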
-
Hey, that is a good mystery pack! Something seems odd about it. Do you notice that even when you click through on the map, there are only 3 companies, total, showing in the local finder view? Are there really only 3 drone companies in LA? I find that very hard to believe. For some reason I can't identify, Google is acting like it only knows of 3 such companies that match the query. I was expecting to see dozens of them upon clicking through to the local finder view. So, something is odd there.
-
Okay, for those that want an example, I found a good one.
Search: Los Angeles Drone Companies
Why the heck is Wild Rabbit listed #1 in the 3-pack?
They are listed at position 13 in the organic SERPs. They have no reviews. They aren't showing their physical address (so there's no pin on the map). They are in the HUGE market of Los Angeles. They don't have the words 'drone' or 'company' in their page title or content (only in their meta description). They aren't in any of the major directories like yellowpages.com or superpages.com (other than Yelp).
Baffling
-
Hi SEO1805,
Without seeing the actual result, this is shooting in the dark, but I'd look at filters (Possum), factors like domain authority, and the possibility of spam either positively or negatively impacting the results.
If you can share the SERP you're looking at, that might help us dig down a bit deeper on this.
I also recommend doing a complete competitive analysis between the site ranking #1 and the one you are marketing. (See: https://mza.seotoolninja.com/blog/basic-local-competitive-audit)
-
Yes, we all realize there are most likely hundreds of ranking factors, although I would guess the 80/20 rule applies: 20% of the ranking factors make up 80% of the "weight" in the ranking algo.
One thing we know for sure is that Google's objective is to provide the most relevant search results given the user's intention. So those of us who are intimately familiar with a specific business or subject-area niche and all the players can compare the results to our human evaluation of the real-world situation. You may know company A is the leader in the category, with the best service and value, a long-standing history, great customer kudos, etc. So the results should steer you toward that company.
In my 17 years of experience, I find it remarkable how on the mark Google's organic results are. It really puts Bing and other search engines to shame. However, I guess the point of this thread, speaking in general terms now, is that I'm not seeing that same AI ability transferred over to the local rankings in the 3-pack and Maps search results.
To my mind, it's really not rocket science. Their organic algo IS rocket science, in my opinion, but tweaking it for local results is by comparison a far simpler task:
(a) Take advantage of your existing algo and make that a large part of your local ranking.
(b) Make proximity to the user's location intent much stronger.
(c) Make backlinks from authoritative local directories or organizations stronger (BBB, Dun & Bradstreet, Chamber of Commerce, etc.).
(d) Add a bit more importance to user reviews.
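The (a)-(d) scheme proposed here amounts to a weighted linear combination of signals. As a sketch only, with weights invented for illustration (Google's actual weighting is unknown):

```python
def local_score(organic, proximity, local_links, reviews,
                weights=(0.40, 0.30, 0.15, 0.15)):
    """Combine the four normalized (0-1) signals proposed above:
    (a) existing organic algo score, (b) proximity to the user's
    location intent, (c) authoritative local directory backlinks,
    (d) user reviews. The weights are hypothetical.
    """
    w_a, w_b, w_c, w_d = weights
    return w_a * organic + w_b * proximity + w_c * local_links + w_d * reviews

# Hypothetical profile for a site like "Site 4" in the original post:
# strong organic rank and reviews, decent proximity, fewer directory links.
print(round(local_score(organic=0.9, proximity=0.8,
                        local_links=0.6, reviews=0.9), 3))
```

Under any weighting like this, a site that wins on organic rank, proximity, and reviews at once would be hard to keep out of the 3-pack, which is what makes the observed results so puzzling.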
What other factors could be as important or more important than those from a local search standpoint? This should be a fairly straightforward exercise in simple logic.
To me, it looks like Google has not invested the same brainpower in tweaking its local rankings that it has in its normal organic ranking algo, so going forward I would expect comparatively more significant changes to the local search algo.
-
First sorry for the typos. I did come up with one difference I know of... citation age. Site #4 is a newer business. But it is in all the aggregators and has proper local schema markup.
No significant pattern regarding page length. That seems to me to be another factor used in the regular organic results, so it wouldn't make sense for Site #4 to rank so much better organically if it were being demoted in the 3-pack due to page length. Site #4 does beat out 2 of the 3 sites in the 3-pack for many other similar searches, though. So citation and/or domain age can't be that big of a factor.
I was always under the impression that closeness to the user's location was #1, the normal organic ranking factors were second most important, and reviews were last.
I guess another explanation could be that they do some random round-robin, similar to the AdWords auction, in order to test the CTR of newer ads.