Geo-location by state/store
-
Hi there,
We are a grocery co-operative retailer with a chain of stores owned by different people. We are building a new website, where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on a cookie and geo-location). All our stores carry a consistent range of products, with variation in the 25% range. I have a few questions.
-
How should we build the sitemap? Since a store must be selected, and the flow is the same for the bot and the user, should the sitemap include all products across all stores? We allow users to find any product across all stores if they search by product identifier, but they will only see products available in a particular store if they go through the hierarchical journey of the website.
-
Will the bot crawl all pages across all the stores, or, since it will be geolocated to only one store, will only the content belonging to that store be indexed?
-
We are also allowing customers to search for older products which they might have bought a few years ago and which are no longer part of our catalogue. These products will not appear in the online hierarchical journey, but customers will be able to search for and find them. Will this affect our SEO ranking?
Any help will be greatly appreciated.
Thanks - Costa
-
If you consistently detect the IP address and redirect, or change content, based only on that, then you will want to exempt Googlebot from those personalizations in one way or another. There are many options for this, like blocking the resources that handle it (i.e. the JavaScript file associated with personalization based on history or geo-location), or what was suggested above. Blocking that piece of script in the robots.txt file is less likely to be seen as cloaking.
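For example, assuming the personalization logic lives in a single script (the path and filename here are purely hypothetical), the robots.txt fragment might look like:

```txt
# Hypothetical robots.txt fragment: keep crawlers away from the
# script that personalizes content by cookie/geo-location.
User-agent: *
Disallow: /js/geo-personalize.js
```

The rest of the page stays crawlable; only the personalization resource is withheld.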
All of this begs the question though: If you're looking at the IP, then setting a cookie, then updating the content based on the cookie, it shouldn't be an issue in the first place. Googlebot isn't accepting your cookies. So if I were to browse in Incognito mode using Chrome (and thus not accept cookies), would I see the same site and product assortments no matter which location I was in? If that's the case, maybe you don't have a problem. This is pretty easy to test.
Ultimately, I think you're going to want a single product page for each SKU, rather than one for each product at each location. The content, pricing, etc. can be updated by location if they have a cookie, but the URL should probably never change - and the content shouldn't change by IP if they don't have a cookie.
1. Check IP
2. Embed their location in a cookie
3. Set cookie
4. If the cookie was accepted and thus exists, personalize.
If the cookie does not exist, do not personalize. You can show a message that says you must accept cookies to get the best experience, but don't make it block any major portion of the content.
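The four steps above could be sketched as server-side logic. This is a minimal illustration under stated assumptions, not a drop-in implementation: the state-to-store map, function names, and cookie shape are all invented for the example.

```python
# Sketch of the suggested flow: personalize only when a location
# cookie already exists; never vary content on IP alone, so crawlers
# and cookie-less visitors all see the same default content.
STORE_BY_STATE = {"WA": "store-vancouver", "OR": "store-portland"}  # hypothetical

def handle_request(ip_state, cookies):
    """Return (store_id, cookie_to_set) for one request.

    ip_state -- state guessed from the IP lookup (may be None)
    cookies  -- dict of cookies the client actually sent back
    """
    cookie_to_set = None
    if "store" in cookies:
        # Step 4: cookie was accepted and exists, so personalize from it.
        store = cookies["store"]
    else:
        # Steps 1-3: check the IP and set a location cookie, but do NOT
        # personalize this response; show the default content instead.
        store = None
        if ip_state in STORE_BY_STATE:
            cookie_to_set = ("store", STORE_BY_STATE[ip_state])
    return store, cookie_to_set
```

On the first visit the default page is served and the cookie is offered; only subsequent requests that return the cookie get store-specific content, which is why Googlebot (which doesn't keep cookies) always sees the default.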
-
Thanks for this. A few clarifications, please:
Isn't having a different journey for a user and a bot cloaking? Will Google not penalise a site for that? To make it clear: we have a single website and, based on the user's geo, we will filter product availability. If a customer is from state A, we will show "X" products, and if a customer is from state B, we will show X+Y or X-Y. All the products will have a canonical URL as part of the sitemap, so even if a product is not navigable through the hierarchy on the website, crawlers will be able to find it through the direct canonical URL.
Here is a link to an article where John Mueller from Google comments on the subject: https://www.seroundtable.com/google-geolocation-redirects-are-okay-26933.html
I have picked excerpts from your reply where I have some doubts; it would be great if you could shed more light on these.
-
- "It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different Meta data) otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others)"
Since we will have the same (X) products across all our stores, and across stores these products will have a single canonical URL, what would be the advantage of having different content by store? We are thinking the content on the product pages will be the same, but the availability of the product alone will differ based on geo. The sitemap will also remain the same across stores, with the canonical product URLs.
-
- "Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
Could you please explain a bit more about what you mean by redirect? All products will exist on the website for a crawler to see if the canonical URL is used for crawling; only the availability and the product's visibility through the navigation journey will change based on geo.
Thank you for your time on this. It's extremely useful.
Thanks - Costa
-
"We are a Grocery co-operative retailer and have chain of stores owned by different people. We are building a new website, where we would geo-locate the closest store to the customer and direct them to a particular store (selected based on cookie and geo location). All our stores have a consistent range of products + Variation in 25% range. I have few questions" - make sure you exempt Googlebot's user-agent from your geo-based redirects, otherwise the crawling of your site will end up in a big horrible mess.
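One way to read that advice is a user-agent check before the geo redirect fires. This is a sketch only; the crawler token list and function name are assumptions, and in production you would want more robust crawler verification than a substring match.

```python
# Hypothetical pre-redirect check: skip the geo redirect for known
# crawler user-agents so every store URL stays directly reachable.
CRAWLER_TOKENS = ("googlebot", "bingbot")  # illustrative, not exhaustive

def should_geo_redirect(user_agent, has_store_cookie):
    """Geo-redirect only real visitors who haven't picked a store yet."""
    ua = (user_agent or "").lower()
    if any(token in ua for token in CRAWLER_TOKENS):
        return False  # crawlers always get the requested canonical page
    return not has_store_cookie  # returning visitors keep their store
```

Note that this follows the thread's suggestion of exempting crawlers from redirects; the safer complementary approach, as discussed above, is to avoid IP-only redirects entirely and personalize from the cookie.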
-
"How to build a site-map. Since it will be mandatory for a store to be selected and same flow for the bot and user, should have all products across all stores in the sitemap? we are allowing users to find any products across all stores if they search by product identifier. But, they will be able to see products available in a particular store if go through the hierarchical journey of the website." - any pages you want Google to index should be in your XML sitemap. Any pages you don't want Google to index should not be in there (period). If a URL uses a canonical tag to point somewhere else (and thus marks itself as non-canonical), it shouldn't be in the XML sitemap. If a URL is blocked via robots.txt or a meta noindex directive, it shouldn't be in the XML sitemap. If a URL results in an error or redirect, it shouldn't be in your XML sitemap. The main thing to concern yourself with is creating a 'seamless' view of indexation for Google. It seems like you'll have to have the same products available in multiple stores. You will want them all indexed, but will have to work hard to differentiate them (different images, different copy, different meta data), otherwise Google will probably pick one product from one store as 'canonical' and not index the rest, leading to unfair product purchasing (users only purchasing X product from Y store, never the others). In reality, setting out to build a site with such highly divergent duplication is never going to yield great results; you'll just have to be aware of that from the outset.
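Following that rule of thumb, a sitemap for this setup would carry exactly one entry per canonical product URL, never one per store. A minimal sketch (the domain and SKU paths are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One canonical URL per SKU, regardless of how many stores stock it -->
  <url>
    <loc>https://www.example.com/products/sku-12345</loc>
  </url>
  <url>
    <loc>https://www.example.com/products/sku-67890</loc>
  </url>
</urlset>
```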
-
"Will the bot crawl all pages across all the stores or since it will be geolocated to only one store, the content belonging to only one store will be indexed?" - No it won't. Every time Google crawls from a different data centre, they will think all your other pages are being redirected now and that part of the site is now closed. Exempt Googlebot's user-agent from your redirects or face Google's fiery wrath when they fail to index anything properly
-
"We are also allowing customers to search for older products which they might have bought few years and that are not part of out catalogue any more. these products will not appear on the online hierarchical journey but, customers will be able to search and find the products . Will this affect our SEO ranking?" - If the pages are orphaned except in the XML sitemap, their rankings will go down over time. It won't necessarily hurt the rest of your site, though. Sometimes crappy results are better than no results at all!
Hope that helps
-