Geo-targeting Content Based on IP Address?
-
What are the benefits and disadvantages of geo-targeting content based on IP address? A client is interested in serving different content on their homepage based on what area the user is coming from. This seems like an SEO nightmare to me, as search engine spiders could potentially see different content depending on when they visit. Are there best practices here? Or is it frowned upon with regard to SEO? Any information would be helpful.
-
Check out this discussion first: http://www.seomoz.org/q/geo-targeting-by-ip-address
There's nothing wrong with dynamic targeting if it's done right: http://www.seomoz.org/blog/keyword-level-demographics <-- That post is about persona-targeting.
Be aware that for many sites, geo-targeted content is not on the homepage per se, but on a landing page built for that purpose. That doesn't mean it's the only way to go, but that's where you'll find more examples. It's probably best to do some multivariate testing before going all in on it...
If the client is doing this to improve user experience, there's nothing wrong with it, but yes, it could have unexpected effects on your results in local search.
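As a rough illustration of the "done right" approach, here's a minimal Python sketch of the server-side logic. Everything in it is made up for the example (the IP ranges are documentation-only prefixes, the content strings are placeholders, and a real site would use a GeoIP database rather than a hard-coded map); the point is that unmatched visitors, including crawlers, always fall back to one stable default variant:

```python
# Minimal sketch of IP-based content selection with a stable fallback.
# The ranges and strings below are illustrative only; production sites
# would resolve region via a GeoIP database, not a hard-coded table.
import ipaddress

# Hypothetical region -> network mapping (RFC 5737 documentation ranges).
REGION_RANGES = {
    "us": ipaddress.ip_network("203.0.113.0/24"),
    "au": ipaddress.ip_network("198.51.100.0/24"),
}

# Placeholder content variants, plus one default everyone else sees.
CONTENT = {
    "default": "Welcome! Browse our full catalogue.",
    "us": "Free shipping across the United States.",
    "au": "Now shipping from our Sydney warehouse.",
}

def content_for_ip(ip: str) -> str:
    """Pick a regional variant; unmatched IPs (and crawlers) get the default."""
    addr = ipaddress.ip_address(ip)
    for region, network in REGION_RANGES.items():
        if addr in network:
            return CONTENT[region]
    return CONTENT["default"]
```

The design choice to note: because the fallback is a real, complete page rather than an error or an empty shell, a spider arriving from an unmapped IP still sees consistent, indexable content.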
Related Questions
-
How to solve JavaScript paginated content for SEO
On our blog listings page, we limit the number of posts that can be seen on the page to 10. However, all of the posts are loaded in the HTML of the page, and page links are added to the bottom. Example page: https://tulanehealthcare.com/about/newsroom/ When a user clicks the next page, it simply filters the content on the same page to the next group of postings and displays those to the user. Nothing in the HTML or URL changes; this is all done via JavaScript. So the question is: does Google consider this hidden content, because all listings are in the HTML but only a handful are shown on the page? Or is Googlebot smart enough to know that the content is being filtered by JavaScript pagination? If this is indeed a problem, we have two possible solutions: (1) not building the HTML for the next pages until the user clicks 'next', or (2) adding parameters to the URL to show the content has changed. Are there any other solutions that would be better for SEO?
Intermediate & Advanced SEO | MJTrevens1 -
Geo-target .ag domain?
Hi guys, we are looking to purchase a .ag domain for an agriculture website. We want to target two countries: Australia (primary) and the United States. The main site, e.g. www.farming.ag, will target Australia, while the www.farming.ag/us/ sub-folder will target the United States. From my understanding after reading this: https://www.name.com/domains/ag, .ag is the country code for Antigua and Barbuda. So I was wondering: can you even geo-target the site to Australia, or the sub-folder to the United States, in Search Console? Any advice would be very much appreciated! Cheers.
Intermediate & Advanced SEO | bridhard80 -
Geo-Targeted Sub-Domains & Duplicate Content/Canonical
For background, the sub-domain structure here is inherited and committed to, due to tech restrictions with some of our platforms. The brand I work with is splitting out their global site into regional sub-sites (not too relevant, but this is in order to display seasonal product in different hemispheres and to link to stores specific to the region). All sub-domains except EU will be geo-targeted to their relevant country. Regions and sub-domains for reference: AU (Australia), CA (Canada), CH (Switzerland), EU (all Eurozone countries), NZ (New Zealand), US (United States). This will be done with WordPress multisite. The setup allows us to publish content on one 'master' sub-site and then decide which other sub-sites to 'broadcast' to. Some content is specific to a sub-domain/region, so there's no issue with duplicates and we can set that sub-site version as canonical. However, some content will appear on all sub-domains: au.example.com/awesome-content/ nz.example.com/awesome-content/ Now, the first question: since these domains are geo-targeted, should I just have them all canonical to the version on that sub-domain? Or should I still signal the duplicate content with one canonical version? Essentially, the top-level example.com exists as a site only for publishing purposes; if a user lands on the top-level example.com/awesome-content/ they are given a pop-up to select a region and redirected to the relevant sub-domain version. So I'm also unsure whether I want that content indexed at all. I could make the top-level example.com versions of all content the canonical that all others point to, and rely on geo-targeting to have the right links show in the right search locations. I hope that's kind of clear? Obviously I find it confusing and therefore hard to relay! Any feedback at all gratefully received. Cheers, Steve
Intermediate & Advanced SEO | SteveHoney0 -
Should I keep looking at the IP address location of my backlink prospects?
Hello everyone, I'm doing backlink research and we are considering how to define a strategy. My website is located in Spain and I am focusing on ranking it on Google.es for an audience coming from Spain. Until now I have looked carefully at the IP address of my backlink prospects, to check whether each website was really hosted in Spain. I know Matt Cutts said long ago that the hosting location of a website was important. But nowadays, with cloud servers, CDNs, anycast and similar technologies, it's no longer possible to accurately identify the location of a website, especially if the site's CDN uses anycast, where one IP address can be served by different machines in different locations. Do you guys think I should keep looking at the IP address location of my backlink prospects?
Intermediate & Advanced SEO | C.A0 -
Duplicate Content for Deep Pages
Hey guys, for deep, deep pages on a website, does duplicate content matter? The pages I'm talking about are image pages associated with products and will never rank in Google, which doesn't concern me. What I'm interested to know, though, is whether the duplicate content would have an overall effect on the site as a whole? Thanks in advance, Paul
Intermediate & Advanced SEO | kevinliao1 -
What is best practice to eliminate my IP-address content from showing in SERPs?
Our eCommerce platform provider has our site load-balanced across a few data centers. Our site has two of our own exclusive IP addresses associated with it (one in each data center). The problem is that Google is showing our IP addresses in the SERPs with what I would assume is bad duplicate content (our own, at that). I brought this to the attention of our provider and they say they must keep the IP addresses open to allow their site monitoring software to work. Their solution was to add robots.txt files for both IP addresses with site-wide/root disallows. As a side note, we just added canonical tags, so the pages indexed under the IP addresses ultimately show the correct (non-IP-address) URL via the canonical. So here are my questions: Is there a better way? If not, is there anything else we need to do to get Google to drop the several hundred thousand pages indexed at the IP-address level? Or do we sit back and wait now?
Intermediate & Advanced SEO | ovenbird0 -
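A commonly suggested alternative for the "is there a better way" part of the question above is a host-level 301 rather than robots.txt disallows, since a disallow blocks crawling without consolidating the already-indexed URLs. Here's a hedged Python sketch of that logic (the hostname is a placeholder, the IPs are documentation addresses, and it assumes the provider's monitoring endpoint could be excluded from the redirect):

```python
# Hypothetical sketch: answer any request that arrives under a raw IP
# (or any other non-canonical host) with a 301 to the canonical hostname,
# so search engines can consolidate the duplicate IP-address URLs.
CANONICAL_HOST = "www.example.com"  # placeholder for the real domain

def canonical_redirect(host: str, path: str):
    """Return (301, location) if the request should redirect, else None."""
    if host.lower() != CANONICAL_HOST:
        return 301, f"https://{CANONICAL_HOST}{path}"
    return None  # already on the canonical host; serve the page normally
```

In practice this would live in the load balancer or web-server config rather than application code; the sketch just shows the decision being made per request.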
Is it better to use geo-targeted keywords or add the locations as separate keywords?
For example... state keyword (nyc real estate) or keyword, state (nyc, real estate) = 2 keywords Thanks in advance!
Intermediate & Advanced SEO | Cyclone0 -
Duplicate content for area listings
Hi, I was slightly affected by the Panda update on the 14th of October, generally dropping by about 5-8 spots in the SERPs for my main keywords; since then I've been giving my site a good looking-over. On one site I've got city-listing URLs for certain widget companies; the thing is, many areas, and thus many URLs, will have the same company listed. What would be the best way of solving this duplicate content as Google may be seeing it? I was thinking of one page per company, prominently listing the areas they operate in, so I'd still hopefully rank for area searches. But I'd be losing the city names in the URL as I have them now, for example: mywidgetsite.com/findmagicwidgets/new-york.html mywidgetsite.com/findmagicwidgets/atlanta.html Any ideas on how best to proceed? Cheers!
Intermediate & Advanced SEO | NetGeek0