Personalization software and SEO
-
Hi guys,
I'm testing personalization software on our website - basically changing the "location" text depending on the user's IP.
I can see in my software that when Googlebot comes to our site, the personalization software triggers an action changing the location-based text to "California". Can this make Google understand that our website targets only users in California and thereby hurt our rankings in other locations nationwide?
I'd appreciate your opinions.
-
So Mr King, would it be reasonable to say that personalizing all locations but California would keep us out of trouble?
Thanks Mike!
-
Thanks for your insights Dirk.
-
Hi Ana,
Just to clarify - if you redirect based on IP to a location-based URL like /newyork, you can still have links on the page going to the other locations, like /sandiego - so Google can access all these pages and index them. This is not possible in the scenario you mentioned.
Not sure how old the article from Unbounce is, but Googlebot is able to interpret JavaScript (to a certain extent). Using JavaScript won't change the problem - as long as you have only one page that adapts automatically to the IP location, you will be unable to show all versions of the page to Google. It will help your Californian branch, but hurt all the others.
rgds,
Dirk
-
This is great Dirk - thanks so much for your insight as always!
-
Hi Patrick,
If the question had been about country targeting, I guess your answer would have been correct. As mentioned in the article, however, the lowest level of geolocation is country. As the question was about locations "nationwide", I would conclude based on this article that at this point in time Google is unable to detect geo-targeted content based on region or city.
Even for countries I think it's a risky business - the article doesn't indicate whether these "local" bots visit sites with the same frequency and depth as the normal ones, and it doesn't clearly indicate which countries' IPs are used.
It's a different story for languages - because you can indicate in the HTTP headers that the content depends on the user's language. A similar case is dynamic serving for mobile (https://developers.google.com/webmasters/mobile-sites/mobile-seo/configurations/dynamic-serving?hl=en) - there you can indicate that the content changes based on the user agent.
As far as I know, there is no way to indicate in the HTTP headers that the content varies based on IP address.
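To make the difference concrete, here's a small sketch of the response headers a server could send for dynamic serving (the function and the X-Served-Variant header are my own illustration, not a real API) - and why there is no equivalent for IP-based content:

```javascript
// Sketch of the dynamic-serving signal: the server returns different
// HTML at the same URL depending on the user agent, and announces this
// with the "Vary: User-Agent" response header.
function buildDynamicServingHeaders(userAgent) {
  const isMobile = /Mobi|Android/i.test(userAgent);
  return {
    'Content-Type': 'text/html; charset=utf-8',
    // The documented hint that this response depends on the
    // requesting user agent.
    'Vary': 'User-Agent',
    // Illustrative only - not a standard header.
    'X-Served-Variant': isMobile ? 'mobile' : 'desktop',
  };
}
```

Vary can only reference request headers, and the client's IP address is not a request header - which is why there is no comparable way to declare "this content varies by IP".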
rgds,
Dirk
-
Hi both,
Thanks a lot for your ideas and suggestions. No doubt it's a tough subject. I don't really understand Google's position on this: on the one hand they want you to provide a better user experience (which can be done through personalization), and on the other hand they don't seem to provide reasonable solutions to the potential SEO drawbacks.
Dirk, referencing this line of yours - "What you could do is automatically redirect the user to the appropriate page based on ip - but still have the other pages accessible via normal links" - don't you think that if the user is redirected directly to the location-based page, then the Googlebot coming from California (as an example) will also be redirected to it and conclude that the website is targeting California?
I read something at Unbounce regarding dynamic text replacement that caught my attention http://documentation.unbounce.com/hc/en-us/articles/203661004-Dynamic-Text-Replacement-pro-
They say “It's always been possible with Unbounce to do text replacement using a little bit of JavaScript, but unfortunately the bots and crawlers that Google (and other ad networks) use to validate your landing page quality don't read Javascript.”
If it's true that the bots cannot read JavaScript, then using JavaScript for small personalization actions, such as changing the location-based text, may be the solution. I wonder whether this follows Google's guidelines.
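Something like this is what I had in mind - just a sketch with placeholder names, where the page ships with default copy and a small script swaps in the detected city after load:

```javascript
// Minimal sketch of client-side dynamic text replacement as Unbounce
// describes it. The template ships in the HTML with a {city}
// placeholder; a script fills it in after a geo lookup.
function replaceLocationText(template, detectedCity, fallbackCity) {
  // Keep a sensible default so crawlers (and users whose location
  // can't be detected) still see meaningful copy.
  const city = detectedCity || fallbackCity;
  return template.replace(/\{city\}/g, city);
}

// e.g. replaceLocationText('Best pizza in {city}!', 'Boston', 'your area')
//   -> 'Best pizza in Boston!'
```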
Again, I'd appreciate your answers; I'll go through all the links and information and keep investigating. I really need to find some technically supported facts.
Thanks again, Ana
-
Hi Dirk
Thanks for the corrections and examples. I appreciate it and learned something new myself.
Out of curiosity, what do you make of the following: https://support.google.com/webmasters/answer/6144055?hl=en
After reading your explanation, and Google's suggestion in bold and red there, I understand the importance of your recommendation. I was just wondering about your thoughts on this particular article and what you make of it.
Thanks so much again and well done!
-
Hi,
I don't really agree with Patrick's answer. Depending on the level of personalisation you apply, it can hurt your rankings for locations outside California (or whichever other IP location Googlebot crawls from).
As an example - you manage a chain of Chinese restaurants spread around the country and you have the domain mychineserestaurant.com.
If a user accesses the site from New York, they will see the address, picture, phone number, etc. of the New York restaurant. Googlebot, however, will never see this content - the bot will only be able to access the content for your branch in Los Angeles. While this is great for the user experience, there is no way to show Google the other locations, as you are obliged to show the bot the same content as normal human users, and hence show information based on the bot's IP.
The example of Groupon given by Patrick is not exactly the same - they personalise the homepage based on your ip - but if you search for Groupon New York you go to http://www.groupon.com/local/new-york-city
What you could do is automatically redirect the user to the appropriate page based on IP - but still have the other pages accessible via normal links. In the example above, accessing the site from New York I would go by default to mychineserestaurant.com/newyork, but with the option to change the location. This way Googlebot would be able to crawl all the locations. It's also the advice coming from Matt Cutts: https://www.mattcutts.com/blog/give-each-store-a-url/
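A rough sketch of that pattern (the paths and the region value are placeholders; the IP-to-region lookup itself - e.g. via MaxMind - is outside this sketch):

```javascript
// Each location gets its own crawlable URL; the IP-based redirect is
// only a convenience for first-time visitors.
const LOCATION_PATHS = {
  'new-york': '/newyork',
  'los-angeles': '/losangeles',
  'san-diego': '/sandiego',
};

// Pick the redirect target for a region derived from the visitor's IP.
function redirectTargetFor(region) {
  // Fall back to a neutral location-chooser page when the region is
  // unknown or has no dedicated page, rather than guessing.
  return LOCATION_PATHS[region] || '/locations';
}

// Every location page should also be linked in plain HTML, so the
// crawler can reach all of them regardless of its own IP.
function locationLinksHtml() {
  return Object.values(LOCATION_PATHS)
    .map((path) => `<a href="${path}">${path}</a>`)
    .join('\n');
}
```

The important part is the second function: whatever the redirect does, all location URLs stay reachable through ordinary links.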
If the personalisation is only minor (example only the local address on the homepage) and if you already have targeted pages for each location it should not really be a problem.
To be honest, this is more my own opinion than something supported by hard facts.
Hope this helps,
Dirk
-
Hi there
Don't worry about this - it shouldn't be an issue. One thing you can do is set a geographic target for your website in Webmaster Tools if you're looking to reach specific regions.
Amazon and Groupon have personalization happening on their sites as well - but that doesn't affect their rankings.
I would also take a look at:
SEO in the Personalization Age
Let me know if this helps at all!