Personalization software and SEO
-
Hi guys,
I'm testing personalization software on our website, basically changing the "location" text depending on the user's IP.
I can see in my software that when Googlebot comes to our site, the personalization software triggers an action that changes the location-based text to "California". Could this make Google think that our website targets only users in California and thereby hurt our rankings in other locations nationwide?
I'd appreciate your opinions.
-
So Mr King, would it be reasonable to say that personalizing all locations but California would keep us out of trouble?
Thanks Mike!
-
Thanks for your insights Dirk.
-
Hi Ana,
Just to clarify - if you redirect based on IP to a location-based URL like /newyork, you can still have a link on the page going to the other locations like /sandiego - so Google can access all these pages & index them. This is not possible in the scenario you mentioned.
Not sure how old the Unbounce article is, but Googlebot is able to interpret JavaScript (to a certain extent). Using JavaScript won't change the problem - as long as you have only one page that adapts automatically to the IP location, you will be unable to show all versions of the page to Google - it will help your Californian branch but hurt all the others.
rgds,
Dirk
-
This is great Dirk - thanks so much for your insight as always!
-
Hi Patrick,
If the question had been about country targeting, I guess your answer would have been correct. As mentioned in the article, however, the lowest level of geolocation is the country. Since the question was about locations "nationwide", I would conclude based on this article that at this point in time Google is unable to detect geo-targeted content based on region or city.
Even for countries I think it's a risky business - the article doesn't indicate whether these "local" bots visit sites with the same frequency & depth as the normal ones, and it doesn't clearly indicate which country IPs are used.
It's a different story for languages - because you can indicate in the HTTP headers that the content depends on the user's language. A similar case is dynamic serving for mobile (https://developers.google.com/webmasters/mobile-sites/mobile-seo/configurations/dynamic-serving?hl=en) - here you can indicate that the content changes based on the user agent.
As far as I know, there is no way to indicate in the HTTP headers that the content varies based on IP address.
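To make the dynamic-serving comparison concrete, here is a minimal sketch of what that pattern looks like in Node with Express (the framework and the isMobileUA helper are just assumptions for illustration): the Vary header is the documented way to flag that a URL's content changes with the user agent, and there is no equivalent value for "content changes with the client IP".

```javascript
// Minimal sketch of the dynamic-serving pattern: one URL, different HTML per
// user agent, with the Vary header telling crawlers and caches that the
// response depends on User-Agent. Express and isMobileUA are illustrative only.
const express = require('express');
const app = express();

// Hypothetical helper: naive mobile detection, for illustration only.
function isMobileUA(userAgent) {
  return /Mobi|Android|iPhone/i.test(userAgent || '');
}

app.get('/', (req, res) => {
  // The key signal: this URL's response varies by user agent.
  res.set('Vary', 'User-Agent');

  if (isMobileUA(req.get('User-Agent'))) {
    res.send('<html><body>Mobile version of the page</body></html>');
  } else {
    res.send('<html><body>Desktop version of the page</body></html>');
  }
  // Note: there is no comparable, documented header value meaning
  // "this response varies by client IP", which is the gap described above.
});

app.listen(3000);
```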
rgds,
Dirk
-
Hi both,
Thanks a lot for your ideas and suggestions. No doubt it's a tough subject. I don't really understand Google's position on this: on the one hand they want you to provide a better user experience (which can be done through personalization), and on the other hand they don't seem to provide reasonable solutions to the potential SEO drawbacks.
Dirk, referencing this line of yours, "What you could do is automatically redirect the user to the appropriate page based on ip - but still have the other pages accessible via normal links": don't you think that if the user is redirected straight to the location-based page, then Googlebot coming from California (as an example) will also be redirected to it and conclude that the website is targeting California?
I read something on Unbounce regarding dynamic text replacement that caught my attention: http://documentation.unbounce.com/hc/en-us/articles/203661004-Dynamic-Text-Replacement-pro-
They say “It's always been possible with Unbounce to do text replacement using a little bit of JavaScript, but unfortunately the bots and crawlers that Google (and other ad networks) use to validate your landing page quality don't read Javascript.”
If it's true that the bots cannot read JavaScript, then maybe using JavaScript for small personalization actions, such as changing the location-based text, is the solution. I wonder whether this follows Google's guidelines or not.
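For reference, this kind of dynamic text replacement is usually just a small client-side script along these lines (a rough sketch only - the element ID and the ?loc= parameter are made up for illustration and are not Unbounce's actual implementation):

```javascript
// Rough sketch of client-side dynamic text replacement: after the page loads,
// swap the text of a placeholder element. The element ID and the data source
// (?loc= query parameter) are hypothetical, purely for illustration.
document.addEventListener('DOMContentLoaded', () => {
  const placeholder = document.getElementById('location-text'); // assumed element
  if (!placeholder) return;

  // Example source of the visitor's location: a ?loc= query parameter.
  // A real setup might call an IP-geolocation service instead.
  const location = new URLSearchParams(window.location.search).get('loc');

  if (location) {
    placeholder.textContent = location; // e.g. "New York" instead of the default text
  }
});
```

Whether Googlebot renders and indexes the swapped text (and from which location) is exactly the uncertainty I'm asking about.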
Again, I'd appreciate your answers; I'll go through all the links and information and keep investigating. I really need to find some technically supported facts.
Thanks again, Ana
-
Hi Dirk
Thanks for the corrections and examples here. I appreciate it and learned something new myself.
Out of curiosity, what do you make of the following: https://support.google.com/webmasters/answer/6144055?hl=en
After reading your explanation, and Google's suggestion in bold and red there, I understand the importance of your recommendation. I was just wondering about your thoughts on this particular article and what you make of it.
Thanks so much again and well done!
-
Hi,
I don't really agree with Patrick's answer. Depending on the level of personalisation you apply, it can hurt your rankings for locations outside California (or whichever other location Googlebot's IP happens to resolve to).
As an example - you manage a chain of Chinese restaurants spread around the country and you have the domain mychineserestaurant.com.
If a user accesses the site directly in New York, he will see the address, picture, phone number, etc. of the New York restaurant. Googlebot, however, will never see this content - the bot will only be able to access the content for your branch in Los Angeles. While this is great for the user experience, there is no way to show Google the other locations, as you are obliged to show the bot the same content as normal human users, and hence show it the information based on the bot's IP.
The Groupon example given by Patrick is not exactly the same - they personalise the homepage based on your IP - but if you search for "Groupon New York" you go to http://www.groupon.com/local/new-york-city
What you could do is automatically redirect the user to the appropriate page based on IP - but still have the other pages accessible via normal links. In the example above, when accessing the site in New York I would go by default to mychineserestaurant.com/newyork, but with the option to change the location. This way Googlebot would be able to crawl all the locations. It's also the advice from Matt Cutts: https://www.mattcutts.com/blog/give-each-store-a-url/
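As a rough sketch of that setup (Node with Express assumed here, and cityFromIp is a hypothetical placeholder for whatever geo-IP lookup you would actually use): the homepage redirects by IP, but every location keeps its own crawlable URL with normal links to the others, so Googlebot can reach them all regardless of its own IP.

```javascript
// Rough sketch: redirect visitors to their local page based on IP, while every
// location keeps its own URL that is reachable through plain links.
// Express and cityFromIp are assumptions for illustration only.
const express = require('express');
const app = express();

// Hypothetical geo-IP lookup: a real implementation would query a
// geo-IP database or API and map the result to a location slug.
function cityFromIp(ip) {
  return null; // e.g. 'newyork', 'sandiego', or null when unknown
}

// Homepage: redirect to the visitor's local page when we can detect it,
// otherwise show a plain list of links to all locations.
app.get('/', (req, res) => {
  const city = cityFromIp(req.ip);
  if (city) {
    return res.redirect(302, '/' + city);
  }
  res.send('<a href="/newyork">New York</a> <a href="/sandiego">San Diego</a>');
});

// Each location has its own indexable URL, with links to the other locations,
// so Googlebot can crawl all of them no matter which IP it crawls from.
app.get('/:city', (req, res) => {
  res.send('Content for ' + req.params.city + ', plus links to the other locations');
});

app.listen(3000);
```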
If the personalisation is only minor (for example, only the local address on the homepage) and you already have targeted pages for each location, it should not really be a problem.
To be honest, this is more my own opinion than something supported by hard facts.
Hope this helps,
Dirk
-
Hi there
Don't worry about this; it shouldn't be an issue. One thing you can do is set a geographic target for your website in Webmaster Tools if you're looking to target specific regions.
Amazon and Groupon have personalization happening on their sites as well - but that doesn't affect their rankings.
I would also take a look at:
SEO in the Personalization Age
Let me know if this helps at all!