Is it OK to dynamically serve different content to paid and non-paid traffic from the same URL?
-
Hi Moz!
We're trying to serve different content to paid and non-paid visitors from the same URL. Is this black hat?
Here's the reason we want to do this -- we're testing a theory that paid ads boost organic rankings. This is something we saw happen with a client, and we want to test it further. But paid traffic needs a different UX that's sparser and converts better.
Thanks for reading!
-
Hi David,
First of all, as far as I know, paid campaigns don't help organic rankings. Google has repeatedly said that paid campaigns don't affect organic rankings.
Google says that showing one version of a page to users and another version to bots is cloaking, and that we must not do it, but it hasn't said anything specific about paid vs. non-paid visitors.
If a paid campaign could affect organic rankings at all, the only mechanism I can think of would be CTR.
I've run AdWords campaigns for my website for over 8 years with a CTR of at least 10%, and I've never noticed a paid campaign helping rankings.
**I wouldn't suggest you do that.**
Please also check this: https://support.google.com/adwordspolicy/answer/6020954?hl=en&rd=1#701
Hope this helps you.
Thanks
-
Hi Highland,
Thanks for the quick response!
I wasn't clear in my question. The paid visitors I'm referring to are visitors coming through search ads, not people who are subscribed to our service. So this isn't regarding paywalls. Rather, we're trying to send paid traffic to a page to see if it will increase its rankings. At the same time, we want to have a different user experience for paid and non-paid visitors to increase conversions.
Also, we'd like the two versions to have different content, not just one version with little or no content and the other with all of it.
-
Google's rule on paywalls is that you have to offer searchers some content for free. So, for instance, if you search for a NY Times article and click through, they'll tell you how many free articles you have left before you have to pay. Google, of course, can see all of that content.
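Google's documented way of flagging gated content like that (so the full text Googlebot crawls isn't mistaken for cloaking) is paywall structured data. A sketch of the JSON-LD markup Google describes for metered/paywalled articles -- the headline and CSS class here are illustrative:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example metered article",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".paywalled-section"
  }
}
</script>
```

The `cssSelector` tells Google which part of the page is behind the paywall, so it can distinguish a legitimate paywall from cloaking.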
Search Engine Land has a good article on the paywall implications.
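As for serving a conversion-focused layout to ad clicks without cloaking: a common pattern is to key the layout off the ad's tracking parameter rather than off who the visitor is, so anyone (or any crawler) hitting the same final URL sees the same thing. A minimal sketch, assuming your ad destination URLs carry utm_medium=cpc -- the parameter and template names here are illustrative, not anything Moz or Google prescribes:

```python
from urllib.parse import urlparse, parse_qs

def pick_template(url: str) -> str:
    """Choose a page template based on the ad tracking parameter.

    Visitors arriving from paid search ads (utm_medium=cpc) get the
    sparse, conversion-focused layout; everyone else -- crawlers
    included -- gets the full-content default.
    """
    params = parse_qs(urlparse(url).query)
    if params.get("utm_medium", [""])[0] == "cpc":
        return "landing_sparse.html"
    return "landing_full.html"
```

Since the sparse version lives behind a query parameter, you'd also want it to canonicalize to the parameter-free URL so only one version gets indexed.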
Related Questions
-
Google Only Indexing Canonical Root URL Instead of Specified URL Parameters
We just launched a website about 1 month ago and noticed that Google was indexing, but not displaying, URLs with "?location=" parameters such as: http://www.castlemap.com/local-house-values/?location=great-falls-virginia and http://www.castlemap.com/local-house-values/?location=mclean-virginia. Instead, Google has only been displaying our root URL http://www.castlemap.com/local-house-values/ in its search results -- which we don't want, as the URLs with specific locations are more important, and each has its own unique list of houses for sale.

We have Yoast set up with all of these ?location values added to our sitemap, which has successfully been submitted to Google's Sitemaps: http://www.castlemap.com/buy-location-sitemap.xml. I also tried going into the old Google Search Console and setting the "location" URL parameter to Crawl Every URL with the Specified Effect enabled... and I even see the two URLs I mentioned above in Google's list of Parameter Samples... but the pages are still not being added to Google.

Even after requesting indexing again after making all of these changes a few days ago, these URLs are still displaying as Allowing Indexing, but Not On Google in the Search Console, and they don't show up on Google when I manually search for the entire URL. Why are these pages not showing up on Google, and how can we get them to display?

The only solution I can think of would be to set our main /local-house-values/ page to noindex in order to have Google favor all of our other URL parameter versions... but I'm guessing that's probably not a good solution, for multiple reasons.
Intermediate & Advanced SEO | Nitruc0
-
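One thing worth checking in a situation like the question above: if the theme or an SEO plugin stamps every ?location= URL with a canonical pointing at the bare /local-house-values/ URL, Google will collapse them exactly as described. Each parameter page that should rank needs a self-referencing canonical -- a sketch, using a URL from the question:

```html
<!-- In the <head> of /local-house-values/?location=great-falls-virginia -->
<link rel="canonical"
      href="http://www.castlemap.com/local-house-values/?location=great-falls-virginia" />
```

If the canonical on that page instead points at /local-house-values/, that alone would explain Google indexing but not displaying the parameter URLs.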
Similar product descriptions but with different URLs
I asked this question before and was not fully satisfied with the answer. We sell adhesives, and some of the products have the same name and description; the only thing that separates them is the width of the roll.

Our old/current setup is as follows: each product has its own product page with more or less the same description, for example here http://siga-sverige.se/siga/fentrim-2-100/ and here http://siga-sverige.se/siga/fentrim-2-150/. The above product pages are for a product called Fentrim 2. It's available in widths from 75 to 300 mm, so that's six different product pages with more or less the same description. The other variations of the product, besides the width, are Fentrim 20, Fentrim IS 2 and Fentrim IS 20. So this gives us:

6 x Fentrim 20 product pages with the same description, just the width changing.
6 x Fentrim 2 product pages with the same description, just the width changing.
6 x Fentrim IS 20 product pages with the same description, just the width changing.
6 x Fentrim IS 2 product pages with the same description, just the width changing.

I get that this can cause us problems in terms of duplicate content. The plan we have now is to have four different product pages with variations instead. For each of those four product pages we have well-written, unique content, and we would 301 redirect the old ones to them, like this:

http://siga-sverige.se/siga/fentrim-2
http://siga-sverige.se/siga/fentrim-20
http://siga-sverige.se/siga/fentrim-IS-2
http://siga-sverige.se/siga/fentrim-IS-20

Today we gain traffic from one product page per variation, and it seems that Google has picked those ones out randomly; see the attached screenshot. Will we lose rank? Will this increase our position? What are your ideas? // Jonas
Intermediate & Advanced SEO | knubbz0
-
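The consolidation plan in the question above (six width variants folded into one variant page) is typically wired up with per-URL 301s at the server level. A minimal Apache sketch, assuming the site runs Apache with mod_alias available, and using the URLs from the question:

```apache
# .htaccess: fold the width-specific Fentrim 2 pages into the consolidated page
Redirect 301 /siga/fentrim-2-100 /siga/fentrim-2
Redirect 301 /siga/fentrim-2-150 /siga/fentrim-2
# ...repeat for the remaining widths, and likewise for the
# Fentrim 20, Fentrim IS 2 and Fentrim IS 20 families
```

One redirect per old URL keeps each hop a single 301, which preserves as much of the existing link equity as possible.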
Website Re-Launch - New URLs / Old URL in WMT
Hello... We recently re-launched our website on a new CMS (Magento). We kept the same domain name, but most of the URL structure changed. We were diligent about putting in the 301 redirects. The domain is over 15 years old and has tons of link equity and history. Today marks 27 days since launch, and Google Webmaster Tools showed me a recently detected (dated two days ago) URL from the old structure. Our natural search traffic has taken a slow dive since launch... Any thoughts? Some background info: the old site did not have a sitemap.xml; the relaunched site does. Thanks!
Intermediate & Advanced SEO | 19prince0
-
Should we show (to Google) different city pages that look like the home page as one page or as different pages? If so, how?
On our website, we show events from different cities. We have made different URLs for each city, like www.townscript.com/mumbai and www.townscript.com/delhi, but the pages for all the cities look similar; only the events change between the city pages. Even our home URL, www.townscript.com, shows visitors the city they last visited on our website (initially we show everyone Mumbai, and the visitor then needs to choose their city). For every page visit, we save the last visited page for a particular IP address, and the next time that visitor comes to www.townscript.com, we show them that city only. Now, we feel the content of the home page and the city pages is similar. Should we present these pages to Google as one page, i.e. Townscript.com? Can we do that with rel="canonical"? Please help me! I think all of these pages are competing with each other.
Intermediate & Advanced SEO | sanchitmalik0
-
Different rankings for the same keyword in different geo locations
I am listed at #11 in google.com.pk for a keyword, but listed 4th when I use &gl=us in the same query or use a US IP address. Could this be because of Google's recent update -- maybe they are taking time to push the update out to other countries -- or something else? In any case, I see decent rankings when I use the query on google.com or with gl=us, but lower-quality results on google.com.pk, so please advise on what is going on. Thanks in advance for any guidance (y)
Intermediate & Advanced SEO | hpk0
-
SEO implications of serving a different site on HTTPS vs. HTTP
I have two sites: Site A and Site B. Both sites are hosted on the same IP address and served using IIS 7.5. Site B has an SSL cert, and Site A does not. It has recently been brought to my attention that when requesting the HTTPS version of Site A (the site w/o an SSL cert), IIS will serve Site B... Our server has been configured this way for roughly a year.

We don't do any promotion of Site A using HTTPS URLs, though I suppose somebody could accidentally link to or type in HTTPS and get the wrong website. Until we can upgrade to IIS 8 / Windows Server 2012 to support SNI, it seems I have two reasonable options:

1. Move Site B over to its own dedicated IP, and let HTTPS requests for Site A 404.
2. Get another certificate for Site A, and have its HTTPS version 301 redirect to HTTP/non-SSL.

#1 seems preferable, as we don't really need an SSL cert for Site A, and HTTPS doesn't really have any SEO benefits over HTTP/non-SSL. However, I'm concerned we may have done SEO damage to Site A by letting our configuration sit this way for so long. I could see Googlebot trying HTTPS versions of websites to test whether they exist, even if there aren't any SSL/HTTPS links for the given domain in the wild... in which case, option #2 would seem to mostly reverse any damage done (if any). Though Site A seems to be indexed fine -- no concerns other than my gut. Does anybody have any recommendations? Thanks!
Intermediate & Advanced SEO | dsbud0
-
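Option #2 in the question above (301 all HTTPS requests for Site A back to HTTP) can be expressed as a rewrite rule in Site A's web.config once a certificate is bound to it. A sketch, assuming the IIS URL Rewrite module is installed:

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Permanently redirect any HTTPS request to its HTTP equivalent -->
      <rule name="https-to-http" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <add input="{HTTPS}" pattern="on" />
        </conditions>
        <action type="Redirect" url="http://{HTTP_HOST}/{R:1}"
                redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

The `{HTTPS}` server variable is "on" only for SSL requests, so plain HTTP traffic passes through untouched.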
What do I do about sites that copy my content?
I've noticed that there are a number of websites copying my content. They are putting the full article on their site and mentioning that it was reposted from my site, but with no link back to me. How should I approach this? What are my rights? Should I ask them to remove it or add a link? Will the duplicate content affect me?
Intermediate & Advanced SEO | JohnPeters0
-
Should I 301 poorly worded URLs which are indexed and driving traffic?
Hi, I'm working on our site's structure and SEO at present, and I'm wondering when the benefit I may get from a well-written URL, i.e. ourDomain / keyword or keyphrase .html, would outweigh the downturn in traffic I may see from 301 redirecting an existing, not as well structured, but indexed URL. We have a number of odd-looking URLs, i.e. ourDomain / ourDomain_keyword_92.html, alongside some others that have a keyword followed by 20 underscores in a long line...

My concern is that although I would like a keyword or key phrase sitting on its own in a well-targeted URL string, I don't want to mess too much with pages that are driving, say, 2% or 3% of our traffic just because my OCD has kicked in...

Some further advice on strategies I could use would be great. My current thinking is that if a page is performing well, I should leave its URL alone. Then, if I'm not 100% happy with the keyword or phrase it is targeting, I could build another page to handle the new keyword/phrase, with the aim of that page moving up the rankings and eventually taking over from where the other page left off.

Any advice is much appreciated, Guy
Intermediate & Advanced SEO | guycampbell0