What are the SEO recommendations for dynamic, personalised page content? (not e-commerce)
-
Hi,
We will have pages on the website that will display different page copy and images for different user personas. The main content (copy, headings, images) will be supplied dynamically and I'm not sure how Google will index the B and C variations of these pages.
As far as I know, the page URL won't change and won't have parameters.
Google will crawl and index page content rendered from JavaScript, but I don't know which version of the page copy the search robot will index. If we set user agent filters and serve the default page copy to search robots, we risk a cloaking penalty, because users would get different content than search robots.
Is it better to have URL parameters for version B and C of the content? For example:
- /page for the default content
- /page?id=2 for the B version
- /page?id=3 for the C version
The dynamic content comes from the server side, so not all page copy variations are present in the default HTML.
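For illustration only (the variant names and copy are hypothetical), a server-side handler for the parameter scheme above might be sketched like this: the `id` parameter selects the variant, and any unknown or missing value falls back to the default copy, so the parameter-free URL always serves the default version.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical copy variants keyed by the ?id= parameter value.
VARIANTS = {
    "2": "B version of the page copy",
    "3": "C version of the page copy",
}
DEFAULT_COPY = "Default page copy"

def select_copy(url: str) -> str:
    """Return the copy variant for a requested URL, falling back to the default."""
    params = parse_qs(urlparse(url).query)
    variant_id = params.get("id", [None])[0]
    return VARIANTS.get(variant_id, DEFAULT_COPY)
```

Because the fallback is the default copy, a crawler that only ever requests `/page` sees the same content as any first-time visitor, which avoids the cloaking concern.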
I hope my questions make sense. I couldn't find recommendations for this kind of SEO issue.
-
Hi everyone,
I have a related question about personalisation, a variation on the same theme, which I would appreciate some help with.
There is a project afoot within my company to "personalise" the user experience by presenting pages to users which better respond to their interests.
That is to say that, when a user visits our page about "tennis-shoes", the next time they visit the homepage they will be presented with a homepage which focusses on tennis-shoes.
So far so good.
However, rather than personalising certain elements of the homepage, the idea is to intercept those users and 301 them to an entirely different URL, completely hidden from Google, which will contain entirely different content focussing only on shoes.
The top navigation will remain the same.
This sounds like a massive breach of the Quality Guidelines on at least two counts to me. It reeks of cloaking and "sneaky redirects", and I am very concerned this will do us far more harm than good.
I'm guessing that the correct way of going about this would be either to build a great "shoes" page and let users navigate to it and do whatever they want with it, or to personalise the homepage with some dynamic elements on the same URL, without hiding things from Google or frustrating users by blocking the page they are actually trying to reach.
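A minimal sketch of that second option, assuming a hypothetical `last_category` cookie set when a user views a category: only one homepage module changes per visitor, and everything is served from the same URL. Googlebot sends no cookies, so it always receives the default version with no redirect and no hidden pages.

```python
# Hypothetical homepage hero modules; only the featured one changes per visitor.
MODULES = {
    "default": "General storefront hero",
    "tennis-shoes": "Tennis shoes hero",
}

def homepage_hero(cookies: dict) -> str:
    """Feature the visitor's last-viewed category if we recognise it.
    A cookieless request (e.g. Googlebot or a first-time visitor) gets
    the default hero, served from the same homepage URL as everyone else."""
    interest = cookies.get("last_category", "default")
    return MODULES.get(interest, MODULES["default"])
```

The key design choice is that personalisation degrades gracefully to the default content rather than redirecting anywhere.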
Any feedback from the community would be a great help.
Thanks a lot!
-
Brilliant thread guys!
This will be far more discussed in the not-so-distant future, I'm sure!
Dynamic homepages are becoming more common, and I have a client using one, so this info has really helped me.
This topic should be a future Whiteboard Friday.
-
Yes, that sounds great! Please let me know how it all goes and if you run into any other hiccups.
Cheers,
B
-
Hi Britney
Thank you for your detailed feedback!
I checked the posts you linked and a few other sources and I think the solution will be the following:
- The default content will be loaded at the parameter-free URL, e.g. /product
- Personalised versions of the page will have different (short) parameters, e.g. /product?version=8372762
- The default and the personalised pages will have the same canonical tag (default page)
- Let Google know in the Search Console's URL Parameters settings that the version parameter changes the page content (specifies + let Googlebot decide)
I hope it makes sense.
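The canonical step in the list above can be sketched as follows (a minimal illustration, assuming `version` is the only personalisation parameter): strip the parameter so that both `/product` and `/product?version=8372762` emit the same canonical tag pointing at the default page.

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def canonical_tag(url: str, drop_params: tuple = ("version",)) -> str:
    """Build the <link rel="canonical"> tag for a page by stripping the
    personalisation parameter, so the default and personalised URLs
    both declare the default page as canonical."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in drop_params]
    canonical = urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))
    return f'<link rel="canonical" href="{canonical}">'
```

Any other query parameters you want Google to treat as distinct pages would simply be left out of `drop_params`.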
-
Did some digging and found a few resources stating:
Google had an official statement about this in its webmaster guidelines:
"If you decide to use dynamic pages (i.e., the URL contains a ? character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few. Don't use &id= as a parameter in your URLs, as we don't include these pages in our index."
That was many years ago, but more recently Google changed its position on the subject. The entry has been removed from Google's guidelines, but here's the official statement from Google's blog:
"Google now indexes URLs that contain the &id= parameter. So if your site uses a dynamic structure that generates it, don't worry about rewriting it -- we'll accept it just fine as is. Keep in mind, however, that dynamic URLs with a large number of parameters may still be problematic for search engine crawlers in general, so rewriting dynamic URLs into user-friendly versions is always a good practice when that option is available to you. If you can, keeping the number of URL parameters to one or two may make it more likely that search engines will crawl your dynamic urls."
Click here to read the full article
Penalization for personalisation
Let me know if this helps
-
Fascinating question Gyorgy!
I've always been a big fan of dynamic targeting.
It would be a great idea to have different URL parameters for each unique set of content. You might also want to submit these pages via Fetch & Index in Google Search Console (and include them in your sitemap.xml to show you're not attempting to cloak, etc.)
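On the sitemap point, one hedged sketch of the approach (URLs are placeholders): list only the parameter-free canonical URLs, and leave the personalised `?version=` variants out entirely.

```python
import xml.etree.ElementTree as ET

def sitemap_xml(urls: list) -> str:
    """Render a minimal sitemap.xml containing only parameter-free
    canonical URLs, skipping any personalised query-string variants."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        if "?" in url:  # skip personalised variants; only canonicals belong here
            continue
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")
```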
This would be a fantastic question for Google reps...I can try to reach out to someone today and let you know what they say.
Cheers,
B
PS. Just curious, how are you pulling in persona data?