What are the SEO recommendations for dynamic, personalised page content? (not e-commerce)
-
Hi,
We will have pages on the website that will display different page copy and images for different user personas. The main content (copy, headings, images) will be supplied dynamically and I'm not sure how Google will index the B and C variations of these pages.
As far as I know, the page URL won't change and won't have parameters.
Google will crawl and index page content that comes from JavaScript, but I don't know which version of the page copy the search robot will index. If we set user-agent filters and serve the default page copy to search robots, we might risk a cloaking penalty, because users would get different content than search robots.
Is it better to have URL parameters for version B and C of the content? For example:
- /page for the default content
- /page?id=2 for the B version
- /page?id=3 for the C version
The dynamic content comes from the server side, so not all page copy variations are present in the default HTML.
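To make the idea concrete, here's a minimal sketch of the parameter scheme I have in mind (the persona names, version numbers, and the /page path are just placeholders, not our real setup):

```python
# Hypothetical sketch of the proposed URL scheme; persona names,
# version numbers, and the /page path are placeholders.
PERSONA_VERSIONS = {
    "default": None,  # the default copy lives at the parameter-free URL
    "persona_b": 2,
    "persona_c": 3,
}

def page_url(persona: str, base: str = "/page") -> str:
    """Return the URL variant that serves this persona's copy."""
    version = PERSONA_VERSIONS.get(persona)  # unknown personas fall back to default
    if version is None:
        return base
    return f"{base}?id={version}"

print(page_url("persona_b"))  # /page?id=2
```

The important part is that every variant, including unknown visitors and bots, resolves to a real, crawlable URL rather than user-agent-filtered content.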
I hope my questions make sense. I couldn't find recommendations for this kind of SEO issue.
-
Hi everyone,
I have a related question about personalisation, a variation on the same theme, that I would appreciate some help with.
There is a project afoot within my company to "personalise" the user experience by presenting pages to users which better respond to their interests.
That is to say that, when a user visits our page about "tennis-shoes", the next time they visit the homepage they will be presented with a homepage which focusses on tennis-shoes.
So far so good.
However, rather than personalising certain elements of the homepage, the idea is to intercept those users and 301 them to an entirely different URL, completely hidden from Google, which will contain entirely different content focussing only on shoes.
The top navigation will remain the same.
This sounds like a massive breach of the Quality Guidelines on at least two counts to me. It reeks of cloaking and "sneaky redirects", and I am very concerned this will do us far more harm than good.
I'm guessing the correct way of going about this would be either to build a great "shoes" page and let users navigate to it freely, or to personalise the homepage with dynamic elements on the same URL, without hiding anything from Google or frustrating users by denying them the page they are actually trying to reach.
Any feedback from the community would be a great help.
Thanks a lot!
-
Brilliant thread guys!
I'm sure this will be discussed far more in the not-so-distant future!
Dynamic Homepages are becoming more common and I have a client using one so this info has really helped me.
This topic should be a future Whiteboard Friday.
-
Yes, that sounds great! Please let me know how it all goes and if you run into any other hiccups.
Cheers,
B
-
Hi Britney
Thank you for your detailed feedback!
I checked the posts you linked and a few other sources and I think the solution will be the following:
- The default content will be loaded with the parameter free URL, e.g. /product
- Personalised versions of the page will have different (short) parameters, e.g. /product?version=8372762
- The default and the personalised pages will have the same canonical tag (pointing to the default page)
- Let Google know in Search Console's URL Parameters settings that the version parameter changes the page content (effect: "Specifies", crawl: "Let Googlebot decide")
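A rough sketch of the canonical setup above (the domain is a placeholder): every personalised variant emits the same parameter-free canonical tag.

```python
# Illustrative only: strip the query string so /product?version=8372762
# and /product both declare the same canonical URL.
def canonical_tag(request_path: str, site: str = "https://example.com") -> str:
    """Build the <link rel="canonical"> tag for any version of the page."""
    base_path = request_path.split("?", 1)[0]
    return f'<link rel="canonical" href="{site}{base_path}" />'

print(canonical_tag("/product?version=8372762"))
# <link rel="canonical" href="https://example.com/product" />
```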
I hope it makes sense.
-
Did some digging and found a few resources stating:
Google had an official statement about this in its webmaster guidelines:
"If you decide to use dynamic pages (i.e., the URL contains a ? character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few. Don't use &id= as a parameter in your URLs, as we don't include these pages in our index."
That was many years ago, but more recently Google changed its position on the subject. The entry has been removed from Google's guidelines, but here's the official statement from Google's blog:
"Google now indexes URLs that contain the &id= parameter. So if your site uses a dynamic structure that generates it, don't worry about rewriting it -- we'll accept it just fine as is. Keep in mind, however, that dynamic URLs with a large number of parameters may still be problematic for search engine crawlers in general, so rewriting dynamic URLs into user-friendly versions is always a good practice when that option is available to you. If you can, keeping the number of URL parameters to one or two may make it more likely that search engines will crawl your dynamic urls."
Click here to read the full article
Penalization for personalisation
Let me know if this helps
-
Fascinating question Gyorgy!
I've always been a big fan of dynamic targeting.
It would be a great idea to have different URL parameters for each unique set of content. You might also want to submit these pages via Fetch as Google in Search Console (and include them in your sitemap.xml) to show you're not attempting to cloak.
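For example, a bare-bones sitemap builder along those lines might list only the parameter-free URLs (domain and paths assumed for illustration):

```python
# Sketch: the sitemap lists only canonical, parameter-free URLs,
# so personalised ?version= variants never appear in it.
def build_sitemap(urls):
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

print(build_sitemap(["https://example.com/", "https://example.com/product"]))
```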
This would be a fantastic question for Google reps...I can try to reach out to someone today and let you know what they say.
Cheers,
B
PS. Just curious, how are you pulling in persona data?