Noindex follow on checkout pages in 2017
-
Hi,
My website really consists of two separate sites:
Product site:
• Website with product pages.
• These product pages have SEO-optimised content.
Booking engine & checkout site:
• When a user clicks 'Book' on one of the product pages on the aforementioned product site, they go to a separate website, which is a booking engine and checkout.
• These pages are not quality, SEO-optimised content; they only perform the function of booking and buying.
Q1) Should I set 'noindex, follow' via the meta tag on all pages of the 'Booking engine and checkout' site?
Q2) Should I add anything to the book buttons on the product site?
I am hoping all this will somehow help concentrate the SEO juice on the Product Site's pages by declaring the Booking engine and Checkout site's pages to be 'not of any content value'.
-
-
Hi
Ironically, Moz will pick this up as a problem, as it reports anything that is noindexed!
For my part, I just ignore noindex as a reported problem in certain cases, as it clearly makes perfect sense to noindex certain pages, and indeed sometimes whole directories.
I sometimes find that developers have noindexed directories like /new-products or /sale, but there are clearly better ways of handling the potential duplicate-content problem there, such as adding a canonical. In your case it makes no sense to have Google index the checkout pages.
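As a rough sketch of that alternative (the domain and paths here are purely hypothetical, for illustration), a duplicate listing in a directory like /sale would point search engines back at the preferred product URL rather than being noindexed:

```html
<!-- On the duplicate page, e.g. /sale/product-1 -->
<!-- Consolidates ranking signals onto the preferred URL instead of hiding the page -->
<link rel="canonical" href="https://www.example.com/products/product-1" />
```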
Regards Nigel
-
Hi Martin / Nigel,
Thanks for your responses. In regard to Q1:
By adding noindex to the 'Booking engine and checkout' site's pages, will this also stop Moz from crawling these pages, and consequently remove 'issues' from their Moz Site Crawl issues count? It currently crawls these pages and picks up issues.
-
Hi Nigel,
You're right, I didn't think about the duplicates from UTM previously.
Thanks for the update.
Best, Martin
-
Hi Martin
Surely if the traffic was coming from a different source then that would be in the URL of that source. Adding a UTM would simply create duplicate page content between the URL and the UTM tagged URL.
He'd then be faced with the tricky and potentially dangerous task of messing with parameters. I just wouldn't mess with creating UTM-tagged URLs.
Apologies - I didn't mean to argue; I just couldn't understand your logic.
Regards Nigel
-
Hey Nigel,
As far as I understand the setup of his websites, it consists of two separate websites (unless he meant "page" by "site").
In that case, I think it would be useful to add UTM parameters so he can see exactly which source a user comes from (since those are two separate websites).
Also, I suppose that clicking the book buttons on the product site redirects users to the booking site, so you would basically add the UTMs to those URLs.
If he meant only "page" by "site", then the solution would be different, of course.
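A minimal sketch of what that might look like on a book button (the domain, path, and campaign names here are hypothetical, just to show the shape of the URL):

```html
<!-- Book button on the product site, linking to the separate booking site -->
<!-- utm_source / utm_medium / utm_campaign let GA attribute the session correctly -->
<a href="https://booking.example.com/checkout?utm_source=product-site&utm_medium=referral&utm_campaign=book-button">
  Book
</a>
```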
Cheers, Martin
-
Hi Martin
Please can you explain why and how you would add UTM parameters to the book buttons on his website?
Thanks Nigel
-
Hey there,
Regarding Q1, I'd set noindex, follow, as you've said. Since the Booking site has no content value for the visitor, there's no need for it to be found in the Google SERPs.
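As a sketch, that robots meta tag would go in the head of each checkout page, something like:

```html
<!-- Keeps checkout pages out of the index, but lets crawlers follow their links -->
<meta name="robots" content="noindex, follow">
```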
Regarding Q2, you can add UTM parameters to make the analytics easier in GA.
Since the booking site has no "content value", there's nothing more you can really pass.
Hope it helps. Cheers, Martin