Noindex follow on checkout pages in 2017
-
Hi,
My website really consists of two separate sites.
Product site:
• Website with product pages.
• These product pages have SEO-optimised content.
Booking engine & checkout site:
• When a user clicks 'Book' on one of the product pages on the aforementioned product site, they go to a separate website, which is a booking engine and checkout.
• These pages do not contain quality, SEO-optimised content; they only perform the function of booking and buying.
Q1) Should I set 'noindex follow' via the meta tag on all pages of the 'Booking engine and checkout' site?
Q2) Should I add anything to the 'Book' buttons on the product site?
I am hoping all this will somehow help concentrate the SEO juice onto the product site's pages by declaring the booking engine and checkout site's pages to be 'not of any content value'.
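For reference, the directive in question is a robots meta tag with the content `noindex, follow` in the `<head>` of each checkout page. Once it's in place on the booking site, a check like the following can confirm each page declares it. This is a minimal sketch using only the Python standard library, not a crawler:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content attribute of any <meta name="robots"> tag."""

    def __init__(self):
        super().__init__()
        self.robots_content = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.robots_content = attrs.get("content", "")


def has_noindex_follow(html: str) -> bool:
    """Return True if the page declares both 'noindex' and 'follow'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    if parser.robots_content is None:
        return False
    directives = {d.strip().lower() for d in parser.robots_content.split(",")}
    return "noindex" in directives and "follow" in directives


page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex_follow(page))  # True
```

A page with no robots meta tag, or one declaring `index`, would return False, so this can be run over a list of checkout URLs' HTML to spot pages the tag was missed on.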
-
-
Hi
Ironically, Moz will pick this up as a problem, as it reports anything that is noindexed!
For me, I just ignore noindex as a problem in certain cases, as clearly it makes perfect sense to noindex certain pages, and indeed sometimes whole directories.
I sometimes find that developers have noindexed directories like /new-products or /sale, but there are clearly better ways of handling the potential duplicate problem there, such as adding a canonical. In your case, it makes no sense to have Google index the checkout pages.
Regards Nigel
-
Hi Martin / Nigel,
Thanks for your responses, In regards to Q1.
By adding 'noindex, follow' to the 'Booking engine and checkout' site's pages, will this also stop Moz from crawling these pages, and consequently remove 'issues' from their Moz Site Crawl issues count? It currently crawls these pages and picks up issues.
-
Hi Nigel,
You're right, I didn't think about the duplicates from UTM previously.
Thanks for the update.
Best, Martin
-
Hi Martin
Surely if the traffic was coming from a different source then that would be in the URL of that source. Adding a UTM would simply create duplicate page content between the URL and the UTM tagged URL.
He'd then be faced with the tricky and potentially dangerous task of messing with parameters. I just wouldn't mess with creating UTM-tagged URLs.
Apologies - I didn't mean to argue I just couldn't understand your logic.
Regards Nigel
-
Hey Nigel,
As far as I understand the setup of his websites, it consists of two separate websites (unless by "site" he meant "page").
Then, I think it would be useful to add the UTM so he can see from exactly which source a user comes (since those are two separate websites).
Also, I suppose that clicking the book buttons on the product site redirects users to the booking site, so you would basically add the UTMs to that URL.
If he meant by "site" only "page" then the solution would be different, of course.
Cheers, Martin
-
Hi Martin
Please can you explain why and how you would add UTM parameters to the book buttons on his website?
Thanks Nigel
-
Hey there,
Regarding Q1, I'd set 'noindex, follow' via the meta tag, as you've said. Since the booking site has no content value for the visitor, there's no need for it to be found in the Google SERPs.
Regarding Q2, you can add UTM parameters to make the analytics easier in GA.
Since the booking site has no "content value", there's nothing more you can really pass.
Hope it helps. Cheers, Martin
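To illustrate the UTM suggestion: the idea is to append UTM parameters to the booking-engine URL behind each 'Book' button, so GA on the booking site can attribute the visit back to the product site. A minimal sketch in Python, with hypothetical domain and parameter values chosen only for illustration:

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse


def add_utm(url: str, source: str, medium: str, campaign: str) -> str:
    """Append UTM parameters to a URL, preserving any existing query string."""
    parts = urlparse(url)
    query = parse_qsl(parts.query)
    query += [
        ("utm_source", source),
        ("utm_medium", medium),
        ("utm_campaign", campaign),
    ]
    return urlunparse(parts._replace(query=urlencode(query)))


# Hypothetical booking-engine URL and tag values:
print(add_utm("https://booking.example.com/checkout?product=42",
              source="product-site", medium="referral", campaign="book-button"))
# https://booking.example.com/checkout?product=42&utm_source=product-site&utm_medium=referral&utm_campaign=book-button
```

As Nigel notes later in the thread, this does create a second URL for the same content, which is worth weighing before adopting it.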