Noindex follow on checkout pages in 2017
-
Hi,
My website really consists of two separate sites:
Product site:
• Website with product pages.
• These product pages have SEO-optimised content.
Booking engine & checkout site:
• When a user clicks 'Book' on one of the product pages on the aforementioned product site, they go to a separate website, which is a booking engine and checkout.
• These pages are not quality, SEO-optimised content; they only perform the function of booking and buying.
Q1) Should I set 'noindex, follow' via the meta tag on all pages of the 'Booking engine and checkout' site?
Q2) Should I add anything to the 'Book' buttons on the product site?
I am hoping all this will somehow help concentrate the SEO juice onto the Product Site's pages by declaring the Booking engine and Checkout site's pages to be 'not of any content value'.
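For reference, the 'noindex, follow' directive in question is set with a standard robots meta tag; a minimal sketch (placed in the <head> of each booking/checkout page) would look like:

```html
<!-- In the <head> of every booking-engine/checkout page -->
<!-- "noindex" asks search engines to keep the page out of their index; -->
<!-- "follow" still lets them follow (and pass equity through) its links -->
<meta name="robots" content="noindex, follow">
```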
-
-
Hi
Ironically, Moz will pick this up as a problem, as it reports anything that is noindexed!
For me, I just ignore noindex as a reported problem in certain cases, as it clearly makes perfect sense to noindex certain pages, and indeed sometimes whole directories.
I sometimes find that developers have noindexed directories like /new-products or /sale, but there are clearly better ways of handling the potential duplicate problem there, such as adding a canonical. In your case it makes no sense having Google index the checkout pages.
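As a sketch of that canonical approach (the URLs here are hypothetical examples, not from the question), a duplicate listing under /sale would point back at the original product page rather than being noindexed:

```html
<!-- On the duplicate page, e.g. https://www.example.com/sale/widget -->
<!-- Tells search engines to consolidate signals onto the original listing -->
<link rel="canonical" href="https://www.example.com/products/widget">
```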
Regards Nigel
-
Hi Martin / Nigel,
Thanks for your responses. In regard to Q1:
By adding 'noindex, follow' to the 'Booking engine and checkout' site's pages, will this also stop Moz from crawling these pages, and consequently remove 'issues' from the Moz Site Crawl issues count, as it currently crawls these pages and picks up issues?
-
Hi Nigel,
You're right, I hadn't thought about the duplicates from UTM tagging before.
Thanks for the update.
Best, Martin
-
Hi Martin
Surely, if the traffic was coming from a different source, then that would be reflected in the referring URL. Adding a UTM would simply create duplicate page content between the plain URL and the UTM-tagged URL.
He'd then be faced with the tricky and potentially dangerous task of messing with parameters. I just wouldn't mess with creating UTM-tagged URLs.
Apologies - I didn't mean to argue; I just couldn't understand your logic.
Regards Nigel
-
Hey Nigel,
As far as I understand the setup, it consists of two separate websites (unless by "site" he meant "page").
In that case, I think it would be useful to add the UTM parameters so he can see exactly which source a user comes from (since those are two separate websites).
Also, I suppose that by clicking the book buttons on the product site, users will be redirected to the booking site, so you would basically add the UTMs to those URLs.
If by "site" he meant only "page", then the solution would be different, of course.
Cheers, Martin
-
Hi Martin
Please can you explain why and how you would add UTM parameters to the book buttons on his website?
Thanks Nigel
-
Hey there,
Regarding Q1, I'd set 'noindex, follow', as you've said. Since the Booking site has no content value for the visitor, there's no need for it to be found in the Google SERPs.
Regarding Q2, you can add UTM parameters to the book-button URLs to make the analytics easier in GA.
Since the booking site has no "content value", there's nothing more you can really pass.
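As a sketch of what a UTM-tagged 'Book' button URL could look like (the domain and campaign values below are made-up examples, not anything prescribed by GA):

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical booking-engine URL; the UTM values are illustrative only.
base = "https://booking.example.com/checkout"
utm = {
    "utm_source": "product-site",   # where the click originated
    "utm_medium": "referral",       # the channel
    "utm_campaign": "book-button",  # which link/button was clicked
}
tagged_url = f"{base}?{urlencode(utm)}"
print(tagged_url)

# Analytics on the booking site can read the parameters back out:
params = parse_qs(urlparse(tagged_url).query)
```

Note Nigel's caveat below/above still applies: each tagged URL is a distinct URL, so you'd want a canonical on the landing page to avoid duplicate-content issues.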
Hope it helps. Cheers, Martin