Noindex pages being indexed
-
Hi all
Wondering if anyone could offer a pointer on a problem I am having, please. I am developing an affiliate store, and to prevent problems with duplicate content I have added <meta name="robots" content="NOINDEX,FOLLOW" /> to all the product pages to avoid Google penalties.
However, Google appears to be indexing the product pages anyway. When I do a site: search I see a few hundred product pages in the index, which is odd, as the site has always had noindex on these pages. Even viewing the cache of an indexed page shows the noindex meta tag in place.
I'm at a loss as to why these pages are being indexed, and I need to get them removed ASAP to stop any penalties on the site.
Many thanks for any help.
-
Thanks for taking the time to look at the site.
Not sure why the code is ending up in the wrong place; the site uses a Magento SEO plugin, so I will need to chase them up on that. I just checked a random selection of pages and the code appears in the head section on all of those, so it seems only some pages are not playing nicely.
I would like to index the product pages, but there are over 250,000 items pulled from the merchants and no chance of writing that much unique content, so I feel safest noindexing them all. The main traffic strategy will be to use content that promotes items, such as fashion advice pieces.
In the example you give, that seems to be a problem with the categories. I will check that out, thanks for pointing it out.
-
Okay, I have found the indexing problem. The noindex meta tag should be placed within the head section of your HTML. It is currently in your body content, so it is not being used.
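For reference, here is a minimal sketch of the correct placement (an illustrative page, not the site's actual markup):

<!DOCTYPE html>
<html>
<head>
  <title>Example product page</title>
  <!-- Robots directives are only honoured here, inside <head> -->
  <meta name="robots" content="noindex,follow" />
</head>
<body>
  <!-- A meta robots tag down here in <body> is ignored by crawlers -->
</body>
</html>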
Are you sure you want to noindex the product pages, though? These are very important pages on your site, and there are other ways to fix duplicate content issues.
Most duplicate content issues come from the fact that most CMS systems serve the same product at different URLs. On your site I see http://yochic.co.uk/dress-267864/bodycon/black-cut-out-side-bodycon-dress.html and http://yochic.co.uk/black-cut-out-side-bodycon-dress.html
Same product, different URL. This could be fixed by adding a rel="canonical" on the product pages that points back to the preferred version. The easiest approach would be to point back to www.domain.com/product.html
This would solve most duplicate content issues without having to deindex the product pages.
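To make that concrete, here is a sketch of what the duplicate URL's head could contain, assuming the short URL is the version you want indexed (which version to prefer is your call, not something I can tell from the site):

<head>
  ...
  <!-- Tells Google the short URL is the preferred version of this product -->
  <link rel="canonical" href="http://yochic.co.uk/black-cut-out-side-bodycon-dress.html" />
</head>

Google then consolidates ranking signals onto the canonical URL instead of treating the two addresses as competing duplicates.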
Hope I was helpful. If you have any questions left, feel free to ask.
-
Hi, yes, maybe I should have added that to start with. The site is yochic. co. uk - it is still being worked on, so bear that in mind if you spot random products in the wrong places or wonky-looking menus.
The category pages are set to be indexed, so no problem there. The product pages contain the most duplicate content, so they have been blocked, but a number of them are still going into the listings.
-
Perhaps you could give us a link to your website using the following notation: domainname (dot) TLD. It's difficult to tell you what the problem is without being able to take a look at your site.
Related Questions
-
Product Page Links
I have a product category page at https://www.hurtlegear.com.au/s1000rr/ which currently has 38 products on it. The problem is, all the product titles start with the same text, "bmw s1000rr" (because that's what they are), so there are 38 anchored internal links on that page, all starting with the same keyword. You can see how that might look to the Google crawler.

Recently that page dropped from around position 15 to outside the top 100, and Moz tells me the page is keyword stuffed with "bmw s1000rr" (no surprise), so I'm guessing that may be why the page has disappeared from the SERPs. I don't really want to change all the product titles (then they wouldn't make sense), so I'm wondering if there is any way around this. Is there some way of telling Google that this is a product category page and therefore to ignore the anchor text in all of those product links? Can or should the links have some kind of markup on them? Or is the page beyond help?

Basically I'm looking for a way to keep the product titles as they are while avoiding a page penalty from Google. I'm a bit of a newbie; any suggestions would be most appreciated. Cheers, Graeme
On-Page Optimization | graeme720
-
Which is better? One dynamically optimised page, or lots of optimised pages?
For the purpose of simplicity, we have 5 main categories on the site - let's call them A, B, C, D, E. Each of these categories has sub-category pages, e.g. A1, A2, A3. The main area of the site consists of these category and sub-category pages.

But as each product comes in different woods, it's useful for customers to see all the products that come in a particular wood, e.g. walnut. So many years ago we created 'woods' pages. These pages replicate the categories and sub-categories but only show what is available in that particular wood, and of course they're optimised much better for that wood. All well and good until recently, when these specialist pages seem to have dropped through the floor in Google. It could be temporary, I don't know, and it's only been a fortnight - but I'm worried.

Now, because the site is dynamic, we could do things differently. We could still have landing pages for each wood, but instead of spinning off to their own optimised wood-specific sub-category pages, they could link to the primary sub-category page with a ?search filter in the URL. This way, the customer still gets to see what they want.

Which is better? One page per sub-category, dynamically filtered by search, or lots of specific sub-category pages? I guess at the heart of this question is: does having lots of specific sub-category pages lead to a large overlap of duplicate content, and is it better to keep that authority juice on a single page, even if the URL changes (with a query in the URL) to enable whatever filtering we need to do?
On-Page Optimization | pulcinella2uk0
-
Is it better to try and boost an old page that ranks on page #5, or to create a better new page?
Hello everyone, we have been looking into our placements recently and see that one of our blog posts shows on page #5 for a popular keyword phrase with a lot of search volume. Let's say the keyword is "couples fitness ideas". We show on page 5 for the post /couples-fitness-ideas-19-tips-and-expert-advice/. We want to try to get on the first page for that phrase, and wanted to know which of the following would be better:

1. Create a new page, with a new URL (thinking /couples-fitness-ideas), containing over 100 ideas and a few thousand more words.
2. Create a new page with a new URL (thinking /couples-fitness-ideas) with the same content as the currently ranking post. We would do this for more freedom with the layout and design of the page than our current blog post template allows. Add more content, let's say 100 more ideas, then forward the old URL to the new one with a 301 redirect.
3. Add more content to the existing post without changing the layout, and change the URL.

Look forward to your thoughts.
On-Page Optimization | MobileCause0
-
Thin Content pages
I have a couple of pages that are thin content. One is essentially a page with the icons of our customers and links out to their websites. The other is a summary portfolio page with images of some of the client work we have done and links to internal pages that give more detail about each client's situation, approach, etc. Those deeper pages are just fine. What is the recommendation for handling these thin content pages? We could add content, but it wouldn't really help the user very much.
On-Page Optimization | ExploreConsulting0
-
Indexing pages after de-indexing them
I have been de-indexing duplicate content on my website, which has almost 40 pages containing duplicate content from other websites. Later on, the website's ranking dropped. Should I re-index those pages or just wait?
On-Page Optimization | MohammadSabbagh0
-
When to not index
We are working on a brand new site, http://www.shedfinders.com/. The site has some login sections, i.e. agent profiles for property agents, registration, etc. I figured, what is the harm in submitting all of these in the sitemap and using Yoast to make the on-page as good as it can be? If a user stumbles across any of them, they would be redirected back to a register/login page. Not sure what best practice is? Laura
On-Page Optimization | lauratagdigital0
-
Duplicate page content
What is duplicate page content? I have a dating site, and it's got a groups area where the members can base their discussions in a category, for example night life, health and beauty, and such. Why would this cause a duplicate page content problem, and how would I fix it? Please explain it in terms a dummy can understand.
On-Page Optimization | clickit2getwithit0
-
City targeting on home page
Client has a site that ranks well for "Town_A_KW", "Town_B_KW" and "Town_C_KW". The home page is the page that's ranking. These towns are part of the larger metro area of Portland. They want to start ranking for "Portland_KW", and normally I'd recommend optimizing the home page for this phrase and better optimizing the sub-pages for the town A, B and C KWs. The client is understandably nervous about messing with re-targeting the home page since it already ranks well. Is it best to:

1. Add "Portland_KW" to the home page meta titles, content, etc. to try to rank for that phrase (so the home page would be optimized for the town A, B and C KWs plus "Portland_KW")?
2. Re-target the home page for "Portland_KW" only, and better optimize the sub-pages for the town A, B and C KWs?
3. Leave the home page as is, and create a "Portland_KW" sub-page (the client's original idea)?

Thanks in advance for your insights!
On-Page Optimization | 540SEO0