Can't get auto-generated content de-indexed
-
Hello and thanks in advance for any help you can offer me!
Customgia.com, a costume jewelry e-commerce site, has two types of product pages: public pages that are internally linked, and private pages that are accessible only via direct URL. Every item on Customgia is created with an online design tool. Users can register for a free account and save the designs they create, even if they don't purchase them. Before saving a design, the user must enter a product name and choose "public" or "private" for it. The page title and product description are auto-generated.
Since launching in October '11, the number of products grew steadily as more users designed jewelry items. Most users chose to show their designs publicly, so the store swelled to nearly 3000 products. I realized many of these designs were similar to each other, and occasionally exact duplicates. So over the past 8 months, I've made 2300 of these designs "private" - no longer accessible unless the designer logs into their account (though these pages can still be linked to directly).
When I realized that Google had indexed nearly all 3000 products, I entered URL removal requests on Webmaster Tools for the designs that I had changed to "private". I did this starting about 4 months ago. At the time, I did not have NOINDEX meta tags on these product pages (obviously a mistake) so it appears that most of these product pages were never removed from the index. Or if they were removed, they were added back in after the 90 days were up.
Of the 716 products currently showing (the ones I want Google to know about), 466 have unique, informative descriptions written by humans. The remaining 250 have auto-generated descriptions that read coherently but are somewhat similar to one another. I don't think these 250 descriptions are the big problem right now but these product pages can be hidden if necessary.
I think the big problem is the 2000 product pages that are still in the Google index but shouldn't be. The following Google query tells me roughly how many product pages are in the index: site:Customgia.com inurl:shop-for
Ideally, it should return just over 716 results, but instead it returns 2650. Most of these 1900 extra product pages have bad product names and highly similar, auto-generated descriptions and page titles. I wish Google had never crawled them.
Last week, NOINDEX tags were added to all 1900 "private" designs so currently the only product pages that should be indexed are the 716 showing on the site. Unfortunately, over the past ten days the number of product pages in the Google index hasn't changed.
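For reference, the logic we added is roughly the following (a simplified sketch - the actual template code and field names on our side differ):

```python
def robots_meta_tag(design_is_public: bool) -> str:
    """Return the robots meta tag to render in a product page's <head>.

    Private designs get NOINDEX so Google drops them from the index;
    public designs emit no tag, since index/follow is the crawler default.
    """
    if design_is_public:
        return ""  # no tag needed; indexing is the default behavior
    return '<meta name="robots" content="noindex">'
```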
One solution I initially thought might work is to re-enter the removal requests because now, with the NOINDEX tags, these pages should be removed permanently. But I can't determine which product pages need to be removed because Google doesn't let me see that deep into the search results. If I look at the removal request history it says "Expired" or "Removed" but these labels don't seem to correspond in any way to whether or not that page is currently indexed. Additionally, Google is unlikely to crawl these "private" pages because they are orphaned and no longer linked to any public pages of the site (and no external links either).
Currently, Customgia.com averages 25 organic visits per month (branded and non-branded) and close to zero sales. Does anyone think de-indexing the entire site would be appropriate here? Start with a clean slate and then let Google re-crawl and index only the public pages - would that be easier than battling with Webmaster tools for months on end?
Back in August, I posted a similar problem that was solved using NOINDEX tags (de-indexing a different set of pages on Customgia): http://moz.com/community/q/does-this-site-have-a-duplicate-content-issue#reply_176813
Thanks for reading through all this!
-
I don't think there's any harm in submitting a new, full list, even if it duplicates past lists. The URLs haven't been removed, and you did fix the tags. This isn't like disavowing links - it's more of a technical issue. Worst case, from what I've seen, it simply doesn't work.
-
Thanks for helping me with this.
You are correct that all the product pages are in the same folder regardless of whether they are public or private so unfortunately, removing an entire folder isn't an option at this point.
When I go to Webmaster Tools and view past removal requests, each one shows as either "Expired" or "Removed". WMT only lets me resubmit a removal request if the label is "Expired". Going back past 90 days, many are still labeled "Removed", but the further back I go, the more say "Expired". There are too many requests to check whether each page is individually indexed - so I think our best bet is to re-submit every expired removal request for a private product page and then monitor removal. Does this make sense?
Back in August, a Moz crawl showed tons of duplicates for the designer pages (the pages where the user actually designs the jewelry). Using NOINDEX tags and removal requests (credit to Dr. Pete and Everett Sizemore) the number of designer pages in the index dropped from 5K to exactly 8 - so it worked.
Our XML sitemap is dynamic and doesn't list private product pages.
-
It honestly sounds like you're on the right track - you do need to explicitly mark those pages (and META NOINDEX should be fine). Could you just request removal for all private pages? Worst case, Google attempts to remove some that aren't in the index, and nothing happens. Since the public/private setting can be changed, you can't really put the private pages all in one folder (real or virtual) - that would make life easier long-term, but probably isn't useful/appropriate for your case.
I'd also recommend having a clean XML sitemap with just the public entries (updated dynamically). That won't deindex the other pages, but it's one more cue Google can use. You want all of the signals you're sending to be consistent.
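As a sketch of what that dynamic sitemap generation might look like - the URLs and the (url, is_public) data shape here are made up for illustration, not Customgia's actual schema:

```python
from xml.etree import ElementTree as ET

def build_sitemap(products):
    """Build an XML sitemap listing only public product URLs.

    `products` is assumed to be an iterable of (url, is_public) pairs;
    private designs are skipped so the sitemap and the NOINDEX tags
    send Google a consistent signal.
    """
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url, is_public in products:
        if not is_public:
            continue  # private designs never appear in the sitemap
        url_el = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url_el, "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical example data:
sitemap_xml = build_sitemap([
    ("http://customgia.com/shop-for/public-ring", True),
    ("http://customgia.com/shop-for/private-ring", False),
])
```

Regenerating this on each sitemap request (rather than from a cron job) means a design flipped from public to private drops out of the sitemap immediately.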
I agree with Doug, though - this is really tricky, because ideally you would want people to share these pages, and if you NOINDEX them you're losing out on that. My gut feeling is that, until your site is stronger, you probably can't support 3K near-duplicates (and counting). If you want to get sophisticated, though, you could apply NOINDEX dynamically - only noindexing products that have very little content or are obvious dupes. As people fill out or share a product, you could remove the NOINDEX.
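A rough sketch of that dynamic approach - the threshold and fields here are hypothetical placeholders, not anything Customgia actually uses:

```python
def should_noindex(description: str, human_written: bool, min_words: int = 50) -> bool:
    """Decide per product whether to emit a NOINDEX tag.

    Thin, auto-generated descriptions stay out of the index; once a
    human writes a real description, the tag is lifted on the next
    page render. The 50-word threshold is an arbitrary placeholder
    you'd tune to your own content.
    """
    if human_written:
        return False  # human-written pages are worth indexing
    return len(description.split()) < min_words
```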
-
Hi Doug,
Thanks for the quick response. I will do my best to answer each of your points.
In Webmaster Tools, under Index Status, it shows 1781 pages indexed, with a high of 6515 on June 2, 2013. Not sure that helps to clarify anything but it's another piece of Google data to consider.
We continually monitor WMT and Analytics. I'm addressing this issue specifically because search impressions on our product pages average fewer than 5 impressions/day despite continuous improvements over the last 12 months - keyword research, better page titles/product names, and longer, more informative descriptions. These 500 or so product pages are vastly better today than they were 12 months ago - but impressions have not improved at all.
Every design, public or private, has social/sharing buttons. As I mentioned above, these designs can all be linked to directly from any external website.
I think the category pages are sufficient. There is some fine-tuning that could be done in terms of how products are organized within categories but overall it's pretty solid and probably not an issue.
Our initial strategy was to attract long-tail traffic with user-generated content but the problem is most users gave their products personal, irrelevant (and possibly spammy) product names. There were other problems with the user generated designs as well - like one user who designed 15 earrings that looked exactly the same except for one bead which she changed to a different color for each design. Anyway, we left all these designs public for over 12 months - as more and more designs were added to the site, organic search traffic actually fell.
-
I agree with Doug.
Create better category pages - make sure each product page is under a category.
The user-generated products are great and should be indexed.
-
Hey Richard,
First, note that the result count Google displays for that query is only an estimate, which gets refined the deeper you go into the search results. On page one, it tends to be wildly inaccurate.
If you go all the way to the end (page 13) and then repeat the process with omitted results included, you still only get to page 13 - a total of 123 results. (Somewhat better than the 2k+ estimate.)
That's fewer than the 716 pages you mention, so maybe you've got the opposite problem? What do you see if you check your Google Analytics and Webmaster Tools? Which pages are getting organic traffic from Google? Which pages are showing in the search results (Webmaster Tools, Impressions)?
What are the pages you want to appear in search and what are the keywords you're targeting?
My first thought is - if you're letting people design their own jewellery, are you also letting them easily share their creations on social, etc.? Have you got embed codes so that they can put their designs on their blog? If you're not, then I think you're missing a trick.
All of these individual items, designed by users, should be linking back to the relevant category pages (or other landing pages), increasing the authority of those pages. Make sure your category/landing pages have good unique content that communicates both the value proposition and the products you've got available.
If you don't have these category pages, then it might be worth looking at your site architecture/hierarchy and think about creating them.
Your individual product pages might get long-tail traffic (and having lots of different variations, described in real-people's own words might actually work to your advantage here), your category pages should be the ones targeting head terms.
I notice you've no-indexed and no-followed the product pages in question. This means that if these pages are shared, any inbound authority/link equity is just being discarded. Are you sure you want to do that?
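To make the distinction concrete - a small sketch of the two directive combinations (illustrative only; how you template this is up to you):

```python
def robots_content(index: bool, follow: bool) -> str:
    """Build the content attribute for a robots meta tag.

    "noindex, follow" keeps a page out of search results but still
    lets crawlers follow (and credit) its outbound links; adding
    nofollow discards that link equity as well.
    """
    parts = ["index" if index else "noindex",
             "follow" if follow else "nofollow"]
    return ", ".join(parts)
```

For pages you want hidden but still able to pass authority inward, `robots_content(index=False, follow=True)` is the gentler option.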
I don't think you need to worry too much about google's index at this point and I certainly wouldn't consider deindexing the whole site.