Can't get auto-generated content de-indexed
-
Hello and thanks in advance for any help you can offer me!
Customgia.com, a costume jewelry e-commerce site, has two types of product pages: public pages that are internally linked, and private pages that are only accessible via a direct URL. Every item on Customgia is created with an online design tool. Users can register for a free account and save the designs they create, even if they don't purchase them. Before saving a design, the user is required to enter a product name and choose "public" or "private" for that design. The page title and product description are auto-generated.
Since launching in October '11, the number of products grew and grew as more users designed jewelry items. Most users chose to show their designs publicly, so the number of products in the store swelled to nearly 3000. I realized many of these designs were similar to each other and occasionally exact duplicates. So over the past 8 months, I've made 2300 of these designs "private" - no longer accessible unless the designer logs into their account (though these pages can still be linked to directly).
When I realized that Google had indexed nearly all 3000 products, I entered URL removal requests in Webmaster Tools for the designs that I had changed to "private". I did this starting about 4 months ago. At the time, I did not have NOINDEX meta tags on these product pages (obviously a mistake), so it appears that most of these product pages were never removed from the index - or, if they were removed, they were added back in after the 90 days were up.
Of the 716 products currently showing (the ones I want Google to know about), 466 have unique, informative descriptions written by humans. The remaining 250 have auto-generated descriptions that read coherently but are somewhat similar to one another. I don't think these 250 descriptions are the big problem right now but these product pages can be hidden if necessary.
I think the big problem is the 2000 product pages that are still in the Google index but shouldn't be. The following Google query tells me roughly how many product pages are in the index: site:Customgia.com inurl:shop-for
Ideally, it should return just over 716 results, but instead it returns 2650. Most of these 1900 extra product pages have bad product names and highly similar, auto-generated descriptions and page titles. I wish Google had never crawled them.
Last week, NOINDEX tags were added to all 1900 "private" designs, so currently the only product pages that should be indexed are the 716 showing on the site. Unfortunately, over the past ten days the number of product pages in the Google index hasn't changed.
One solution I initially thought might work is to re-enter the removal requests, because now, with the NOINDEX tags, these pages should be removed permanently. But I can't determine which product pages need to be removed, because Google doesn't let me see that deep into the search results. If I look at the removal request history, it says "Expired" or "Removed", but these labels don't seem to correspond in any way to whether that page is currently indexed. Additionally, Google is unlikely to re-crawl these "private" pages because they are orphaned: no longer linked from any public pages of the site, and with no external links either.
Currently, Customgia.com averages 25 organic visits per month (branded and non-branded) and close to zero sales. Does anyone think de-indexing the entire site would be appropriate here? Start with a clean slate and then let Google re-crawl and index only the public pages - would that be easier than battling with Webmaster tools for months on end?
Back in August, I posted a similar problem that was solved using NOINDEX tags (de-indexing a different set of pages on Customgia): http://moz.com/community/q/does-this-site-have-a-duplicate-content-issue#reply_176813
Thanks for reading through all this!
-
I don't think there's any harm in submitting a new/full list, even if it duplicates past lists. The URLs haven't been removed, and you did fix the tags. This isn't like disavowing links - it's more of a technical issue. Worst case, it doesn't work, from what I've seen.
-
Thanks for helping me with this.
You are correct that all the product pages are in the same folder regardless of whether they are public or private, so unfortunately removing an entire folder isn't an option at this point.
When I go to Webmaster Tools and view past removal requests, each one shows as either "Expired" or "Removed". WMT only allows me to resubmit a removal request if the label is "Expired". Going back past 90 days, many are still labeled "Removed", but the further back I go, the more say "Expired". There are too many requests to check whether each individual page is indexed - so I think our best bet is to re-submit every expired private product page removal request and then monitor removal. Does this make sense?
Back in August, a Moz crawl showed tons of duplicates for the designer pages (the pages where the user actually designs the jewelry). Using NOINDEX tags and removal requests (credit to Dr. Pete and Everett Sizemore) the number of designer pages in the index dropped from 5K to exactly 8 - so it worked.
Our XML sitemap is dynamic and doesn't list private product pages.
-
It honestly sounds like you're on the right track - you do need to explicitly mark those (and META NOINDEX should be fine). Could you just request removal for all private pages? Worst case, Google removes some that aren't in the index, or attempts to. Since the public/private setting can be changed, you can't really put the private pages all in one folder (real or virtual) - that would make life easier, long-term, but probably isn't useful/appropriate for your case.
I'd also recommend having a clean XML sitemap with just the public entries (updated dynamically). That won't deindex the other pages, but it's one more cue Google can use. You want all of the signals you're sending to be consistent.
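As a sketch of what "updated dynamically" might look like (hypothetical field names and URL pattern - your platform's data model will differ), the generator just filters on the public/private flag before emitting `<url>` entries:

```python
from xml.sax.saxutils import escape

def build_sitemap(products, base_url="https://www.customgia.com"):
    """Emit sitemap XML containing only public product pages.

    `products` is assumed to be an iterable of dicts with
    'slug' and 'is_public' keys -- hypothetical field names.
    """
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for p in products:
        if not p["is_public"]:  # private designs never make the list
            continue
        lines.append("  <url><loc>%s/shop-for/%s</loc></url>"
                     % (base_url, escape(p["slug"])))
    lines.append("</urlset>")
    return "\n".join(lines)
```

Regenerating this on each save (or on a schedule) means toggling a design between public and private updates the sitemap automatically, keeping that signal consistent with the NOINDEX tags.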
I agree with Doug, though - this is really tricky, because ideally you would want people to share these pages, and if you NOINDEX them you're losing out on that. My gut feeling is that, until your site is stronger, you probably can't support 3K near-duplicates (and counting). If you want to get sophisticated, though, you could dynamically NOINDEX only the products that have very little content or are obvious dupes. As people fill out or share a product, you could remove the NOINDEX.
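A rough sketch of that dynamic rule (the word-count threshold and field names here are made-up placeholders, not anything from Customgia's actual codebase - tune against your own data):

```python
def robots_meta(description, is_private, min_words=40):
    """Decide the robots meta tag for a product page.

    Noindex private designs and thin, auto-generated descriptions,
    but keep 'follow' so any inbound link equity still flows.
    The 40-word threshold is an arbitrary placeholder.
    """
    if is_private or len(description.split()) < min_words:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'
```

The template would call this when rendering each product page, so a design flips back to indexable automatically once its owner fills out a real description or makes it public.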
-
Hi Doug,
Thanks for the quick response. I will do my best to answer each of your points.
In Webmaster Tools, under Index Status, it shows 1781 pages indexed, with a high of 6515 on June 2, 2013. Not sure that helps to clarify anything but it's another piece of Google data to consider.
We continually monitor WMT and Analytics. I'm addressing this issue specifically because our product pages average fewer than 5 search impressions/day despite continuous improvements over the last 12 months - keyword research, better page titles/product names, and longer, more informative descriptions. These 500 or so product pages are vastly better today than they were 12 months ago - but impressions have not improved at all.
Every design, public or private, has social/sharing buttons. As I mentioned above, these designs can all be linked to directly from any external website.
I think the category pages are sufficient. There is some fine-tuning that could be done in terms of how products are organized within categories but overall it's pretty solid and probably not an issue.
Our initial strategy was to attract long-tail traffic with user-generated content, but the problem is most users gave their products personal, irrelevant (and possibly spammy) product names. There were other problems with the user-generated designs as well - like one user who designed 15 earrings that looked exactly the same except for one bead, which she changed to a different color for each design. Anyway, we left all these designs public for over 12 months - and as more and more designs were added to the site, organic search traffic actually fell.
-
I agree with Doug.
Create better category pages - make sure each product page is under a category.
The user-generated products are great and should be indexed.
-
Hey Richard,
First, note that the number of results reported by that query is only an estimate, which gets refined the deeper you go into the search results. On page one, it tends to be wildly inaccurate.
If you go all the way to the end (page 13) and then repeat the search with the omitted results included, you still get to page 13 and a total of 123 results. (Somewhat better than the 2k+ results.)
This is fewer than the 716 pages you mention, so maybe you've got the opposite problem? What do you see if you check your Google Analytics and Webmaster Tools? Which pages are getting organic traffic from Google? Which pages are showing in the search results (Webmaster Tools > Impressions)?
What are the pages you want to appear in search and what are the keywords you're targeting?
My first thought is: if you're allowing people to design their own jewellery, are you also allowing them to easily share their creations on social media, etc.? Have you got embed codes so that they can put their designs on their blogs? If not, I think you're missing a trick.
All of these individual items, designed by users, will (or should) be linking back to the specific category pages (or other landing pages), increasing the authority of those pages. Make sure your category/landing pages have good unique content that communicates both the value proposition and the products you've got available.
If you don't have these category pages, then it might be worth looking at your site architecture/hierarchy and think about creating them.
Your individual product pages might get long-tail traffic (and having lots of different variations, described in real people's own words, might actually work to your advantage here); your category pages should be the ones targeting head terms.
I notice you've no-indexed and no-followed the product pages in question. This means that if these pages are shared, any inbound authority/link equity is simply being discarded. Are you sure you want to do that?
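For what it's worth, the difference comes down to one attribute in the tag. A minimal sketch (assuming you can edit the product-page template directly):

```html
<!-- Keeps the page out of Google's index, but still lets any
     inbound equity flow through the page's outbound links -->
<meta name="robots" content="noindex, follow">

<!-- Keeps the page out of the index AND discards inbound equity -
     apparently what the private pages are using now -->
<meta name="robots" content="noindex, nofollow">
```

Switching the private pages to the first variant would let shared links still benefit the site while keeping the pages out of the index.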
I don't think you need to worry too much about Google's index at this point, and I certainly wouldn't consider deindexing the whole site.