Publishing pages with thin content, update later?
-
So I have about 285 pages I created with very, very thin content on each. Each is unique, and each serves its own purpose.
My question is, do you guys think it is wise to publish all of these at once just to get them out there and update each as we go along? Each page is laser-targeted and I anticipate that a large handful will actually rank soon after publishing.
Thanks!
Tom
-
"Each location has its own page, and each location page has its own departments listed with their own pages as well. Each department then has some content such as the NAP, an employee directory, and links to other useful resource pages on the website."
If this means many pages for each location, then I would worry about them. However, if all of this information is on a single page, then you might be fine. If I owned a company like this, I would require each location to give me substantive content.
"Also, if I 'noindex' the pages to start, add some good content, then 'index' them, how long in your experience has it taken until you saw a considerable increase in traffic / saw those pages indexed?"
I republished two of my thin-content pages last week. These had been noindexed for about two years. They were upgraded from two or three sentences and one photo to nearly 1,000 words and four or five photos. One appeared in the index about five days later and went straight to #4 for a moderately difficult single-word query. That single word is the name of a software product and of a type of "gold" in the Minecraft video game, and it has a lot of competition from .gov and .edu sites.
The second one was published about eight days ago and we have not seen it in the SERPs yet. This is an unusually long time for us to wait on a republished page for this site, which has a DA of about 80.
"The way I would approach it would be to submit those pages for crawling manually in Search Console (RIP Webmaster Tools) once I updated the 'index' tag."
I have never done this. I just republish the page.
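For reference, the noindex/index switch is just a change to the robots meta tag in the head of each page. A minimal sketch, assuming the noindex was applied with a meta tag like the one quoted later in this thread. While the content is thin, keep the page out of the index but let crawlers follow its links:
<meta name="robots" content="noindex, follow" />
Once the content is upgraded, remove the tag entirely, or switch it to the default:
<meta name="robots" content="index, follow" />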
-
Thanks Andy, I appreciate the response. This was a semi-large project with the main goal of capturing hyper-local leads. I guess once you throw locations into the mix, the site runs an even bigger chance of being hit, due to the popular practice of creating a page for every damn city in the country in hopes of ranking locally.
Fortunately we have real locations across the US but I don't want Google to think we're trying to dupe anyone.
Thanks again
Tom -
That's the answer I was expecting. The website I'm referencing has about 4,000 indexed pages, and those 285 may be enough to do some damage.
To give you an example (this mimics exactly what I'm doing), take a business with multiple locations. Each location has its own page, and each location page has its own departments listed with their own pages as well. Each department then has some content such as the NAP, an employee directory, and links to other useful resource pages on the website. Yea or nay to that?
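To make that structure concrete, here is a hypothetical URL tree for one location (all of the names are made up for illustration):
/locations/springfield/
/locations/springfield/sales/ (NAP, employee directory, resource links)
/locations/springfield/service/ (NAP, employee directory, resource links)
/locations/springfield/parts/ (NAP, employee directory, resource links)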
Also, if I "noindex" the pages to start, add some good content, then "index" them, how long in your experience has it taken until you saw a considerable increase in traffic / saw those pages indexed? I know that's a site-by-site, page-by-page kind of question, but I'm curious to know.
The way I would approach it would be to submit those pages for crawling manually in Search Console (RIP Webmaster Tools) once I updated the "index" tag.
Thoughts?
Thanks!
Tom -
Hi
I agree with the above: you run the risk of getting hit by Panda. If these pages are important to have live to help customers, then surely your priority should be to get good content on there to help your customers and potential customers. If they land on a low-quality page with very little content, are they likely to stick around?
I wouldn't put any live until you have the content sorted. I would work out the priority, start there, and once the content is good, put each page live.
There is probably a Panda update around the corner and you don't want to get hit by it and then be waiting for Google to release the next version to get out of it.
I wouldn't even run the risk of putting them live with noindex.
Unless, of course, as said above, you have 100,000+ pages of amazing-quality content, in which case it probably won't affect you.
Thanks
Andy
-
In my opinion, publishing a lot of thin-content pages will get you into trouble with the Panda algorithm. One of my sites had a lot of these types of pages and it was hit with a Panda problem. Most pages on the site were demoted in search. I noindexed those thin-content pages and the site recovered in a few weeks.
Here is the code that I used, in the head of each of those pages: <meta name="robots" content="noindex, follow" />
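A minimal sketch of where that tag sits, assuming an ordinary HTML page (the title and body here are placeholders, not the actual pages):
<!DOCTYPE html>
<html>
<head>
  <title>Thin page kept live for visitors</title>
  <!-- keeps the page out of the index while still letting crawlers follow its links -->
  <meta name="robots" content="noindex, follow" />
</head>
<body>
  <!-- the thin-but-useful reference content stays available to visitors -->
</body>
</html>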
Although those pages had thin content, they were still a valuable reference for my visitors. That is why I noindexed them instead of deleting them.
Those pages have been noindexed for about two years with no problems. Slowly, I am upgrading those pages with good articles to reduce their number. I worry that some day Google might change its mind and hit sites that have lots of thin-content pages that are noindexed.
I don't know how big your website is. But I am betting that 285 very, very thin pages added to a website of a couple thousand pages will be a problem (that's about what I had when my site had its problem). However, if that many very, very thin pages are added to a website with 100,000 pages, you might get away with it.
Good luck