Removing blog posts with little/thin content
-
A large share (I'd say 75%) of our blog posts are what I'd consider low quality: short content (500 words or less) with few shares, no backlinks, and no comments. Most get 0-2 unique views a day (though combined this adds up).
Will removing these pages provide an SEO benefit greater than the reduction in traffic from the removal of these pages?
I've heard the likes of Neil Patel and Brian Dean suggest so; however, I'm worried it will do the opposite: with less content indexed, I'll actually see traffic fall.
Sam
-
Sam,
If you can safely assume that the pages are not hurting you, let them stay. It's certainly not ideal to have a website loaded with thin content. But, as is the case with most small sites, the posts are likely to do you more good than harm, provided you're willing to show them some attention.
Here's a good strategy to deploy:
-
Find the top 10 posts, judged by analyzing GA data and by the topics you hope to rank for, then beef them up with additional text and graphics.
-
Republish the posts, listing them as "updated."
-
Share the posts via social, using a meaningful quote from each piece to draw interest and invite re-shares.
-
Continue sharing the posts in the following weeks, each time with new text.
-
Gauge the performance of each social share, then use this information to create headlines for new posts and to learn what content might draw the most interest.
-
Repeat the process with the next 10 posts.
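As a rough sketch, the "find the top 10 posts" step can be scripted against an analytics export. This is a minimal example, assuming a GA export loaded as a list of records; the field names (`page`, `unique_views`, `word_count`) are hypothetical and should be mapped to whatever your actual export contains:

```python
# Rank blog posts from a (hypothetical) Google Analytics export and pick
# the top candidates for updating. Field names are assumptions -- adjust
# them to match your real export columns.
rows = [
    {"page": "/blog/keyword-research", "unique_views": 310, "word_count": 450},
    {"page": "/blog/link-building",    "unique_views": 120, "word_count": 800},
    {"page": "/blog/old-news-post",    "unique_views": 3,   "word_count": 200},
]

def top_posts(rows, n=10):
    """Return the n posts with the most unique views."""
    return sorted(rows, key=lambda r: r["unique_views"], reverse=True)[:n]

for post in top_posts(rows):
    print(post["page"], post["unique_views"])
```

From there, the shortlist can be cross-checked by hand against the topics you want to rank for before investing time in rewrites.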
When you have thin, poorly performing content on your site, you aren't able to learn enough about what you're doing right to make a sound call. So to create more content, even "better" content, is likely a mistake. The wise approach is to use the content you have to investigate additional content ideas that would better serve your audience. Through social media and additional traffic to your site, you should be able to better discern what pieces of content will provide the greatest benefit in the future.
The old content is likely to perform much better as well.
RS
-
-
It's difficult to talk in terms of true value. Some of them may provide some value, but they pale in comparison to the new blog posts we have lined up, and in my opinion they bring the blog down; personally I wouldn't be sad to see them go.
I think it's time to exterminate.
Sam
-
Do the contents of these blog posts provide any value at all to the reader? Are they written well, and would you actually be sad to see them go? If yes, then refer to my previous response on re-purposing them to create even better content with more SEO value.
If not, and you're just worried about SEO, I'd say be rid of them, based on those stats.
-
Thanks all, from my analysis:
In the last twelve months:
376 pages (although I'd estimate 70 of these aren't pages)
104 pages have a bounce rate of 100%
307 pages have fewer than 20 unique views (for the previous 12 months), but combined these account for 1,374 views, which is a sizable sum.
So the question is: is it worth pulling all the pages below 20 unique views, and all the 100% bounce rate pages, from the site? Will it actually benefit our SEO, or am I just making work for myself?
I'd love to hear from people who've actually seen positive SEO movements after removing thin pages.
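For what it's worth, flagging those candidate pages can be automated against an analytics export. This is a sketch under assumed field names (`path`, `unique_views`, `bounce_rate`); swap in whatever your GA export actually uses:

```python
# Flag thin pages: fewer than 20 unique views over 12 months, or a
# 100% bounce rate. The dictionary fields below are hypothetical --
# map them to your real GA export columns.
pages = [
    {"path": "/blog/post-a", "unique_views": 4,  "bounce_rate": 100.0},
    {"path": "/blog/post-b", "unique_views": 85, "bounce_rate": 62.0},
    {"path": "/blog/post-c", "unique_views": 19, "bounce_rate": 90.0},
]

def thin_pages(pages, view_threshold=20):
    """Pages below the view threshold, or with a 100% bounce rate."""
    return [p for p in pages
            if p["unique_views"] < view_threshold or p["bounce_rate"] >= 100.0]

candidates = thin_pages(pages)
total_views = sum(p["unique_views"] for p in candidates)
print(len(candidates), "candidate pages,", total_views, "combined unique views")
```

Totalling the combined views of the candidates, as above, is exactly the "1,374 views" check: it shows how much traffic is actually at stake before anything is pulled.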
-
It's a waste of good content to remove it just because it's considered "thin." In your position, I would group these under-performing posts into topical themes, then compile and update them to create "epic content" in the form of detailed guides, or whatever format best suits the material. Expand the combined post so it has a logical structure (and doesn't read as if you simply stuck multiple posts together), then redirect the old post URLs to the newly created relevant posts. Not only will you have fresh content that could provide a ton of value to your readers, but the SEO value of these so-called "epic" posts should, in theory, be more impactful.
Good luck, whatever you decide to do!
-
My rule of thumb would be:
Take offline all pages that have under 30 organic sessions per month.
Like Dmitrii already mentioned, check your past data for these posts and look at average session duration, bounce rate, and pages per session, which you can use to validate the quality of the traffic. If there are posts with decent stats, don't take them offline; rather, update them or write a new blog post on the topic and set up a redirect. In that case, also have a look in GWT at the actual search queries (you may find some useful new insights).
-
Hi there.
Are those blog posts ranking at all for any related keyphrases? Also, what about bounce rate and time on page for those 2 visits a day? Are you sure those visits aren't bots/crawlers?
We did a similar reduction about 6 months ago and haven't seen any drop in rankings. The share of traffic to the thin pages was pretty small, the bounce rate was high, and time on page was very short. So why keep anything that doesn't do any good?