Deleting low-quality content
-
Hi there. I have a question about deleting low-quality content pages, and I'd appreciate any feedback.
We have a B2C e-commerce store, and product pages are our target landing pages (LDPs) from search. Over the years we've built many informational pages related to different products, and these link to the related product pages.
The problem is that many of them lack quality content, in terms of both volume and substance, and they aren't helping. Organic traffic has been declining since early this year, after peaking in February.
So I'm considering deleting the pages that we (and Moz) consider low quality and that aren't receiving any search traffic.
Firstly, is that a good idea? Secondly, how should I go about it? Should I just delete them and set up redirects so the deleted pages point to related pages, or even the homepage?
Looking forward to any expert input.
-Yuji -
You do need to get proper SEO advice, but often we don't advise deleting the page; we advise improving it substantially.
If you have duplicated content, remove it and replace it with well-written, white-hat, high-quality content. This is how we've improved many businesses' local SEO: by improving the on-page SEO rather than deleting pages completely.
-
It would be best to talk to an SEO agency to get advice before you delete any blog posts or main pages.
-
Thanks for your advice. Yes, we will definitely be careful about deleting pages. Thanks a lot!
-
That's a really good idea! Cut what you have to manage down to the essentials and then spend more time on those pages. Make sure you run some kind of ranking or traffic audit against all the pages first, though. You don't want to delete the versions of each page that still have some SEO power, even if it's small; you want to target the ones Google isn't using.
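For what it's worth, here's a minimal sketch of how that kind of audit could be scripted, assuming you've exported per-URL clicks and impressions (e.g. from Search Console) to a CSV. The file name, column names, and thresholds below are placeholders, not a standard format:

```python
import csv

# Hypothetical export: one row per URL with its organic clicks and impressions
# over the last 12 months (the column names here are assumptions).
INPUT_FILE = "page_performance.csv"      # columns: url, clicks, impressions
OUTPUT_FILE = "deletion_candidates.csv"

CLICK_THRESHOLD = 0        # pages with no clicks at all...
IMPRESSION_THRESHOLD = 10  # ...and almost no impressions either

candidates = []
with open(INPUT_FILE, newline="") as f:
    for row in csv.DictReader(f):
        clicks = int(row["clicks"])
        impressions = int(row["impressions"])
        # Only flag pages Google is effectively ignoring; anything with even a
        # little traffic is a candidate for improvement or consolidation instead.
        if clicks <= CLICK_THRESHOLD and impressions <= IMPRESSION_THRESHOLD:
            candidates.append(row["url"])

with open(OUTPUT_FILE, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["url"])
    writer.writerows([url] for url in candidates)

print(f"{len(candidates)} pages flagged for manual review")
```

Anything the script flags should still be reviewed by hand before you delete or redirect it.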
-
Thanks a lot for your feedback. It was helpful. I think we may need to remove pages, keeping only the unique ones, and update their content to be more valuable. Thanks!
-
This is, generally speaking, **not the right mindset** to succeed.
When Google says (through decreasing ranking positions) that you haven't put in enough effort, usually deleting a poor attempt garners no favour in the ranking results. Think about it. Google are saying "you don't have enough quality content" and your answer is to delete content, thus having less than before. Does that seem like a genuine attempt to comply with the increasing stringency of Google's guidelines?
Deleting stuff is the easy way out. Think of it as if you wrote an essay in college and Google were the examiner. They give you a D- for your essay and mark certain areas of your work as needing improvement. If you deleted those paragraphs, did nothing else, and re-submitted the essay, would you honestly expect a better grade?
Google want to see effort, unique content, and value-add for end users. _Real_ hard graft.
If you have high volumes of pages that are identical other than one tiny tab of information or a variable price, then maybe streamlining your architecture by removing pages is the answer. If most of the pages are unique in function (e.g. factually different products, not just parameter-based URL variants), then it's more a comment on the lack of invested effort, and you must tackle your mindset if you want to rank.
N.B.: By effort I don't mean your personal effort; I could also be alluding to the fact that the budget was too low when producing the content. I'm describing the site, not you personally!
Related Questions
-
Content Publishing Volume/Timing
I am working with a company that has a bi-monthly print magazine that has several years' worth of back issues. We're working on building a digital platform, and the majority of articles from the print mag - tips, how-tos, reviews, recipes, interviews, etc - will be published online. Much of the content is not date-sensitive except for the occasional news article. Some content is semi-date-sensitive, such as articles focusing on seasonality (e.g. winter activities vs. summer activities). My concern is whether, once we prepare to go live, we should ensure that ALL historical content is published at once, and if so, whether back-dates should be applied to each content piece (even if dating isn't relevant), or whether we should have a strategy in place in terms of creating a publishing schedule and releasing content over time - albeit content that is older but isn't necessarily time-sensitive (e.g. a drink recipe). Going forward, all newly-created content will be published around the print issue release. Are there pitfalls I should avoid in terms of pushing out so much back content at once?
Intermediate & Advanced SEO | andrewkissel0 -
Publishing content in two or more places?
I've been thinking about publishing an article on LinkedIn and then posting the same article to the news page on the website. It would be high-quality, informative, and useful, but is that likely to cause any duplicate content issues?
Intermediate & Advanced SEO | seoman100 -
Duplicate content: the distributors are copying the content of the manufacturer
Hi everybody! While I was checking all the points of the Technical Site Audit Checklist 2015 (great checklist!), I found that the distributors of my client are copying part of the content to add it to their websites. When I take a content snippet, put it in quotes, and search for it, I get four or five sites that have copied the content. They are distributors of my client. The first result is still my client (the manufacturer), but... should I recommend any action in this situation? We don't want to bother the distributors with obstacles. Could this situation be a problem, or is it a common situation where Google knows perfectly well where the content is coming from? Any recommendation? Thank you!
Intermediate & Advanced SEO | teconsite0 -
Consensus on disavowing low-quality auto-generated links (e.g. webstatsdomain.org, etc.)?
Is there a consensus in the SEO world around the best practice for how to treat the many auto-generated links for a domain? In a lot of the link profiles we have been analyzing, nearly 70% of the backlinks by volume come from these auto-generated sites (e.g. similarweb.com, informer.com, webstatsdomain.org, etc.). I can see arguments for disavowing them (low-quality links) as well as for keeping them (they skew anchor text distribution towards URL mentions, which looks like a natural link profile), but I would be interested to hear whether people have run experiments or strongly prefer one way or the other.
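As a rough illustration of the kind of link-profile analysis described above, here's a sketch that estimates what share of a backlink export comes from known auto-generated stats domains. The CSV layout, the column name, and the domain list are assumptions to adapt to whatever your link tool exports:

```python
import csv
from urllib.parse import urlparse

# Domains of auto-generated "stats" sites seen in many link profiles; extend as needed.
AUTO_GENERATED_DOMAINS = {"similarweb.com", "informer.com", "webstatsdomain.org"}

def linking_domain(url: str) -> str:
    # Crude domain extraction: strips a leading "www." but ignores other subdomains.
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

total = 0
auto_generated = 0
# Hypothetical backlink export with one "source_url" column per linking page.
with open("backlinks.csv", newline="") as f:
    for row in csv.DictReader(f):
        total += 1
        if linking_domain(row["source_url"]) in AUTO_GENERATED_DOMAINS:
            auto_generated += 1

if total:
    share = auto_generated / total
    print(f"{auto_generated}/{total} backlinks ({share:.0%}) come from auto-generated stats sites")
```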
Intermediate & Advanced SEO | petersocapro1 -
Duplicate content reported on WMT for 301 redirected content
We had to 301 redirect a large number of URLs. Now Google WMT is telling me that we have tons of duplicate page titles. When I looked into the specific URLs, I realized that Google is listing an old URL and the 301-redirected new URL as the sources of the duplicate content. I confirmed the 301 redirect by using a server header tool to check the correct implementation of the 301 redirect from the old to the new URL. Question: Why is Google Webmaster Tools reporting duplicated content for these pages?
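The same header check can also be scripted rather than done in a one-off tool. Here's a minimal sketch using Python's requests library to confirm that each old URL returns a 301 pointing at the expected new URL; the URL pairs below are placeholders:

```python
import requests

# Hypothetical mapping of old URLs to the new URLs they should 301 to.
REDIRECTS = {
    "https://example.com/old-page": "https://example.com/new-page",
    "https://example.com/old-category/item": "https://example.com/new-category/item",
}

for old_url, expected_target in REDIRECTS.items():
    # allow_redirects=False lets us inspect the first response instead of following the chain.
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    status = response.status_code
    location = response.headers.get("Location", "")
    ok = status == 301 and location == expected_target
    print(f"{old_url} -> {status} {location} {'OK' if ok else 'CHECK'}")
```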
Intermediate & Advanced SEO | SEOAccount320 -
What do I do about sites that copy my content?
I've noticed that there are a number of websites that are copying my content. They are putting the full article on their site, mentioning that it was reposted from my site, but including no link back to me. How should I approach this? What are my rights, and should I ask them to remove it or add a link? Will the duplicate content affect me?
Intermediate & Advanced SEO | JohnPeters0 -
Duplicate Content in News Section
Our client's site is in the hunting niche. According to Webmaster Tools there are over 32,000 indexed pages. In the news section there are 300-400 news posts where, over the course of about 5 years, they manually copied relevant press releases from different state natural resources websites (ex. http://gfp.sd.gov/news/default.aspx). This content is relevant to the site visitors, but it is not unique. We have since begun posting unique news posts, but I am wondering if anything should be done with these old news posts that aren't unique. Should I use the rel="canonical" tag or a noindex tag for each of these pages? Or do you have another suggestion?
Intermediate & Advanced SEO | rise10 -
Duplicate Content on Blog
I have a blog I'm setting up. I would like to have a mini "about" block on every page that gives very brief information about me and my blog, as well as a few links to the rest of the site and some social sharing options. I worry that this will get flagged as duplicate content because a significant number of my pages will contain the same information at the top of the page, front and center. Is there anything I can do to address this? Is it as much of a concern as I am making it out to be? Should I work on finding some JavaScript/AJAX method for loading that content into the page dynamically only for normal browser pageviews? Any thoughts or help would be great.
Intermediate & Advanced SEO | grayloon0