What Should I Do With Low Quality Content?
-
My site has definitely been hit by Panda, so I am in the process of cleaning my website of low quality content.
Needless to say, the shitty articles are being removed completely, but I think a lot of this content is now low quality only because it is obsolete and dated.
So what should I do with this content?
Should I rewrite those articles as completely new posts and link from the old posts to the new ones? Or should I delete the old posts and do a 301 redirect to the new post?
Or should I rewrite the content of these articles in place so I can keep the old URL and backlinks?
One thing to note is that I've got a lot more followers than I used to, so publishing a new post gets a lot more views, likes, and shares from social networks.
-
I wouldn't rewrite old posts. If they can be refreshed or added to with recent updates, go ahead and redirect (as long as redirecting doesn't lose any additional info) or link to the new version.
Things get tricky if there's nothing new that can be written about the post. First, kill the really bad stuff, as Mike suggested, and keep the good stuff. The stuff on the borderline is probably not worth keeping unless it was still receiving traffic. In my experience with Panda, using 410s on bad pages is better than redirecting, but you will probably want to 301 redirect to the next-best page if you have good links.
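To make the mechanics concrete, here is a minimal .htaccess sketch of that split (the paths and domain below are invented for illustration, not taken from this thread): a 410 for pages you're simply killing, and a 301 to the next-best page for anything that has earned good links.

    # Hypothetical URLs - pages with no value and no links return 410 Gone
    Redirect gone /2008/thin-article-one
    Redirect gone /2008/thin-article-two
    # Pages with good inbound links get a 301 to the closest relevant page
    Redirect 301 /2008/dated-but-linked-article https://www.example.com/updated-guide

The Redirect directive is part of Apache's mod_alias module, so no rewrite rules are needed for simple cases like this.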
If it was still receiving organic traffic, think about what you can do to make it better or provide additional resources and reading. Try to save traffic-generating pieces by improving them and making them useful to the people who were landing on them. For high-traffic pieces, you will want to look at the organic keywords and make sure the page somehow answers the query.
As always with Panda, make sure your design doesn't turn people off and that you're not filling the template with too many ads.
-
No, 404s are fine. They just won't pass link juice, and Google will eventually stop crawling them; ideally they will phase out anyway as the internet moves along.
-
And what about really crappy content? If I delete it, I end up with lots of 404 errors. Does that pose a problem for Google?
-
In this case, would you change the date to post it as "new" content? Because even if I rewrite it, I can't post an article from 2008 to the website's Facebook page.
-
I think it all depends on how bad the content is. If you have content that is complete and total crap (10+ instances of the same keyword, reads like a toddler wrote it, etc.), it is better just to kill it and redirect those pages elsewhere. On the other hand, if the content is salvageable, take the time to rewrite it and make it good. The benefit is that at the end of the day you have good content instead of a bunch of links redirected to pages that don't necessarily have anything to do with the old content.
Good luck!
P.S. Don't forget the disavow tool if you need it!
-
There is absolutely nothing wrong with multiple 301's pointing to the same page.
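For example (URLs invented purely for illustration), merging two dated posts into one updated article just means two ordinary rules that share a destination:

    # Both hypothetical old posts point at the single merged article
    Redirect 301 /2008/old-article-part-1 https://www.example.com/updated-merged-article
    Redirect 301 /2009/old-article-part-2 https://www.example.com/updated-merged-article

Each rule is followed independently, so sharing a target causes no issues.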
-
What if I have two dated articles that I merge into one updated article? Does it matter if I have two 301 redirects to the same URL?
-
With a 301 redirect? That's gonna be a huge .htaccess file.
-
Kill it and redirect if there are any incoming backlinks. Definitely.
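On the .htaccess size worry raised above: if the old URLs share a predictable structure, a single pattern rule (a suggestion of mine, not something covered in the thread) can replace hundreds of one-off lines. A hypothetical example:

    # Hypothetical pattern - sends every post under /2008/ to its renamed
    # equivalent under /blog/ with one rule
    RedirectMatch 301 ^/2008/(.*)$ https://www.example.com/blog/$1

RedirectMatch is also part of mod_alias; for very large or irregular redirect lists, handling the redirects at the application level is usually easier to maintain than a long .htaccess file.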