Did our shared articles cause a Panda hit to our high-quality site?
-
Hello,
We run a quality site that was hit by Panda.
Our article collection:
http://www.nlpca(dot)com/DCweb/NLP_Articles.html
is partly articles written by the site owners and partly articles that also appear elsewhere on the web.
We have permission to post every article, but I don't know if Google knows that.
Could this be why we were hit by Panda? And if so, what do we do? We've dropped way down in the rankings but have worked our way about halfway back up.
Two of our main keywords are:
NLP
NLP Training
Thanks!
-
You have some valid points to consider... things seem to be improving and the articles that you might cut do pull in some traffic.
I can't tell you how to make your decision, but here is how I made mine...
I had hundreds of republished articles, but a lot more that I had written myself. Deleting the republished articles would cut my traffic and my income; so would noindexing them. Those were serious losses, but they were small compared to the rest of the content on my site. So, knowing that Google does not like duplicate content, I got rid of them. There is still lots of great content on my site, visitors still find plenty to read, and I know which of the things I cut are worth replacing with customized versions of my own.
The upside... My site is more compact but still has thousands of content pages, and the content that remains should be a lot stronger. After making the cuts, my rankings, income and traffic increased. Not quite to previous levels, but back to nice numbers.
I have reduced my risk and am pleased with that. Everything that I cut was redirected to similar content, and the most valuable of what was cut will be replaced with custom content, with 301 redirects from the old URLs.
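A sketch of what those redirects might look like in an Apache .htaccess file; the paths here are hypothetical examples, not the actual URLs from this thread:

```apache
# .htaccess — permanently (301) redirect each removed article to the
# closest surviving page, so visitors and link equity are preserved.
Redirect 301 /articles/republished-guide.html /articles/original-guide.html
Redirect 301 /articles/old-syndicated-post.html /topics/related-hub.html
```

The same mapping exercise (old URL → most similar remaining page) applies whether the rules live in .htaccess, the main server config, or a CMS redirect plugin.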
============================
> How likely is it that this list of 60 articles (out of 200 pages) is causing, or will cause, a major problem with past or future Panda updates? 17 of the 60 are by us, a few were written for us, and several more show only us when you type the title into Google surrounded by quotes.
What from this list is unique? Definitely keep that. Keep what is not struggling in Google. Keep what is essential to your site, but replace it with better content that you create yourself.
> Do you see further risk in future Panda updates?
Yep... that's why I cut off my foot.
> My thoughts are to rel=author each of our own articles,
YES... In the past I wanted all of my content to be anonymously written. I have changed my mind on that and used rel=author on the best stuff.
> no-index the duplicates between our 3 sites (we have 3 sites that share a few articles), and no-index the remaining articles.
Heh... here I would chop off two of those sites and merge them into one. I would have done that years ago, before Panda was ever heard of.
> I think that the drop in traffic will be outweighed by the reduced risk of current or future ranking drops.
I agree.
-
Hi EGOL,
We are getting a lot of traffic from some of these articles, so the site owners are not sure they want to no-index them, in case the duplicates are not what's causing the problem. Our ranking has come up from 40 to 26 on our main term, and similarly for other terms, even though we still have duplicate content. We were originally at 19 before a big drop in November/December.
How likely is it that this list of 60 articles (out of 200 pages) is causing, or will cause, a major problem with past or future Panda updates? 17 of the 60 are by us, a few were written for us, and several more show only us when you type the title into Google surrounded by quotes.
What would you suggest I let the owners know? Do you see further risk in future Panda updates?
My thoughts are to rel=author each of our own articles, no-index the duplicates between our 3 sites (we have 3 sites that share a few articles), and no-index the remaining articles. I think that the drop in traffic will be outweighed by the reduced risk of current or future ranking drops.
However, it's not my decision, your thoughts?
-
I don't know. Everything that I have done is an experiment.
If you are really scared, delete... if you have some tolerance for uncertainty, then play around with noindex or canonical. I deleted from a really important site... and used canonical where the ranking loss was small and the risk was not jugular.
-
Hi EGOL,
When is no-indexing enough and when would you suggest deletion?
-
> Can we no-index all the duplicate stuff? Or is some deletion necessary?
On one of my sites I deleted a lot and applied noindex,follow to everything else that was a duplicate. We saw rankings recover in about a month.
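The "noindex,follow" combination mentioned here is set with a robots meta tag in the head of each duplicate page; it asks search engines to drop the page from the index while still crawling the links on it:

```html
<!-- In the <head> of each duplicate article page: keep the page out
     of the index, but let its internal links still pass value. -->
<meta name="robots" content="noindex, follow">
```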
On another site I had a lot of .pdf documents that were used to control printing of graphics. We used rel=canonical on them. That works very slowly to remove them from the index, and we are seeing a slow recovery on that site.
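Since a PDF can't carry a meta tag in its markup, rel=canonical for non-HTML files is typically sent as an HTTP Link header. A hedged Apache sketch, with a hypothetical filename and target URL:

```apache
# Point a printable PDF at its HTML original via an HTTP response
# header, since we can't put a <link rel="canonical"> inside a PDF.
<Files "brochure-print.pdf">
  Header add Link '<https://www.example.com/brochure.html>; rel="canonical"'
</Files>
```

Google supports rel=canonical in HTTP headers for exactly this case, though, as noted above, consolidation via canonical tends to be slower than noindex or deletion.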
> If I take the first two sentences of an article and type them into Google, and someone else shows up above us, do we need to no-index that article?
If the article belongs to someone else, then I would noindex or delete it. (Just saying what I would do if it was on my site.) If it was my content, I would set up a Google+ profile and use rel=author and rel=me to attribute it to a verified author.
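At the time of this thread, authorship was claimed by linking the article to a Google+ profile with rel=author, with a reciprocal rel=me link confirming the identity. A sketch, with a hypothetical author name and profile URL:

```html
<!-- On the article page: byline link to the author's Google+ profile. -->
<a href="https://plus.google.com/112345678901234567890" rel="author">Jane Doe</a>

<!-- On the author's own about/profile page: link back to the same
     profile to confirm the identity. -->
<a href="https://plus.google.com/112345678901234567890" rel="me">My Google+ profile</a>
```

Verification also required the Google+ profile itself to link back to the site (e.g. in its "Contributor to" section) so the two-way relationship could be confirmed.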
-
Perhaps you could add a link to the original source on some of these where you have permission. This should send a signal to Google that you are showing the article on your site for the convenience of users, but that it comes from a different source.
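That signal can be as simple as a visible attribution line near the top of the republished article; the publisher name and URL below are hypothetical placeholders:

```html
<!-- Attribution line for a republished article, linking to the
     original source so the provenance is clear to users and crawlers. -->
<p>This article is republished with permission from
   <a href="https://www.example.com/original-article">Example Journal</a>.</p>
```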
-
Can we no-index all the duplicate stuff? Is that enough to save our arse? Or is some deletion necessary?
I assume that if we are not first in Google for the content and title of an article, it is a potential duplicate-content problem, correct? For example, if I take the first two sentences of an article and type them into Google, and someone else shows up above us, do we need to no-index that article?
Any advice is appreciated. You're one of the best EGOL.
-
> We have permission to post every article, but I don't know if Google knows that.
Google probably does not know and certainly does not care. If you have duplicate content on your site you are a potential target.
What type of link-building have you been doing? You might have been hit by the overoptimization penalty.
I was republishing some third-party content on a couple of my sites. I deleted most of it and noindexed the rest. Cut off your foot to save your arse.