Did our shared articles cause a Panda hit to our high-quality site?
-
Hello,
We are a quality site that was hit by Panda.
Our article collection:
http://www.nlpca(dot)com/DCweb/NLP_Articles.html
is partially articles written by the site owners and partially articles that also appear elsewhere on the web.
We have permission to post every article, but I don't know if Google knows that.
Could this be why we were hit by Panda? And if so, what do we do? We've dropped way down in rank but have worked our way half-way back up.
Two of our main keywords are:
NLP
NLP Training
Thanks!
-
You have some valid points to consider: things seem to be improving, and the articles that you might cut do pull in some traffic.
I can't tell you how to make your decision, but here is how I made mine.
I had hundreds of republished articles, but a lot more that I had written myself. Deleting lots of republished articles would cut my traffic and cut my income. Noindexing them would do the same. However, although those were serious losses, they were small in comparison to the rest of the content on my site. So, knowing that Google does not like duplicate content, I got rid of them. There is still lots of great content on my site, visitors still find plenty to read, and I know which of the cut pieces I should rewrite as customized versions for my own site.
The upside: my site is more compact but still has thousands of content pages, and the content that remains should be a lot stronger. After making the cuts, my rankings, income, and traffic increased. Not quite to previous levels, but back to nice numbers.
I have reduced risk and am pleased with that. Everything that I cut was redirected to similar content. The most valuable of what was cut will be replaced with custom content, with 301 redirects from the old URLs.
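(For reference, a 301 redirect from a removed article to its closest surviving page can be done in a single line of Apache's .htaccess; the paths here are hypothetical, not taken from the thread:)

```apacheconf
# Permanently redirect a deleted republished article to the nearest
# equivalent surviving page (paths are made up for illustration)
Redirect 301 /articles/old-republished-article.html /articles/my-own-version.html
```

The same directive is repeated once per removed URL, so search engines and old inbound links are sent to the replacement content rather than a 404.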
============================
How likely is it that this list of 60 articles (out of 200 pages) caused, or will cause, a major problem with past or future Panda updates? 17 of the 60 are by us, a few were written for us, and several more show up as only us when you type the title into Google surrounded by quotes.
Which of these articles are unique? Definitely keep those. Keep what is not struggling in Google. Keep what is essential to your site, but replace it over time with better content that you create yourself.
Do you see further risk in future panda updates?
Yep... that's why I cut off my foot.
My thoughts are to rel=author each of our own articles,
YES... In the past I wanted all of my content to be anonymously written. I have changed my mind on that and have used rel=author on the best stuff.
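(At the time of this discussion, rel=author meant linking an article to a verified Google+ profile. A minimal sketch, with a placeholder profile URL and author name:)

```html
<!-- In the <head> of each article you wrote yourself;
     the profile URL here is a placeholder, not a real account -->
<link rel="author" href="https://plus.google.com/112345678901234567890/posts">

<!-- Or as a visible byline link in the article body -->
<p>By <a rel="author" href="https://plus.google.com/112345678901234567890/posts">Jane Author</a></p>
```

The matching Google+ profile then links back to the site, completing the two-way verification.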
no-index the duplicates between our 3 sites (We have 3 sites that share a few articles) and no-index the remaining articles.
heh.... Here I would chop off two of those sites and merge them into one. I would have done that years ago, before Panda was ever heard of.
I think that the drop in traffic will be outweighed by the reduced risk of current or future ranking drops.
I agree.
-
Hi EGOL,
We are getting a lot of traffic from some of these articles, so the site owners are not sure they want to noindex them, just in case that's not what is causing the problem. Our rankings have come up from 40 to 26 on our main term, and similarly for other terms, even though we still have duplicate content. We were originally at 19 before a big drop in November/December.
How likely is it that this list of 60 articles (out of 200 pages) caused, or will cause, a major problem with past or future Panda updates? 17 of the 60 are by us, a few were written for us, and several more show up as only us when you type the title into Google surrounded by quotes.
What would you suggest I let the owners know? Do you see further risk in future Panda updates?
My thoughts are to rel=author each of our own articles, noindex the duplicates between our 3 sites (we have 3 sites that share a few articles), and noindex the remaining articles. I think that the drop in traffic will be outweighed by the reduced risk of current or future ranking drops.
However, it's not my decision, your thoughts?
-
I don't know. Everything that I have done is an experiment.
If you are really scared, delete... if you have some tolerance for uncertainty, then experiment with noindex or canonical. I deleted from a really important site... and used canonical where the ranking loss was small and the risk was not jugular.
-
Hi EGOL,
When is no-indexing enough and when would you suggest deletion?
-
Can we no-index all the duplicate stuff? Or is some deletion necessary?
On one of my sites I deleted a lot and applied noindex,follow to everything else that was a duplicate. We saw rankings recover in about a month.
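(The noindex,follow approach described here is just a meta tag in the `<head>` of each duplicate page:)

```html
<!-- Keeps this page out of the search index, but still lets crawlers
     follow its outbound links so internal link equity flows through -->
<meta name="robots" content="noindex, follow">
```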
On another site I had a lot of .pdf documents that were used to control printing of graphics. We used rel=canonical on them. That works very, very slowly to remove them from the index. We are seeing slow recovery on that site.
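(Since a PDF has no `<head>` to put a canonical tag in, rel=canonical for non-HTML files is sent as an HTTP Link header instead. A sketch in Apache .htaccess, assuming mod_headers is enabled; the filename and URL are hypothetical:)

```apacheconf
# Point a printable PDF at the HTML page it duplicates via an
# HTTP Link header (filename and URL are placeholders)
<Files "print-version.pdf">
  Header add Link '<https://www.example.com/graphics-page.html>; rel="canonical"'
</Files>
```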
If I take the first two sentences of an article and type it into Google, and someone shows up above us, do we need to noindex that article?
If the article belongs to someone else, then I would noindex or delete it. (Just saying what I would do if it was on my site.) If it was my content, I would set up a Google+ profile and use rel=author and rel=me to attribute it to a verified author.
-
Perhaps you could add a link to the original source on some of these where you have permission. This should send a signal to Google that you are showing the article on your site for the convenience of users, but that it comes from a different source.
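(A source-attribution note like the one suggested here might look like this at the end of a republished article; the class name and URL are made up for illustration:)

```html
<!-- Hypothetical attribution block for a republished article -->
<p class="source-note">
  Republished with permission. Original article at
  <a href="https://www.example.com/original-article">Example.com</a>.
</p>
```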
-
Can we no-index all the duplicate stuff? Is that enough to save our arse? Or is some deletion necessary?
I assume that if we are not first in Google for the content and title of an article, it is a potential duplicate-content problem, correct? For example, if I take the first two sentences of an article and type it into Google, and someone shows up above us, do we need to noindex that article?
Any advice is appreciated. You're one of the best EGOL.
-
We have permission to post every article, but I don't know if Google knows that.
Google probably does not know and certainly does not care. If you have duplicate content on your site, you are a potential target.
What type of link-building have you been doing? You might have been hit by the overoptimization penalty.
I was republishing some third-party content on a couple of my sites. I deleted most of it and noindexed the rest. Cut off your foot to save your arse.