Is it Panda? How to deal with AP and other newswire articles
-
A site I run lost 30% of its traffic in June, then another 10% in July. Is it Panda?
The site has tens of thousands of AP and other syndicated articles on it. They are not there for search engine benefit; they are categorized and relevant to the people who read them, and the site gets half of its traffic from type-ins and bookmarks.
Should I nofollow the articles or rel="canonical" them? What else could help?
Cheers
-
Thanks again. I guess I will have to look through the keywords, see what traffic these news pages are still getting from Google, and then weigh up whether to tag them.
-
Panda isn't a penalty per se; it is an algorithmic change to how Google ranks sites and pages. If your site has duplicated content on it, you will need to fix all of it. Once your site has been cleaned up, it can take a month or more for Google to fully re-index your entire site and see that all of the duplicated content is gone or properly handled (i.e. noindexed or canonicalized).
It's not as if having 1% duplicated content is what affects your site; no one knows for sure what exact percentage triggers the effect, so your best course of action is to clean it all up.
By using the canonical tag, these pages will be removed from the index for your site. The "harm" is that if someone searches for those pages, your site won't be listed unless you have comments relevant to the search query.
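As a minimal sketch of what that looks like, a syndicated article page can declare the original source as canonical in its `<head>` (the URLs below are made-up examples, not from the site in question):

```html
<head>
  <title>Example Syndicated AP Story</title>
  <!-- Cross-domain canonical: tells search engines the original article
       is the authoritative version, so this copy is dropped from the
       index in favor of the canonical URL -->
  <link rel="canonical" href="https://www.example-newswire.com/original-story" />
</head>
```

Note that a cross-domain canonical is a hint, not a directive; Google can choose to ignore it if the pages differ substantially.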
-
Thanks. Is there any way I could trial this by adding the tag to just a few pages or sections? Or is Google using domain-level metrics, having decided the site is junk because it has so much duplicated content?
The articles are slightly changed and there are comments on them.
What harm could I cause by trialing the canonical tag? If I took it off later, could there be some recovery time?
Thanks a lot
-
If this content is merely a duplication of articles that exist elsewhere, then yes, you can add a canonical tag pointing to the source.
You would definitely not want to "nofollow" these pages. By adding a nofollow tag you are telling search engines not to flow PageRank to the other links they find on the page. That is not the result you desire.
You could noindex the pages as well. Before doing so, I would ask whether you offer comments or other user-generated content. If you do not, the noindex tag is fine. If you do offer UGC, then I would recommend the canonical tag.
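To illustrate the difference between the two robots directives discussed above (a sketch for a hypothetical article page):

```html
<head>
  <!-- "noindex, follow": keeps this page out of the search index but
       still lets crawlers follow its links and pass PageRank through -->
  <meta name="robots" content="noindex, follow" />

  <!-- By contrast, "nofollow" would stop PageRank from flowing to the
       links on the page, which is NOT the desired result here:
       <meta name="robots" content="noindex, nofollow" />
  -->
</head>
```

The canonical tag and the noindex meta tag should not be combined on the same page, since they send conflicting signals about which URL should be indexed.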