What's the extent of the penalization applied by Google?
-
Hi!
I still don't understand the website penalization Google applies for duplicate content.
My site (a Photoshop website) has many pages that ranked in the first positions for top keywords.
Those pages were linked by sites like LifeHacker, BoingBoing, Microsiervos, SmashingMagazine, John Nack, and many other well-known blogs.
After mid-February 2012, everything went down the drain. I lost half of my traffic, and my well-ranked pages are now almost nowhere to be found.
I have plenty of ads on some pages of my site, and duplicate content (Amazon product descriptions only) on other pages.
So the good-quality pages my site has are no longer considered good quality just because I also have some duplicate-content or ad-filled pages?
I'm not complaining. I'm trying to understand this.
Google needs to serve good information to its visitors. But because it found some trash on my site, it decides to remove both the trash and the good information from the search results?
That doesn't sound logical to me. Why not just remove the trash and leave the good content?
Of course, I understand that new information is added every day and someone may come up with something better than mine, but dropping 40 or more places in the rankings sounds more like a penalty to me.
Again, I'm not complaining (although it sounds like I am!); I just want to understand the reasons behind this.
Thanks,
Enrique
-
Yes, thanks, Anthony. I will post back as soon (soon...?) as I find something.
Enrique
-
Sometimes what you call natural, Google calls spammy or unnatural. Just sayin'. Good luck. Post back with your findings. I'm interested to see how things work out for you. Best regards.
-
Yes, thanks, I will check that. I was planning to add nofollow to the Amazon pages. I will also check the anchors, but since they are all natural, any change I make will look artificial.
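For reference, here's roughly what I had in mind for the nofollow change (just a rough sketch, assuming the Amazon pages are static HTML files; the directory path is hypothetical):

```python
# Rough sketch: add rel="nofollow" to outbound Amazon links in static HTML pages.
# The directory "site/amazon-pages" is hypothetical; adjust to your own layout.
from pathlib import Path

from bs4 import BeautifulSoup  # pip install beautifulsoup4

for page in Path("site/amazon-pages").glob("*.html"):
    soup = BeautifulSoup(page.read_text(encoding="utf-8"), "html.parser")
    changed = False
    for link in soup.find_all("a", href=True):
        if "amazon." in link["href"]:
            link["rel"] = ["nofollow"]  # mark the affiliate link as nofollow
            changed = True
    if changed:
        page.write_text(str(soup), encoding="utf-8")
```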
Enrique
-
Have you tried removing the Amazon data feed from those pages, just to see if that is in fact what is impacting your rankings? What about the thousands of natural links pointing to your site? Are they all using varied anchor text, or is it just five anchors for your five main pages? If just five, that could also be affecting your ranking.
-
Yes, I know that's the thing to do, but you must agree with me that it's somewhat unnatural.
I have thousands of incoming links, and I exchanged or asked for fewer than 20 of them. The rest are natural. If I spent time analyzing links, it would be absolutely artificial.
The same goes for quality pages. Let's say I have four or five pages that are the most referenced in my industry (just an example, of course). Visitors who read those pages get really good, top-class information. But I have an Amazon datafeed on my site.
Suddenly, the information on those top-quality pages is hidden from Google's search results because my site has an Amazon datafeed?
I know it's a simplistic example, but it can be translated as:
"A good article isn't good anymore just because of a site penalty"
It seems that Google is saying something like, "Hey, you can't read this amazing article because it comes from a site that has lots of junk. So suck it up and read this lesser-quality article from a pristine site!"
It is not about my site anymore, but about trying to understand the concept of it all. And of course it is an extreme example, but I think it is relevant.
-
No, Google does care about good-quality pages. It's just that if you throw in a bunch of bad pages, they dilute the goodness of your good pages. Once you clean up the duplicate content, I would suggest running a report on your inbound links. Check to see if your anchor text is spammy or concentrated on only a few choice keywords. When it comes to link building, you want to spread out the keywords so there aren't one or two money keywords carrying all the anchor text.
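If it helps, a quick way to eyeball your anchor text distribution is something like this (a rough sketch, assuming you export your inbound links to a CSV with an "anchor_text" column; the file name and column name are hypothetical, so match them to whatever your link tool exports):

```python
# Rough sketch: tally anchor text from an exported backlink CSV and show
# how concentrated the distribution is. File/column names are hypothetical.
import csv
from collections import Counter

anchors = Counter()
with open("inbound_links.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row["anchor_text"].strip().lower()] += 1

total = sum(anchors.values())
for anchor, count in anchors.most_common(20):
    print(f"{count:6d}  {count / total:6.1%}  {anchor}")
```

If one or two keywords account for most of the total, that's the concentration I'm talking about.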
Also, I would remove any inbound links from questionable directories. Once you do that, I would think you should see some significant gains in rankings.
-
Thanks! So it is clear that Google doesn't care about individual good-quality pages with good-quality links.
A good quality page needs a quality site to back it up.
Is that the criterion?
It sounds reasonable to me, but very difficult to repair.
Just for the record, my site isn't trash or low quality, but it is an old site and has some quirks from earlier times: lots of directory entries with little content, and datafeeds that used to work very well some years ago.
-
The trashy part of your site affects the site as a whole, not just the trash pages themselves. If Google penalized only those pages, then you would still benefit from using trash to promote your good pages.
Now, from what I understand about penalties, there are two kinds: a manual penalty and an algorithmic (or natural) penalty.
The algorithmic penalty can be fixed fairly easily by addressing the underlying issue, which in your case would be duplicate content. Clean up all the duplicate content and you will be on your way to flying under the penalty radar, so to speak. However, you will still need to add more quality content to make up for the removed or cleaned-up duplicate content.
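If you want to find the duplicate pages systematically before cleaning them up, something like this can help (a rough sketch, assuming you have a local copy of the site's HTML; the "site" directory is hypothetical):

```python
# Rough sketch: group pages whose visible text is identical by hashing
# the normalized body text. The "site" directory is a hypothetical local mirror.
import hashlib
from collections import defaultdict
from pathlib import Path

from bs4 import BeautifulSoup  # pip install beautifulsoup4

groups = defaultdict(list)
for page in Path("site").rglob("*.html"):
    soup = BeautifulSoup(page.read_text(encoding="utf-8"), "html.parser")
    text = " ".join(soup.get_text().split()).lower()  # collapse whitespace
    groups[hashlib.sha1(text.encode("utf-8")).hexdigest()].append(page)

for pages in groups.values():
    if len(pages) > 1:
        print("Duplicate text across:", *pages, sep="\n  ")
```

This only catches exact duplicates; near-duplicates (like lightly reworded datafeed descriptions) would need a fuzzier comparison.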
Once that takes place, you should see your ranking drop stabilize and, over time, begin to crawl back up. This would be a good time to implement other strategies like social media and quality link building.
Now, if it's a manual penalty, you need to clean up all the duplicate content, ask for a manual review, and pray. Manual penalties are hard to overcome and will require much more work. Sometimes it's best to just start from scratch with a new domain.
Hope this helps some.