What's the extent of the penalization applied by Google?
-
Hi!
I still don't understand this website penalization applied by Google due to duplicate content.
My site (a Photoshop website) has many pages that ranked among the first positions for top keywords.
Those pages were linked to by sites like LifeHacker, BoingBoing, Microsiervos, SmashingMagazine, John Nack, and many other well-known blogs.
After mid-February 2012, everything went down the drain. I lost half of my traffic, and my well-ranked pages are now almost nowhere to be found.
I have plenty of ads in some pages of my site, and duplicate content (amazon product description only) in other pages of my site.
So the good-quality pages my site has are no longer considered good quality just because I have some duplicate content or ad-filled pages?
I'm not complaining. I'm trying to understand this.
Google needs to serve good information to its visitors. But since they found some trash on my site, they decided to remove both the trash and the good information from the search engine?
That doesn't sound logical to me. Why don't they just remove the trash and leave the good content?
Of course, I understand that information is added every day and someone may come up with something better than mine, but dropping 40 or more places in the rankings sounds more like a penalty to me.
Again, I'm not complaining (although it sounds like I am!), I just want to understand the reasons behind this.
Thanks,
Enrique
-
Yes, thanks Anthony. I will post back as soon (soon..?) as I find something.
Enrique
-
Sometimes what you call natural, Google calls spammy or unnatural. Just saying. Good luck. Post back with your findings. I'm interested to see how things work out for you. Best regards.
-
Yes, thanks, I will check that. I was planning to add nofollow to the Amazon pages, and I will also check the anchors, but since they are all natural, any change I make will look artificial.
Enrique
-
Have you tried removing the Amazon data feed from those pages, just to see if that is in fact what is impacting your rankings? What about the thousands of natural links pointing to your site? Are they all using varied anchor text, or is it just five anchors for your five main pages? If just five, that could also be affecting your ranking.
-
Yes, I know that's the thing to do, but you must agree with me that it's something unnatural.
I have thousands of incoming links, and I exchanged or asked for fewer than 20 of them. The rest are natural. If I spent time analyzing links, it would be something absolutely artificial.
The same goes for quality pages. Let's say that I have four or five pages that are the most referenced in my industry (just an example, of course). Visitors who read those pages get really good, top-class information. But I have an Amazon datafeed on my site.
Suddenly, the information on those top-quality pages is hidden from Google's search results because my site has an Amazon datafeed?
I know it's a simplistic example, but it can be translated as:
"A good article isn't good anymore just because of a site penalty"
It seems that Google is saying something like, "Hey, you can't read this amazing article because it is from a site that has lots of junk. So suck it up and read this article of lesser quality but from a pristine site!"
It is not about my site anymore, but about trying to understand the concept of it all. And of course it is an extreme example, but I think it is relevant.
-
No, Google does care about good-quality pages. It's just that if you throw in a bunch of bad pages, they dilute the goodness of your good pages. Once you clean up the duplicate content, I would suggest running a report on your inbound links. Check whether your anchor text is spammy or concentrated on only a few choice keywords. When it comes to link building, you want to spread out the keywords so there isn't just one or two money keywords carrying all the anchor text.
Also, I would remove any inbound links from questionable directories. Once you do that, I would think you should see some significant gains in rankings.
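For example, the anchor-text check described above could be sketched as a quick script (a minimal sketch; the sample link list and the 30% threshold are made-up assumptions, and the real anchor data would come from a backlink report export):

```python
from collections import Counter

def anchor_text_report(anchors, threshold=0.3):
    """Flag anchor texts that account for more than `threshold` of all
    inbound links, a possible sign of over-concentrated keywords."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: n / total for text, n in counts.items() if n / total > threshold}

# Hypothetical link profile: one money keyword dominates the anchors.
links = ["photoshop tutorials"] * 8 + ["www.example.com", "great guide"]
print(anchor_text_report(links))  # {'photoshop tutorials': 0.8}
```

If one phrase makes up most of your anchors, that is the kind of concentration worth diluting with more varied, natural-looking anchor text.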
-
Thanks! So it is clear that Google doesn't care about single, good-quality pages with good-quality links.
A good quality page needs a quality site to back it up.
Is that the criteria?
It sounds reasonable to me, but very difficult to repair.
Just for the record, my site isn't trash or low quality, but it is an old site and has some quirks from old times: lots of directory entries with little content, and datafeeds that used to work very well some years ago.
-
The trash part of your site affects the site as a whole, not specifically just the trash parts. If they did just that, then you would still benefit from using trash to promote your good pages.
Now, from what I understand about penalties, there is a manual penalty and an algorithmic or natural penalty.
The algo penalty can be easily fixed by addressing your penalty issue, which would be duplicate content. Clean up all duplicate content and you will be on your way to flying under the penalty radar so to speak. However, you will still need to add more quality content to make up for the removed or cleaned up duplicate content.
Once that takes place you should notice your ranking drop stabilize, and over time begin the crawl back up. This would be a good time to implement other strategies like social media and quality link building.
Now, if it's a manual penalty, then you need to clean up all duplicate content, ask for a manual review, and pray. Manual penalties are hard to overcome and will require much more work. Sometimes it's best to just start with a new domain from scratch.
Hope this helps some.