When do Panda ranking factors apply when Google deindexes a page
-
Here are two scenarios:
Scenario 1
Let's say I have a site with a ton of pages (100,000+) that all have off-site duplicate content, and let's say that those pages do not contain any noindex meta tags.
Google then decides to de-index all those pages because of the duplicate content issue and slaps me with a Panda penalty.
Since those pages are no longer indexed by Google, does the Panda penalty still apply even though they have been deindexed?
Scenario 2
I add a noindex meta tag to all those 100,000+ off-site duplicate content pages. Since Google sees that I have decided not to index them, does the Panda penalty come off?
What I am getting at is this: I have realized that I have a ton of pages with off-site duplicate content. Even though those pages are already not indexed by Google, does simply adding the noindex tag to them tell Google that I am trying to get rid of the duplicate content, so that they lift the Panda penalty?
The pages are useful to my users so I need them to stay.
Since in both scenarios the pages are not indexed anyway, will Google acknowledge the difference, in that I am removing them myself, and lift the Panda ban?
Hope this makes sense
-
I have over 800,000 pages total that contain duplicate content, "if" that is an issue with my definitions. I would assume that Panda would slap me hard for that, again "if" that is the issue. Since I have never tried to deindex this many pages, I am hoping this works, and I will take a few coffee breaks waiting because it's going to be a while, lol.
I have nothing to lose and I feel like I have tried a ton. Thanks so much
-
"Google then decides to de-index all those pages because of the duplicate content issue and slaps me with a Panda penalty."
Panda will not deindex pages. It might move them to the supplemental index, but they're not deindexed. Technically, Panda is not a penalty. It's an algorithmic demotion. If you've got a bunch of duplicate content, Google may choose not to index some of that content, or, more likely, to just show users the most appropriate page of that content.
Now, if Panda has affected your site because Google feels that the site consists mostly of duplicate or thin content, then you'll need to noindex, significantly change, or remove that content in order for Google to see that the quality has improved. You can't say that the content is essentially gone just because Google is not showing it; that wouldn't change the factors that caused you to be affected by Panda. (This is assuming that this is what the problem is, because we don't know that.)
"I add a rel="noindex" to all those 100,000+ off site duplicate content pages. Since Google sees that I have decided to not index them does the Panda penalty come off?"
If these pages were the primary reason for Panda to visit your site, then what would happen is that as Google recrawls your site they will start to recognize that the quality is improved. Then, at some point with a future Panda refresh (it may take several if there is a lot of content to crawl), you should see an increase in traffic. If the duplication was the only factor that Panda was concerned about then you'd likely see a dramatic improvement. If it was just one of the factors, you might see a smaller improvement. If you had a lot of factors, you may see very little or just some improvement.
If I understand the question right, I would say that the answer is to go ahead and add the noindex tag to these pages.
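One terminology note: there is no `rel="noindex"` attribute. The directive is a robots meta tag in the page `<head>`, e.g. `<meta name="robots" content="noindex, follow">`. A minimal stdlib-only sketch for verifying that a page's HTML actually carries the tag (the sample page markup is made up for illustration):

```python
# Sketch: detect a robots meta noindex directive in a page's HTML
# using only Python's standard library.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag.lower() != "meta":
            return
        attr_map = {k.lower(): (v or "") for k, v in attrs}
        if attr_map.get("name", "").lower() == "robots":
            directives = [d.strip().lower() for d in attr_map.get("content", "").split(",")]
            if "noindex" in directives:
                self.noindex = True

def has_noindex(html: str) -> bool:
    """True if the HTML contains <meta name="robots" content="...noindex...">."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex

page = '<html><head><meta name="robots" content="noindex, follow"></head><body>...</body></html>'
print(has_noindex(page))  # True
```

Using `noindex, follow` rather than plain `noindex` tells Google to drop the page from the index while still following its links, and the pages remain fully visible to visitors, which fits the requirement that they stay available for users.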
-
No offense, man. I really want to figure out what the heck happened with my site; I really feel like I was hit by unfortunate circumstances.
My website is http://www.freescrabbledictionary.com/
The duplicate content I am referring to is the word definitions I generate from an API provided by https://www.wordnik.com/
I do cite the source at the bottom of every definition page (which was required by https://github.com/wordnik); an example is http://www.freescrabbledictionary.com/dictionary/word/testing/
I have never had a manual penalty from Google, I check Google Webmaster Tools all the time. I also use tools like Google analytics as well as moz.com, ahrefs.com and monitorbacklinks.com.
I used to rank for the keyword "scrabble dictionary" in the top 4 spots on average. For a long time I was #2, and that was my biggest traffic keyword.
I remember that when the first Panda update came out I was not hit. I noticed the negative changes in my rankings after the second Panda update and onward. Since Penguin was in the mix as well, I can't even tell if I was hit with Penguin.
I never paid for or did black-hat backlinking.
Again, I was never hit with a manual penalty; this is 100% algorithmic.
If you search for the keyword "scrabble dictionary," my homepage does not rank for that keyword at all, not anywhere in the search results, where I used to hold the top 4 spots.
Since I have been hit so hard, I have nothing to lose, so I have noindexed 100% of the word definitions, sentence examples, and quotes. Even though those are not copied (except for the definitions), I did that just in case. This equates to about 90% of my site's pages indexed by Google.
I have changed my site design to account for the "refresh" ranking factor, I have desperately combed through my site 1,000,000 times trying to figure out what happened, and I have disavowed links tenfold, but nothing seems to affect my rankings. At this point I will try anything... I have nothing to lose.
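For noindexing generated pages at this scale, editing every page template is not the only option: Google honors the same directive sent as an `X-Robots-Tag` HTTP response header. A minimal sketch with Python's stdlib `http.server` (the `/dictionary/word/` path prefix is an assumption based on the example URL earlier in the thread):

```python
# Sketch: send "X-Robots-Tag: noindex" for a whole class of generated
# pages, instead of adding a meta tag to every template.
from http.server import BaseHTTPRequestHandler, HTTPServer

# Assumption: all definition pages share this URL prefix.
NOINDEX_PREFIXES = ("/dictionary/word/",)

def robots_header_for(path):
    """Return the X-Robots-Tag value for a path, or None to leave it indexable."""
    if path.startswith(NOINDEX_PREFIXES):
        return "noindex"
    return None

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        tag = robots_header_for(self.path)
        if tag:
            # Equivalent to <meta name="robots" content="noindex"> in the HTML
            self.send_header("X-Robots-Tag", tag)
        self.end_headers()
        self.wfile.write(b"<html><body>definition page</body></html>")
```

To run it locally you would pass `Handler` to `HTTPServer(("127.0.0.1", 8000), Handler)`; in production the same header is usually set in the web server or framework config. The pages stay fully available to users either way.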
-
Can you describe what happened to your site and why you believe you got a penalty?
Type site:www.example.com into Google; it will show you what is indexed.
Be certain that you do not have a robots.txt file (or something similar) blocking your website. Go to https://www.feedthebot.com/tools/ and type your domain in; it will tell you if your robots.txt is blocking anything. Do this for the URLs that you think are not indexed.
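Alongside that tool, Python's stdlib can run the same robots.txt check offline. A sketch (the rules and URLs here are made up for illustration):

```python
# Sketch: check whether a robots.txt would block Googlebot from a URL,
# using only Python's standard library.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "http://www.example.com/dictionary/word/testing/"))  # True
print(rp.can_fetch("Googlebot", "http://www.example.com/private/page"))              # False
```

One important interaction: a page that robots.txt blocks can never have its noindex tag seen, because Google cannot crawl it to read the tag. Pages you want deindexed via noindex must stay crawlable.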
If you were hit with a true manual penalty, you would receive word inside your Google Webmaster Tools account. If you do not have one, set one up.
https://www.google.com/webmasters/tools/home?hl=en
If you think you've been hit by an algorithmic penalty (not a manual one), you can check by using the tools listed at this URL:
http://www.iacquire.com/blog/5-tools-to-help-you-identify-a-google-slap
Now, obviously, you're talking about duplicate content which it seems you may have known existed somewhere else. Maybe (and please don't take offense) you copied it?
In that case, Google attributes the content to the domain with the most authority.
So time.com could probably take your entire site, and you would be the one who looked like you stole their content.
Remember, Google does also consider when the content was first indexed; however, site authority trumps that.
Google will only acknowledge a difference if you actually have a manual penalty. If you received a manual penalty, it would come with instructions on what to do next.
My advice to you: if you have duplicate content that is taken from another website and is not yours, remove the content; as a second choice, noindex that content.
It could be that you have the misfortune of somebody finding out that you took their content and filing a DMCA takedown, which in many cases damages a domain beyond repair. You would know if this occurred as well. I'm just letting you know that it's not smart to have someone else's content on your site; you should write content uniquely to meet your end users' needs. If the current content is very helpful to them, I recommend you use it to create your own unique content, not spinning it, but unique.
Please know that if you tell me you didn't take the content, I will apologize right away. I do not mean to imply anything.
respectfully,
Tom