I think Panda was a conspiracy.
-
It's just a theory, but I think that Panda was not really an algorithm update but rather a conspiracy.
Google went out of their way to announce that a new algorithm was being rolled out. The word on the street was that content farms would be affected. Low quality sites would be affected. Scrapers would be affected. So, everyone with decent sites sat back and said, "Ah...this will be good...my rankings will increase."
And then, the word started coming in that some really good sites took a massive hit. We've got a lot of theories on what could be causing the hit, but there doesn't seem to be an obvious fix.
Many of the key factors that have been suggested as causes of a site looking bad in Panda's eyes are present on one of my sites, yet that site actually increased in rankings after Panda.
So, this is my theory: I think that Google made some random changes that made no sense. They made changes that would cause some scraper sites to go down but they also knew that many decent sites would decline as well.
Why would they do this? The result is fantastic in Google's eyes. They have the whole world of web design doing all they can to create the BEST quality site possible. People are removing duplicate content, reducing ad clutter and generally polishing their sites. And this is the goal of Larry Page and Sergey Brin...to make it so that Google gives the user the BEST possible sites to match their query.
I think that a month or so from now there will be a sudden shift in the algo again and many of those decent sites will have their good rankings back again. The site owners will think it's because they put hard work into creating good quality, so they will be happy. And Google will be happy because the web is a better place.
What do you think?
-
hahahaha. I agree with you. First Google manipulates results to see if Bing is copying. Soon after, this update arrives while Google is busy creating its own content sites optimized for the robots, yet the rest of us can't so much as think about duplicate content. hehehehe
-
Actually I really liked the "content registry" idea.
A library of content where you could register what you have created and, optionally, a link to where you want to be considered the main source.
At least it would be 10x more useful than the Google Knol idea...
-
I would pay a fee to protect my best content in the Google SERPs.
-
Actually you have a point there... if JCPenney was indeed the catalyst (which I could easily imagine it being) then the time between that and the update would surely mean it had to have been rushed. I never considered that before.
-
ha ha... I think they did rush this out.... they were quickly trying to pull up their pants after getting embarrassed by the JCPenney problem... they needed to bust a few heads quickly...
-
Ha ha, maybe
I think it's something infinitely less planned out and they simply rushed this change out the door without understanding fully what it would do to the SERPs.
Although I do think you're right that in a few months (in what will be claimed to be a second Panda sweep) that things will go back and only the very worst offenders will stay penalised.
-
Yes, I like the content registry idea! It would probably need to be a paid service though, and to cover legitimate duplicates they could allow them as long as they reference back to the source in the registry (for news, quotations, etc., where dupes can't be avoided).
-
Interesting ideas. Thanks for sharing them.
I think that Google is talking a lot about this as a "quality website update"... and that is getting them attention in the media but it is also kicking a lot of webmasters in the butt to clean up their websites.
I think that Google should make a "content registry" where I can submit my content and say "this is mine," and then copies or spins of that content will not get traction in the SERPs.
And I think that they should take a closer look at websites in the AdSense program, because the ability to monetize crap and theft is driving a lot of the bad odor in the SERPs.
-
Haha I like it!!
Well, if it's not what happened, they'll wish they thought of it anyway lol
My view on why other sites got hit is just that they had at least some links coming from sites that got hit... i.e. a site has 100 backlinks, 10 are from articles on article sites, the article sites get hit... it loses 10 backlinks (or at least some of the value of those backlinks)... hence, the good site takes a hit too.
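That theory is easy to put numbers on. Here's a toy model — every number in it is invented for illustration — of a "good" page whose aggregate link value drops only because some of its linking sites were devalued:

```javascript
// Toy illustration of the backlink theory above: if some of a page's
// backlinks come from sites that Panda devalued, its total link value
// drops even though the page itself was never targeted.
// All numbers are made up; assume a devalued source passes half its value.

function linkValue(backlinks) {
  return backlinks.reduce(
    (sum, link) => sum + link.value * (link.sourceHit ? 0.5 : 1.0),
    0
  );
}

// 100 backlinks: 90 from untouched sites, 10 from hit article sites.
const backlinks = [];
for (let i = 0; i < 90; i++) backlinks.push({ value: 1, sourceHit: false });
for (let i = 0; i < 10; i++) backlinks.push({ value: 1, sourceHit: true });

const before = backlinks.length;    // 100 units of link value before Panda
const after = linkValue(backlinks); // 95 units after -- a 5% hit, no penalty needed
```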
Related Questions
-
Panda...Should I consolidate...Like this...
I'm torn. Many of our 'niche' ecommerce products rank ok, however I'm concerned that duplicate content is negatively affecting our overall rankings via the Panda algo. Here is an example that can be found across quite a few products on the site. This sub-category page (http://www.ledsupply.com/buckblock-constant-current-led-drivers) in our 'led drivers' --> 'luxdrive drivers' section has three products that are virtually identical, with much of the same content on each page, except for their 'output current' - sort of like a shirt selling in different size attributes: S, M, L and XL. I could realistically condense 44 product pages (similar to the example above) down to 13 within this sub-category section alone (http://www.ledsupply.com/luxdrive-constant-current-led-drivers). Again, we sell many of these products and rank ok for them, but given the outline of how Panda works I believe this structure could be compromising our overall Panda 'quality score', consequently keeping our traffic from increasing. Has anyone had similar issues and found that it's worth the risk to condense product pages by adding attributes? If so, do I make the new pages and just 301 all the old URLs, or is there a better way?
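If you do consolidate, the usual pattern (assuming an Apache host here; the variant paths below are invented for illustration, not real LEDSupply URLs) is one permanent redirect per retired variant URL pointing at the consolidated page:

```apache
# In .htaccess (Apache, mod_alias): send each retired variant page
# to the consolidated product page with a permanent redirect.
# The variant paths are hypothetical examples.
Redirect 301 /buckblock-350ma-led-driver /buckblock-constant-current-led-drivers
Redirect 301 /buckblock-700ma-led-driver /buckblock-constant-current-led-drivers
Redirect 301 /buckblock-1000ma-led-driver /buckblock-constant-current-led-drivers
```

The 301 passes most link equity to the surviving page, which is why it's generally preferred over simply deleting the old URLs.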
Algorithm Updates | saultienut
-
Help for a webstore with Google Warnings for Watermark Images and Panda
I have not had too much experience with helping websites that have been hit by Panda - any tried and tested formulas I can pass to the website owner would be great. He does not want to reveal the domain name - it's in the area of children/baby products. "Our web site featured on page 1 of Google search results for many years (website 5 years old - Australian domain). In April/May 2014, Google suspended our Google Shopping account because we used watermarks on all our images. We were advised that the suspension would remain in place indefinitely, or until such time as the watermarks were removed. We wrote back to Google to explain that these watermarks were put in place by our store back in 2005 with the sole purpose of protecting our intellectual property. Needless to say, their attitude was unwavering. And as a result, revenue plummeted. However, the perfect storm was about to hit our store without warning. In the same month, Panda 4.0 was unleashed and our store was hit once again. This update alone reduced visitor numbers by around 50% overnight. The Panda 4.0 algorithm update was designed to target poor quality, duplicate content, and unfortunately we had some of it. We have now begun creating original content for many of the new products we're uploading onto our web site. It's slow and tedious. We have modified our web site to now include a tag on the home page (this was missing). We have removed many duplicate links from our footer (it was too big and contained hundreds of links that were also repeated from the header). We introduced a blog, and we have engaged the services of a local SEO company to disavow any bad backlinks and add missing (or improve existing) content on category and brand pages. No improvement in our situation is yet visible, and with Christmas just 3 months away, poor sales during our 'bread and butter' period will mean even tougher times for our store in 2015."
ANY PANDA EXPERTS who can help please email me [email protected] - looking for independent freelancers rather than agencies
Algorithm Updates | GardenBeet
-
Panda 4.0 Suggestions
My site was hit pretty negatively by Panda 4.0 and I am at a loss for the best way to address it. I have read just about every article that I can, and I know there are some duplicate manufacturer product descriptions, but I don't hear many other ecoms complaining about Panda, so I figure it must be something else. Also, the pages that seem most negatively affected are category and product list pages. Any help or suggestions would be much appreciated. Thanks! http://bit.ly/1plgOzM
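One common mitigation for duplicated manufacturer descriptions — offered here as a general suggestion, not a diagnosis of this particular site — is to pick one preferred version of each near-identical page and point the others at it with a canonical tag. The URL below is a placeholder:

```html
<!-- On each near-duplicate variant page, a canonical tag tells Google
     which copy to treat as the original. Hypothetical example URL: -->
<link rel="canonical" href="http://www.example.com/widgets/preferred-product-page" />
```

Unlike a 301, the variant pages stay visible to shoppers; only the search engines consolidate them.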
Algorithm Updates | Gordian
-
Do Panda/Penguin algorithm updates hit websites or just webpages?
If I have a website that has been affected by the Panda/Penguin updates, do bad links affect the entire site or just the page(s) the bad links point to? If it is the latter, and Penguin/Panda actually affect webpages, not websites (as is the common conception), then wouldn't simply creating a new URL, targeting this new URL, shifting meta tags and restarting link-building efforts (this time using the right quality strategies) be a really common-sense approach, instead of the tedious disavow approach that so many go down?
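For what it's worth, the disavow route mentioned here is just a plain-text file uploaded in Search Console: one URL or domain per line, `#` lines as comments, and a `domain:` prefix to drop everything from a domain. The domains below are placeholders:

```
# Disavow file (plain text, uploaded in Google Search Console).
# Lines starting with # are comments. Placeholder domains:
http://spammy-links.example/page-linking-to-me.html
domain:link-farm.example
```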
Algorithm Updates | Gavo
-
Post Penguin & Panda updates: what would be good SEO strategies for brand new sites?
Hi there. I have the luxury of launching a few sites after the Penguin and Panda updates, so I can start from scratch and hopefully do it right. I will get SEO companies to help me with this, so I just want to ask for advice on what would be good strategies for a brand new site. My understanding of the new updates is this: content and user experience are important, like how long people spend on the site, how many pages they view, etc.; social media is important - we intend to engage FB and Twitter a lot (in New Zealand, not too many people use Google+, so we will probably just concentrate on the first two); we will try to get people to share our website via social media, as apparently that is important; and we should only concentrate on high quality backlinks with a good diverse set of alt tags, but concentrate on branding rather than keywords. Am I correct to say that so far? If that is the principle, what would be the strategy to implement these goals? Links to any articles would also be great please. Love learning. I just want to do this right and hopefully future-proof the sites against updates as much as possible. I guess quality content and links will most likely be safe. Thank you for your help.
Algorithm Updates | btrinh
-
Can AJAX implementation affect the rankings in Google Panda?
Hi there, I have the following situation with one of our job sites. We migrated the site to a new application, which is better from a design point of view and also usability. For this we use a lot of AJAX, especially in searches. So every time a user filters down their search, new results are shown on the page, at the same URL and with no page load. But having this implementation affected our metrics: bounce rate increased from 38% to nearly 60%, PI/visit is now half, at 3, and avg. time on site is half what it used to be, coming to 2.5 min from nearly 6 min. From Rand's post, it is clear that content is very important in Google Panda, and we should consider all of these parameters, as they signal the quality of the content. So, my question would be: can this site be hit by Panda updates (maybe later on) because bounce rate, PI/visits and avg. time on site decreased in such a way? At the moment we don't measure the AJAX impressions, but as I understand we can do that through virtual pages in GA - does anyone have experience with how to handle this? Won't this be an artificial increase? Thanks, Irina
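One common way to do the virtual-pages part — assuming the classic async GA snippet (ga.js) rather than anything specific to this site, and with a made-up `/virtual/...` path convention — is to push a `_trackPageview` with a fake path every time the AJAX search re-renders:

```javascript
// Classic async Google Analytics (ga.js): record each AJAX filter
// change as a virtual pageview so it counts toward pages/visit and
// time on site. The '/virtual/...' path is just an example convention.
var _gaq = _gaq || []; // ga.js drains this command queue once it loads

function trackFilterChange(filters) {
  // Encode the active filters into a fake path,
  // e.g. /virtual/search?city=london&type=full-time
  var query = Object.keys(filters)
    .map(function (k) { return k + '=' + encodeURIComponent(filters[k]); })
    .join('&');
  _gaq.push(['_trackPageview', '/virtual/search?' + query]);
}

// Call this from the AJAX success handler after rendering new results:
trackFilterChange({ city: 'london', type: 'full-time' });
```

On the "artificial increase" worry: yes, these virtual pageviews will by definition raise pages/visit and lower bounce rate, so many people keep an unfiltered profile alongside one that excludes the `/virtual/` prefix to preserve comparable numbers.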
Algorithm Updates | InformMedia
-
Yet another Panda question
Hi Guys, I'm just looking for confirmation on something..... In the wake of Panda 2.2, one of my pages has plummeted in the rankings whilst other similar pages have seen healthy improvements. Am I correct in thinking that Panda affects individual pages and doesn't tar an entire site with the same brush? Really I'm trying to see if Panda is the reason for the drop on one page or whether it could be something else. The page in question has dropped 130 positions - not just a general fluctuation. Thanks in advance for your responses!!!
Algorithm Updates | A_Q
-
Was Panda applied at sub-domain or root-domain level?
Does anyone have any case studies or examples of sites where a specific sub-domain was hit by Panda while other sub-domains were fine? What's the general consensus on whether this was applied at the sub-domain or root-domain level? My thinking is that Google already knows broadly whether a "site" is a root-domain (e.g. SEOmoz) or a sub-domain (e.g. tumblr) and that they use this logic when rolling out Panda. I'd love to hear your thoughts and opinions though.
Algorithm Updates | TomCritchlow