eCommerce sites being "filtered" by the latest Panda update: ideas and discussion
-
Hello, fellow internet-goers!
Just as a disclaimer: I have been following a number of discussions, articles, posts, etc., trying to find a solution to this problem, but have yet to find anything conclusive. So I am reaching out to the community for help.
Before I get into the questions I would like to provide some background:
I help a team manage and improve a number of medium-to-large eCommerce websites. Traffic ranges anywhere from 2K to 12K+ visits per day, depending on the site. Back in March, one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings, and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site.
We have around ten niche sites in total, and about seven of them share an identical code base (roughly an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites that holds hundreds of thousands of category and product pages. Needless to say, April was definitely a frantic month for us. Many meetings later, we attributed the "filter" to duplicate content stemming from our product database and written content (shared across all of our sites). We decided to use rel="canonical" to address the problem. Exactly 30 days after being filtered, our first site bounced back (like it was never "filtered"); however, the other two sites remain "under the thumb" of Google.
Now for some questions:
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Is it a coincidence that it was an exact 30 day "filter"?
Why has only one site recovered?
-
Thanks for your responses.
@EGOL - I would agree that merging the sites would be ideal given that they share such a large database. Unfortunately, this isn't an option for our company at this point in time. Acquiring new content for our product pages has been tossed around, but it would be a HUGE undertaking, so it's on the "back burner" for the moment.
@Ben Fox - We came to the conclusion that it was content because it was the only clear "offender" on the list of potential problems. However, the fact that only 3 of our sites got penalized perplexes me as well. It would have made more sense had all of our sites suffered a penalty (luckily only 3 did). One response I got from another forum was: since Google filtered enough of the duplicate content (3 sites, in our case), it deemed the remaining sites "original".
We didn't point canonicals to any one site (like 9 going to 1). We only added rel=canonical to our manufacturer category pages (a small percentage of our pages). Since some of our domains sell products that aren't niche-specific, we pointed those pages' canonicals at their proper niche domain (hope that made sense).
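For anyone unfamiliar with the approach, this is roughly what a cross-domain canonical on one of those manufacturer category pages would look like (the domains and paths here are hypothetical, just to illustrate the pattern; this isn't our actual markup):

```html
<!-- On niche-site-b.example, a manufacturer category page whose products
     really "belong" to a different niche domain declares that domain's
     version as the preferred (canonical) URL: -->
<head>
  <link rel="canonical" href="https://niche-site-a.example/manufacturer/acme-widgets/">
</head>
```

The key point is that the canonical is chosen per page based on which niche domain the products properly belong to, rather than funneling all nine sites to a single primary domain.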
For discussion purposes, here is a response I got from another forum:
Why has only one site recovered?
> I suspect/assume the other sites will bounce back the same way after their own 30-day penalties expire.
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
> ????? Maybe removing the first site allowed the scoring penalty applied to the other sites to shrink in size. As each site was removed, the penalty applied to the others correspondingly shrunk. ?????
Is it a coincidence that it was an exact 30 day "filter"?
> No. 30 days is a common penalty.
Does anyone agree with these? I've heard of the 30-day penalty before. If this is the case, then a warning from Google would be nice.
-
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Google can be slow to detect duplicate content and sometimes tolerates it.
Is it a coincidence that it was an exact 30 day "filter"?
Only Google knows.
Why has only one site recovered?
Only Google knows.
Google sees a lot of sites with the same content, and you say that these are "med-large" sites. If I were Google, I would say: "These are duplicate content; we aren't going to index all of them. Our searchers don't want to see ten sites with the same stuff."
If these were my sites I would merge all of them into one single site. If the content on that site was unique to me I would probably then put all of my efforts into promotion and informative content for the product lines.
If the content was on other sites that I don't own then my efforts would go mainly into making unique product content and informative content for the product lines.
Google has been squashing duplicate content for years. If you have it, and you place links between the sites, it is very likely that at least one of your sites will be demoted in Google or filtered - probably filtered. They don't want to spend their resources indexing ten duplicate sites. They would rather display unique sites to their searchers.
-
How did you decide that it was content causing the issue if only 3/10 of your sites were affected?
Also when you added the rel=canonical did 9 of your sites point to a primary site and was this the site that recovered?