Penguin Update Issues: What would you recommend?
-
Hi,
We've been hit pretty badly by this Penguin update. Site traffic is down 40-50%.
We suspect it's for a couple of reasons:
1) Google is saying we have duplicate content. For a given category we will have 4-5 pages of content (products), so it's flagging pagenum=2, pagenum=3, etc. as duplicate pages. We've implemented rel=canonical so that pagenum=2 points to the original category, e.g. http://mydomain/widgets.aspx.
We've even specified pagenum as a URL parameter that paginates. Google still hasn't picked up these changes. How long does it take? It's been about a week.
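For reference, this is roughly the tag we added to the paginated pages (a sketch; the exact URL shape with ?pagenum= is illustrative):

```html
<!-- In the <head> of http://mydomain/widgets.aspx?pagenum=2 -->
<link rel="canonical" href="http://mydomain/widgets.aspx" />
```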
2) They're saying we have soft 404 errors. When we remove a category or product, we point users to a category or a "page not found" page. Is it best to block Googlebot from crawling these pages by specifying them in robots.txt? We really don't care about these removed categories or product pages. How best to handle this?
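To illustrate what that robots.txt blocking would look like, here's a hypothetical rule (the /removed-categories/ path is made up), sanity-checked with Python's standard library:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rule blocking removed category pages.
# (Note: this only stops crawling; it doesn't fix the soft-404 status issue.)
rules = [
    "User-agent: *",
    "Disallow: /removed-categories/",
]

rp = RobotFileParser()
rp.parse(rules)

# A blocked (hypothetical) path:
print(rp.can_fetch("Googlebot", "http://mydomain/removed-categories/old-widgets.aspx"))  # False
# A normal category page stays crawlable:
print(rp.can_fetch("Googlebot", "http://mydomain/widgets.aspx"))  # True
```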
3) There are some bad directories and crawlers that have crawled our website but published incorrect links, so we've got around 1,700 "product not found" errors. I'm sure that's taking up a lot of crawl time. How do we tell Google not to bother with these links coming from specific sources, e.g. ignore all links coming from xxx.com?
Any help will be much appreciated, as this is killing our business.
Jay
-
Hey Ben,
Thank you so much for your response.
I'm pretty sure it was the Penguin update that brought our rankings down.
We don't participate in any paid linking, no blog networks etc.
The only thing we did was submit to article directories, which I understand are frowned upon now, so we'll move away from that.
We'll try to get all the non-existent pages to return 404 codes, clear up any duplicate page title and page content errors, and hope that we'll get back into Google's good graces.
-
Hi Jay,
Sorry to hear it's hurting your business so much.
Have you double checked the dates of your decrease in traffic against the Penguin update? There were a lot of big changes going on around that time so it's worth being sure it was Penguin.
In answer to question 3 - If they're external sites then I don't think those 1700 404s are having a negative effect on your SEO. If those directories are hurting you at all through the Penguin update then it would be through over-optimised anchor text (although I haven't seen any definitive data on this).
In answer to question 2 - Would I be right in thinking that you're using a 301 or a 302 to send users to a generic error page? However you're generating the soft 404s, the best fix is to make them real 404 errors so the server returns a 404 status code. The details of setting up a custom 404 page are pretty well documented around the web, so you shouldn't have much trouble with it.
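Since your URLs are .aspx, a minimal sketch of what that could look like in an ASP.NET web.config (this assumes .NET 3.5 SP1 or later for ResponseRewrite, and the error page name is hypothetical):

```xml
<!-- web.config: serve the custom error page in place, without a redirect -->
<system.web>
  <customErrors mode="On" redirectMode="ResponseRewrite">
    <error statusCode="404" redirect="~/NotFound.aspx" />
  </customErrors>
</system.web>
```

One gotcha to watch for: with ResponseRewrite, the error page itself should also set Response.StatusCode to 404 in its code-behind; otherwise it can still return a 200, which is exactly the soft-404 pattern you're trying to get rid of.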
In answer to question 1 - Have you tried checking whether Google has re-cached your pages since the change? It's also probably worth looking at rel=prev/rel=next markup. Maile Ohye from Google has released a pretty comprehensive video on the topic of pagination and SEO, so I'd recommend checking that out.
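As a rough sketch (URLs illustrative, based on your pagenum parameter), the pagination markup looks like this on page 2 of a series:

```html
<!-- In the <head> of page 2, e.g. http://mydomain/widgets.aspx?pagenum=2 -->
<link rel="prev" href="http://mydomain/widgets.aspx" />
<link rel="next" href="http://mydomain/widgets.aspx?pagenum=3" />
```

The first page of the series gets only a rel=next tag, and the last page only a rel=prev.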