I think this website has been hit by Panda, but I would appreciate your opinion
-
I've been asked to check a possible SEO problem with a website that has been losing organic traffic for more than 2 years. I have attached a screen capture from Analytics showing the impact on organic traffic.
This website publishes over 15 articles per week, and 12 of them are news items with fewer than 150 words. I think Panda may be hitting the website because of this practice. You can check the website: crazyminds.es
I would like to know your opinion on the cause of this loss of organic traffic.
On January 21st, 2013 they changed the website design, but the loss of traffic seems to have started before that date.
If Panda is hitting the website, what would be the best way to correct the situation? They have now begun to write news articles with more than 200 words, but what happens with the old news? Maybe a noindex tag? Blocking them in robots.txt? How should they manage those?
Thank you!
-
Here's more info on blocking JavaScript and the effects it can have on rankings:
https://yoast.com/google-panda-robots-css-js/
It's certainly possible that it's affecting the site, but there could be other issues as well.
Regarding the news articles, the number of words is not what matters most but whether these are useful to people. If most of these have very low user engagement, then they could possibly cause Panda to affect the whole site.
-
Hi Danny,
Google Search Console shows this:
- 14527 indexed pages
- 19 blocked resources (JS files; wp-includes is blocked by robots.txt)
Errors, desktop:
- 3 server errors
- 1 soft 404 error
- 51 404 errors
Errors, news:
- 4 errors: article fragmented and article too short
sitemap.xml (146 warnings):
- 14439 URLs sent
- 13886 URLs indexed
No security problems.
No other issues. I would like to know why you say it comes from a technical problem... I think there must be something that is making you think that way. I would appreciate your opinion on this issue. Thank you so much!
-
Hi Marie,
Thanks for your answer.
The news they are publishing is created by them, not just copied and pasted from other sites; it's 100% original, but those articles are just 3 or 4 paragraphs long. I have checked that there are 19 blocked resources across 3185 pages (JS files, as you said). I can tell them to let Googlebot crawl the wp-includes directory to avoid this issue. Do you really think this can cause such a big effect?
And finally, the robots.txt file also has a Crawl-delay: 30. Search Console shows that as a warning. What do you think, should we remove it? It shouldn't be an issue... but you never know...
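A quick way to check what a robots.txt actually blocks is Python's standard-library parser. This is just a sketch: the rules below are illustrative, mirroring what's described in this thread (a blocked wp-includes directory and a Crawl-delay of 30), not the site's actual file.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring the situation described above.
robots_txt = """
User-agent: *
Crawl-delay: 30
Disallow: /wp-includes/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Is a script under wp-includes fetchable by Googlebot? (It isn't, with these rules.)
blocked_js = rp.can_fetch("Googlebot", "https://example.com/wp-includes/js/jquery.js")

# The declared crawl delay; Googlebot ignores Crawl-delay, but other crawlers may honor it.
delay = rp.crawl_delay("Googlebot")
```

Running a check like this against the live file (via `RobotFileParser.set_url()` and `read()`) would confirm exactly which JS resources Googlebot cannot fetch.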
Thank you so much!
-
I just noticed one other thing that could potentially be an issue. You've got wp-includes blocked by robots.txt. If this is where your JavaScript lives, then it's a good idea to actually allow Google to crawl this directory.
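As an illustration, assuming a typical WordPress robots.txt (the site's real file may differ), the fix could be as simple as adding an Allow rule. For Googlebot the most specific (longest) matching path wins, so the Allow below takes precedence over the broader Disallow for files under it:

```
User-agent: *
Allow: /wp-includes/js/
Disallow: /wp-includes/
```

Removing the `Disallow: /wp-includes/` line entirely would also work if there's no reason to keep the rest of the directory blocked.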
-
I took a quick look at this domain.
While there do seem to be a few duplicate content issues, my hunch is that it is more likely due to some technical issues.
I would first take a look in Google Search Console and check whether Google is reporting any crawl, sitemap, or robots.txt issues.
If that doesn't show any issues, then you may need to get a full technical audit from a reputable SEO.
-
This is a question that would take a few hours to answer properly. Here are my brief thoughts.
What you've described with the short news articles sounds like Panda fodder for sure to me. If they're republishing news stories without adding value of their own, then this is not a good idea. Too many of these can cause Panda to demote the entire site (not just the news articles). In most cases I'd noindex content like this. Sometimes you can consolidate several pieces into one thorough article, but usually either noindexing or removing the content is best.
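For reference, noindexing is done with a robots meta tag in each thin article's head (most WordPress SEO plugins can set this per post). One caveat worth noting for the earlier question about robots.txt: the page must remain crawlable, because if robots.txt blocks it, Google never sees the noindex tag.

```html
<!-- In the <head> of each thin news article -->
<meta name="robots" content="noindex, follow">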
Also, it's not the number of words that matters. If they decided to write 1000 word articles that were essentially repeating what could be found on other news sites this would also be thin content. If they produce content that they want to have in the Google index, then Google has to have a reason to send readers to their content rather than to the hundreds of other people who wrote about the same news story.