I think this website has been hit by Panda, but I would appreciate your opinion
-
I've been asked to check a possible SEO problem with a website that has been losing organic traffic for more than two years. I have attached a screen capture from Analytics showing the impact on organic traffic.
This website publishes over 15 articles per week, and 12 of them are news items of fewer than 150 words. I think Panda may be hitting the website because of this practice. You can check the website: crazyminds.es
I would like to know your opinion about the cause of this loss of organic traffic.
On January 21st, 2013 they changed the website design, but the loss of traffic seems to have started before that date.
If Panda is hitting the website, what would be the best way to correct the situation? They have now begun to write news items of more than 200 words, but what happens with the old news? Maybe a noindex tag? Blocking them in robots.txt? How should they manage those?
Thank you!
-
Here's more info on blocking JavaScript and the effects it can have on rankings:
https://yoast.com/google-panda-robots-css-js/
It's certainly possible that it's affecting the site, but there could be other issues as well.
Regarding the news articles, it's not the number of words that matters most, but whether the articles are useful to people. If most of them have very low user engagement, they could cause Panda to affect the whole site.
-
Hi Danny,
Google Search Console shows this:
- 14,527 indexed pages
- 19 resources blocked (JS files; wp-includes is being blocked by robots.txt)
Errors, desktop:
- 3 server errors
- 1 soft-404 error
- 51 404 errors
Errors, news:
- 4 errors: "article fragmented" and "article too short"
sitemap.xml (146 warnings):
- 14,439 URLs sent
- 13,886 URLs indexed
No security problems.
No other issues. I would like to know why you say it comes from a technical problem... I think there must be something making you think that way. I would appreciate your opinion on this issue. Thank you so much!
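As a quick cross-check on figures like these, a short script can count the URLs a sitemap actually contains and compute the indexation rate. This is only a sketch: it assumes a standard sitemap.xml, and the numbers in the example are the Search Console figures quoted above.

```python
import xml.etree.ElementTree as ET

# Standard sitemap namespace per the sitemaps.org protocol.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def count_sitemap_urls(sitemap_xml: str) -> int:
    """Count <loc> entries in a standard sitemap.xml document."""
    root = ET.fromstring(sitemap_xml)
    return len(root.findall("sm:url/sm:loc", NS))

def indexation_rate(submitted: int, indexed: int) -> float:
    """Share of submitted URLs that Google reports as indexed."""
    return indexed / submitted if submitted else 0.0

# With the Search Console figures quoted above:
rate = indexation_rate(14439, 13886)  # ~0.96, i.e. most submitted URLs are indexed
```

A 96% indexation rate by itself doesn't point to a crawl problem; it mainly helps rule one out.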
-
Hi Marie,
Thanks for your answer.
The news items they publish are created by them. They are not just copied and pasted from other sites; the content is 100% original, but each item is only 3 or 4 paragraphs long. I have checked that the 19 blocked resources (the JS files you mentioned) appear on 3,185 pages. I can tell them to let Googlebot crawl the wp-includes directory to avoid this issue. Do you really think this can have such a big effect?
And finally, the robots.txt file also has a Crawl-delay: 30 directive. Search Console shows it as a warning. What do you think, should we remove it? It shouldn't be an issue... but you never know...
Thank you so much!
-
I just noticed one other thing that could potentially be an issue. You've got wp-includes blocked by robots.txt. If this is where your JavaScript lives, then it's a good idea to allow Google to crawl this directory.
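For what it's worth, one way to do that without opening up all of wp-includes is a more specific Allow rule. This is only a sketch of what the change might look like, not the site's actual robots.txt:

```
User-agent: *
Disallow: /wp-includes/
Allow: /wp-includes/*.js
Allow: /wp-includes/*.css

# If the Crawl-delay: 30 line mentioned earlier is still present,
# consider removing it; Googlebot ignores Crawl-delay, but a
# 30-second delay will throttle other crawlers severely.
```

Google honors the most specific matching rule, so the Allow lines for .js and .css files win over the broader Disallow.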
-
I took a quick look at this domain.
While there do seem to be a few duplicate-content issues, my hunch is that the traffic loss is more likely due to some technical issues.
I would first go take a look in Google Search Console and check whether Google is reporting any crawl, sitemap or robots.txt issues.
If that doesn't show any issues, then you may need to get a full technical audit from a reputable SEO.
-
This is a question that would take a few hours to answer properly. Here are my brief thoughts.
What you've described with the short news articles sounds like Panda fodder to me, for sure. If they're republishing news stories without adding value of their own, then this is not a good idea. Too many of these can cause Panda to demote the entire site (not just the news articles). In most cases I'd noindex content like this. Sometimes you can consolidate several thin pages into one thorough piece of content, but usually either noindexing or removing the content is best.
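For reference, a page-level noindex is a meta tag in the page's head (most WordPress SEO plugins can set this per post, so hand-editing templates is usually unnecessary):

```html
<meta name="robots" content="noindex, follow">
```

`noindex, follow` keeps the page out of the index while still letting crawlers follow its links; `noindex, nofollow` would stop link discovery from the page as well.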
Also, it's not the number of words that matters. If they decided to write 1,000-word articles that essentially repeated what could be found on other news sites, that would also be thin content. If they produce content that they want to have in the Google index, then Google has to have a reason to send readers to their content rather than to the hundreds of other people who wrote about the same news story.
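If they do decide to triage the archive, a rough first pass might simply shortlist short posts for human review. This is a sketch under assumptions: the 200-word cutoff is arbitrary, and how you obtain the (url, body_text) pairs (e.g. from a WordPress export) is left out.

```python
def flag_thin_posts(posts, min_words=200):
    """Return URLs of posts whose body falls under a word threshold.

    `posts` is an iterable of (url, body_text) pairs. Word count alone
    does not determine thinness (a 1,000-word rewrite of someone else's
    story is still thin), so this only shortlists pages for manual review.
    """
    return [url for url, body in posts if len(body.split()) < min_words]


# Hypothetical usage: flag news items shorter than 200 words.
sample = [
    ("/news/short-item", "word " * 120),
    ("/news/long-feature", "word " * 450),
]
thin = flag_thin_posts(sample)  # -> ["/news/short-item"]
```

From the shortlist, a human still has to decide per page whether to noindex, consolidate, or improve.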