Panda, rankings and other non-sense issues
-
Hello everyone
I have a problem here. My website has been hit by Panda several times in the past: the first time back in 2011 (the very first Panda), a couple more times since then, and most recently in June 2016 (either Panda or Phantom, it's not clear yet). In other words, my website seems very prone to "quality" updates by big G:
http://www.virtualsheetmusic.com/
I am still trying to understand how to get rid of Panda-related issues once and for all after so many years of tweaking and cleaning my website of possible duplicate or thin content (301 redirects, noindexed pages, canonicals, etc.), and I have tried everything, believe me. You name it. We recovered several times, but once in a while we still get hit by that damn animal. It really looks like we are in the so-called "grey" area of Panda, where we are "randomly" hit by it every now and then.
Interestingly enough, some of our competitors live joyful lives at the top of the rankings without caring at all about Panda and such, and I can't really make sense of it.
Take for example this competitor of ours:
They have a much smaller catalog than ours, worse quality of offered music, thousands of duplicate pages, ads everywhere, and yet... they are able to rank 1st on the 1st page of Google for most of our keywords. And by most, I mean 99.99% of them.
Take for example "violin sheet music", "piano sheet music", "classical sheet music", "free sheet music", etc... they are always first.
As I said, they have a much smaller website than ours, with a much smaller offering, and their content quality is questionable (not curated by professional musicians, with sloppily done content and design), and yet they have over 480,000 pages indexed on Google, mostly duplicates. They don't bother with canonicals to avoid duplicate content, 301s, noindex, robots tags, etc., nor with adding text or user reviews to avoid "thin content" penalties... they really don't care about any of that, and yet they rank 1st.
So... to all the experts out there, my question is: why is that? What's the sense or logic behind it? And please, don't tell me they have stronger domain authority, more linking root domains, etc., because given the duplicate and thin content issues I see on that site, nothing can justify their positions in my opinion. Above all, I can't find a reason why we are so heavily penalized by Panda and similar "quality" updates when they are released, whereas websites like that one (8notes.com) rank 1st, making fun of the mighty Panda all year round.
Thoughts???!!!
-
Thank you very much Julie, I really appreciated your words. I have wondered so many times what Google thinks of "quality", and why there are always very low quality websites ranking above us that distribute the exact same music for free (often copyrighted music, which is illegal), and most of those sites are full of ads. Is that quality?
We could open a new discussion thread on the "What is quality to Google?" topic, I think it'd be very popular!
Thank you again.
-
Thank you Donna, glad to know that I am not completely mad!!
As for the fact that they have done a great job with title and alt tags and anchor texts, I agree, but you know what? That's another realm I became paranoid about: the so-called "over-optimization"... We used to have perfectly optimized titles, descriptions, H1s, ALTs, anchor text, etc... the whole enchilada perfectly optimized. Then we began to lose rankings for an unknown reason (Panda? Over-optimization? Too many pages? What else?), and I began to get paranoid about everything, so we started "de-optimizing" here and there... Here is additional proof that when things are NOT clear, we all become paranoid and lose control of everything.
I may also add that some of the blame should probably go to the SEO industry, which has spread a lot of fear about all this stuff without ever quantifying what "too much optimization" means... or too much duplicate content, too much thin content, too many bad links, etc. How much is "too much"? That's the question for which I'm afraid there's no easy answer, but maybe they scared us too much about all this.
Thank you again, and please, let me know if you have any more ideas.
-
I agree with your observations. I don't see why you'd suffer a Panda penalty and your competition wouldn't.
My only other observation (again, not Panda related) is that 8notes has made much more extensive use of link title and alt tags to reinforce their keywords and subject matter.
-
Oh, I also forgot to point out that when we have more than one version of the same title for the same instrument, we add a canonical to the first version.
For example, both items below:
http://www.virtualsheetmusic.com/score/HL-170563.html
http://www.virtualsheetmusic.com/score/HL-119157.html
Are canonicalized to this item:
http://www.virtualsheetmusic.com/score/HL-180308.html
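In case it helps to see the setup concretely, here is a minimal Python sketch of that variant-to-canonical mapping, using the example URLs above (the function and constant names are hypothetical, not our actual code):

```python
# Sketch: map duplicate score versions to the single canonical version,
# as described above. The URL mapping mirrors the example in this post.

# Variant product paths -> the canonical version they should point to
CANONICAL_MAP = {
    "/score/HL-170563.html": "/score/HL-180308.html",
    "/score/HL-119157.html": "/score/HL-180308.html",
}

BASE = "http://www.virtualsheetmusic.com"

def canonical_tag(path: str) -> str:
    """Return the <link rel="canonical"> tag a page should emit.

    Canonical pages (not in the map) simply point to themselves.
    """
    target = CANONICAL_MAP.get(path, path)
    return f'<link rel="canonical" href="{BASE}{target}" />'

print(canonical_tag("/score/HL-170563.html"))
# -> <link rel="canonical" href="http://www.virtualsheetmusic.com/score/HL-180308.html" />
```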
Thanks
-
Thank you Donna!
Yes, I am aware we have much more segmentation, but why on earth should that be bad? And, more importantly, why should it be worse than having much more actual duplicate content, as they do, considering they never use canonicals or noindex to avoid duplicate issues?
Take for example one of their instrumental pages like this one:
https://www.8notes.com/violin/sheet_music/?orderby=5d
You can order the results by title, artist, level, etc... they don't even bother to canonicalize the URL when parameters are added to it to avoid duplicate issues. Which do you think is worse for Panda? I am just asking; I'd really like to understand what could be worse.
They have similar issues with their product pages: if you click the tabs at the top, parameters get added to the URLs, but no canonical is present there. We do have canonicals, as per Google's guidelines.
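To make the parameter problem concrete, here is a small Python sketch of how a canonical URL can be computed by dropping presentation-only query parameters, like the `orderby=5d` in the 8notes URL above. The list of ignorable parameters is an assumption for illustration:

```python
# Sketch: strip sort/tab query parameters so that URL variants like
# ?orderby=5d all resolve to one canonical URL.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that only reorder or re-tab the same content (assumed list)
IGNORABLE_PARAMS = {"orderby", "tab"}

def canonical_url(url: str) -> str:
    """Drop presentation-only parameters; keep everything else."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORABLE_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://www.8notes.com/violin/sheet_music/?orderby=5d"))
# -> https://www.8notes.com/violin/sheet_music/
```

The canonical computed this way would go into the page's `<link rel="canonical">` tag, so every sorted or tabbed variant consolidates to one indexable URL.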
As for duplicate titles (because of different instrumental versions), yes, that's something we have tackled several times in the past. We tried removing the duplicates (noindexing them), but it didn't help. I mean, we didn't notice any change even after waiting several months. We just got much lower traffic because of it, and nothing positive.
Also, have a look at how many different versions our competitor has for some of the most popular titles:
Bach's Air on G: https://www.8notes.com/scores/air_on_the_g_string_bach_johann_sebastian.asp
Bach's Minuet: https://www.8notes.com/scores/minuet_bach_johann_sebastian.asp
Beethoven's Fur Elise: https://www.8notes.com/scores/fur_elise_beethoven_ludwig_van.asp
Beethoven's Ode to Joy: https://www.8notes.com/scores/ode_to_joy_(9th_symphony)_beethoven_ludwig_van.asp
Now, I understand we may have many more titles than they do, since our catalog is much bigger, but they still have similar issues, right? If so, why do they have such a privileged spot in the SERPs compared to us? I am sorry to sound like a broken record, but I am still not convinced the problem is all this...
If you have the chance, I am eager to hear your further thoughts... thank you again very much for your help and time, Donna. Much appreciated.
-
Yes, I see what you've done with genres, specials, etc. That looks good.
If I compare you to 8notes, you've got a lot more segmentation when it comes to instruments and those pages are NOT noindexed.
For example, you have 29 different versions of the "Dust in the Wind" sheet music page, all very similar. Here are a few:
- http://www.virtualsheetmusic.com/score/HL-328374.html (Dust in the Wind sheet music for violin)
- http://www.virtualsheetmusic.com/score/HL-328387.html (Dust in the Wind sheet music for trumpet solo)
- http://www.virtualsheetmusic.com/score/HL-301822.html (Dust in the Wind sheet music for choir and piano)
- http://www.virtualsheetmusic.com/score/HL-170563.html (Dust in the Wind sheet music for guitar (chords))
- http://www.virtualsheetmusic.com/score/HL-26501.html (Dust in the Wind sheet music for piano solo)
- http://www.virtualsheetmusic.com/score/HL-119157.html (Dust in the Wind sheet music for piano solo V2)
NOT noindexed doesn't mean they're getting indexed by Google. When I did a site: command for http://www.virtualsheetmusic.com/score/HL-328387.html (Dust in the Wind sheet music for trumpet solo), it wasn't returned as a search result. When I searched for "dust in the wind" trumpet solo virtualsheetmusic, it was not returned as a search result either.
So maybe you need to consider noindexing all the instrument variations as well, but still offer them up on the site for visitors. I'd check analytics to see if anyone's landing on those pages from search. As I said earlier, I can understand why you'd want them indexed but if they're causing you more harm than good, you might have to balance that out.
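If it helps with that audit, something like this Python sketch could check whether a given page's HTML actually carries a robots "noindex" directive (the helper is hypothetical; it only looks at the meta tag, not X-Robots-Tag headers):

```python
# Sketch: detect a <meta name="robots" content="noindex..."> directive
# in a page's HTML, useful when auditing which instrument-variation
# pages are actually excluded from indexing.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)  # attribute names are lowercased by HTMLParser
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in (a.get("content") or "").lower():
                self.noindex = True

def has_noindex(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex

print(has_noindex('<meta name="robots" content="noindex,follow">'))  # True
print(has_noindex('<meta name="robots" content="index,follow">'))    # False
```

Running a check like this over the variation URLs, alongside the analytics review, would tell you exactly which pages are noindexed versus merely not indexed.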
I love a challenge, but that's the best I can come up with, Fabrizo.
-
Thank you Donna for your reply.
Well, I see what you mean, but if you look at how those drill-downs are handled on our site, all pages generated dynamically that way are excluded by robots.txt. That's why I am puzzled to see the competitor's website offering a similar kind of browsing without worrying about duplicate issues. Also, I apply every possible rule to reduce duplicate content as much as I can, even for the few indexed pages, such as canonicals (when parameters are present in the URLs) and rel="prev"/rel="next" for paginated content.
Please let me know if that's what you meant, or if I am missing anything here.
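As a quick sanity check on that robots.txt exclusion, Python's stdlib can verify which URLs are blocked. The Disallow rules below are assumptions for illustration, not our actual robots.txt:

```python
# Sketch: verify that dynamically generated drill-down URLs are blocked
# by robots.txt while canonical product pages remain crawlable.
# The rules here are hypothetical examples.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /search/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A dynamic internal-search URL should be blocked...
print(rp.can_fetch("Googlebot", "http://www.virtualsheetmusic.com/search/?q=violin"))
# ...while a product page stays crawlable.
print(rp.can_fetch("Googlebot", "http://www.virtualsheetmusic.com/score/HL-180308.html"))
```

One caveat worth remembering: a robots.txt block prevents crawling, not indexing, so a blocked URL can still appear in the index if it's linked externally.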
Thank you again very much!
-
"I'd really like to know from you if you see anything on my website that could trigger a "Panda" kind of penalization, compared to my mentioned competitor above (8notes.com)."
_Key phrase being "compared to my mentioned competitor". Because yes, I can see things that might trigger a Panda penalization. You have a lot of overlapping/duplicate content, but so do your competitors._
The only thing that comes to mind is a threshold you're exceeding that your competitors aren't, by virtue of the fact that you have as many (and more) ways to tag/filter your content, for example genres, specials, and ensembles.
-
Yes, I agree with you, I don't see much logic behind that. Of course, if they also count images, we have hundreds of thousands... we are a pretty big website, what do you expect, right?
My website situation makes people "scratch their heads", all the time...
I'd really like to know from you if you see anything on my website that could trigger a "Panda" kind of penalization, compared to my competitor mentioned above (8notes.com). So far, no one on this thread has given me any hints on that.
Thank you again for your insights, I appreciated it very much.
-
I wonder if "indexed URLs" is an accurate label. I looked at the URLs Majestic found, and a significant number are redirected files, images, and mp3s.
You have a perplexing problem Fabrizo...
-
Donna, as a side note, I have no idea where Majestic pulled those 944,000+ indexed pages for our website from. Spidering it with Screaming Frog, we couldn't crawl more than 387,726 pages... unless they crawled all the links dynamically generated by our internal search engine, which is blocked by robots.txt, so all those dynamic pages should not be counted.
Also, on the actual Google index, if you use the site: command, you'll see that Google has indexed just 123,000 pages from our site (because most of them are canonicalized), whereas you'll see over 545,000 for 8notes.com.
Actual data seems to be a little different...
-
Thank you Julie for your posting and for participating in this discussion.
Well, what you say might be true, but since it's an algorithmic penalization, that shouldn't really happen... unless the system is flawed in some way and catches the wrong guys (every time?).
Also, thin and duplicate content is so much more obvious and noticeable on our competitors' sites that it makes me completely mad trying to find a logical explanation for why us and not them!
Unless Panda and similar "quality" updates are now looking for something else that nobody has clearly understood yet...
-
I think it's like the Highway Patrol.
Everyone speeds, but not everyone gets caught. You point to a similar web page that hasn't been penalized, but there are many that have.
-
Thank you Kristen,
I have just put together a plan to re-architect our website that way and create "sub-categories" the same way our competitor has done, to push up the main category pages as well (according to the "siloing" technique).
As I wrote yesterday below, this is also clearly NOT related to Panda... just another needed tweak to the site.
Thank you again, appreciated your help!
-
Thank you Donna! Yes, I am aware of our different backlink profile. We are a commercial website, so we have many backlinks from hundreds of affiliates... and that could cause issues, I am aware of that. I have worked a lot with my affiliates to add nofollow links where necessary and avoid passing PageRank as much as possible... But again, we are talking about issues NOT related to Panda... right?
So... again, this can't explain why the first Panda in 2011, as well as the last quality update released in June (was that really Panda?), hit us hard. I am becoming convinced that it is not Panda hitting us once in a while, but something else... My point is: I could be under "several" penalties, OK, I get that... some Panda penalization, some other quality penalization, maybe Penguin to some extent (though I could never find a clear relation between my traffic losses and the release of Penguin updates)... but if I am really in the eye of Panda when that happens, back to my original question: why has my competitor never been touched by the white & black bear when their content should be much more prone to Panda than mine? That's the whole point of this conversation and the answer I am trying to find. I am trying to find a logical explanation for why my traffic dropped with the release of Panda updates, whereas my competitor wasn't touched at all.
Thank you again for your help, appreciated!
-
You're right. Your site and 8notes seem to be guilty of the same practices, assuming there's anything wrong with them, and yet 8notes is ranking well. Although according to MajesticSEO, it has half the pages you have indexed (520,439 vs 944,432).
Your link profiles are significantly different, though. Again according to Majestic, you have way more backlinks (649,076 vs 234,122) but from half as many referring domains, IPs, and subnets. You have 1/10th the educational backlinks of 8notes. And the majority of your backlinks, roughly 55%, are nofollow, whereas 90% of 8notes' are the opposite (follow). 8notes seems to have more deep links as well.
Maybe it's worth looking a little more closely at your link profile?
8notes is also on https. That might also have a bearing, given you're both ecommerce.
-
In any case, I created this thread to discuss Panda and its "possible" and "not-possible" implications... so the discussion is still open.
In the meantime, while I hopefully await an answer to my questions above from Kristen (thank you again Kristen!), I'd like to get back to the topic: Panda. It is clear to me that my situation could be improved as Kristen suggested above, but it is also clear that, if so, it has nothing to do with Panda, does it? That's just about "content consolidation" and "topical relevance".
Then, back to my original topic: what about the classic Panda issues such as "thin content" and "duplicate content"? As I wrote above, my competitor has plenty of that kind of content but doesn't seem to have been touched by Panda whatsoever. So... what's the deal with Panda then? And in my particular niche, should I worry more about "topical relevance" and keep optimizing my site in other respects (usability, user intent, etc.), and stop worrying so much about thin and duplicate content?
If you were me, what would you do, considering my competitor's evidence? How many other site owners like myself have become "paranoid" about Panda (wasting tons of time, resources, and money) and lost focus on other (probably more important!) issues such as topical relevance, content organization, usability, and user intent?
More thoughts on that?
-
This is a very good answer, thank you Kristen. The more I look at my competitor's "site structure" compared to ours, the more I realize we need to work on that.
I have also started thinking about the so-called "siloing" technique Bruce Clay introduced a few years ago, and it looks like 8notes.com has done a very good job of following that concept, whereas we are probably "spreading" too much of our juice across thousands of different pages and categories... what are your thoughts on that?
Just a thought about the fact that I have blocked those pages via the robots.txt file: if you look at them, you'll see they are generated dynamically by our internal search engine. And as you can see, you can filter results by clicking the filters on the left side of the page... which is great for users but can be problematic for search engines, right? So that's why I decided to simply block those pages from robots, to avoid any possible indexing and crawling issues. How would you suggest tackling that problem? My first idea would be to create "static pages" for those dynamic pages, linked from the category pages, and then block the filter links on the left side... do you have any different ideas?
Thank you again for your help! Super-appreciated!
-
Thank you Kristen for your kind reply.
Yes, of course, I have considered that a thousand times. Do you really think that could cause enough trouble to make most of our rankings slip beyond the 10th or 20th page of the SERPs? The pages you mentioned on our website are actually blocked by robots.txt, with the exception of the first page, of course. My concern is about those first pages, which should be able to rank anyway... unless you're telling me that the contextual weight of the "subsequent" pages could play a role in "boosting" the first page in some way, if Google can spider and index them... but then I'd be concerned about "too much similar or thin content", because by doing what our competitors are doing, I'd create thousands of additional pages containing pretty much the same content (lists) organized in different ways... you see what I mean? Of course it seems to work for our competitor, but hence the contradiction and absurdity I was talking about above with the Panda algorithm: shouldn't all those thousands of extra, similar pages be bamboo for Panda?
I hope I haven't confused you... I am just trying to find the elephant in the room that is causing the problem...
Thank you for your help!
-
Thank you Donna for your reply, but that's exactly why I posted my concerns here: how do I know if a page or a set of pages is "causing" me not to rank? That's the purpose of this discussion... how do I actually know that any action of that kind, "nuking" pages, will bring me more benefit than damage?
A couple of years ago, I thought we were under a Panda penalty, and I started removing from the index many of our product pages that I thought weren't bringing us traffic or had low user engagement... well, the only result was a steady decline in traffic because of the removed pages, that's all. I put all the pages back after 6 months because otherwise we would have died miserably, and concluded that we were NOT under a Panda penalty. In fact, once I put all the pages back, traffic returned and we started to rank better than before (a fortunate coincidence??).
Also, if we are under a Panda or similar penalty, why do we still rank well for some keywords? And, back to square one: why should we think we are under a Panda penalization if our content is actually less thin, less duplicated, and better handled with canonicals, noindex tags, etc. than our well-ranked competitors'??!
-
The only thing I can think to suggest is looking at how much inbound search traffic you are receiving on your category pages (e.g. genres, instruments, skill levels, exclusives, specials, etc.) to assess whether any of them could be noindexed. I understand why you'd want them indexed, but if they're causing you not to rank at all, then you might have to balance that out.
-
Thank you Julie, appreciated!! I really don't understand why, most of the time, our website's content is buried in the search results, whereas "crappy" websites are shown prominently ahead of us... do you call that "quality", big G??!!
Thank you again for your kind words
-
I feel bad for you. I don't have any answers, but I'm a singer, and your website is excellent. This is not an example of Google rewarding quality.