Would Panda target this?
-
Hi guys,
We suffered a massive rankings drop in September 2012, the same date as Panda 20, and we've been trying to fix the issues since then with little to no success.
I think these Q&As work best if I ask a specific question instead of just screaming for help, so hopefully we're looking in the right place at least.
One area I've been looking into is, of course, content (this being a Panda penalty). However, I'm not sure what about our content is causing a problem. We provide a phone unlocking service and have over 6000 handsets that we can unlock. We only allow search engines to index 5 of them, as those are the ones with unique product descriptions (there are over 100 more, but we want to start getting our rankings back a bit at a time). We also let them index our manufacturer, news and support pages: around 160 in total.
On our handset and manufacturer pages we have much of the same content, with a few words changed to reflect the price or the name of the manufacturer/phone. We also vary the delivery times for some, as they can differ, and have a "Why use us" section which is the same on each handset page.
In my mind there is no point changing these areas to be unique to each page, as they clearly describe our service and what we offer. Changing each one for each page, especially if we wanted to start adding the remaining 5995 handsets, would be ridiculous. It would also clearly be manipulative if we're just rewriting the same thing in a slightly different way to benefit a search engine and not the users.
Does anyone know if this type of content would be seen as duplicate content and would result in a penalty? And is there anything we can do about it?
Thanks,
Darren. -
Hey Chris and Kurt,
Please do be aware that we've noindexed the majority of those pages, including around 100 that have unique content on them. Our aim is to concentrate the site on a handful of select pages to see if that works first (as it should in theory).
We've certainly paid a lot of attention to our content, which isn't a new thing for us. We've included videos of people unlocking the handset where we can - I've created some myself to broaden our reputation on YouTube, and they also make their way onto our site. We have instructions for a large number of handsets on the pages, along with unique images and descriptions. I've even worked on numerous other "big" pieces for publication on our news pages, which have earned links from the Washington Post, Huffington Post, TUAW, IntoMobile, etc., and even phone networks. To manage it all, we noindex the pages with "thin" content, and previously we blocked Googlebot from reaching them in the first place.
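For anyone following along, the distinction between those two approaches matters: a meta robots noindex tag only works if Googlebot is allowed to crawl the page, whereas a robots.txt Disallow stops it from crawling at all, so it never sees the noindex. A minimal sketch (the URL path is hypothetical, not from Darren's site):

```html
<!-- In the <head> of each thin handset page: keep it out of the index
     while still letting crawlers follow the links on it -->
<meta name="robots" content="noindex, follow">
```

Note that if those same pages are also blocked in robots.txt (e.g. `Disallow: /handsets/`), Googlebot can never fetch them to see the noindex, which can leave stale URLs lingering in the index. Pick one mechanism per page, not both.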
Having unique content on all pages shouldn't be, and I don't believe is, necessary (there are various examples to support that). We've even stuck close to the rules by incorporating noindex like a boss for content we don't think should be included.
However, one of the kickers is that I've been reviewing our industry and our competitors, noticing time and time again that a lot of the ranking sites have even less unique content than we do. They stick solidly to a script, never creating a hint of unique content, even posting unlocking videos for unrelated phones on certain pages. Yet they rank. They pay no attention to robots.txt or to noindexing crap content, and yet away they go blissfully into the SERPs sunset.
These competitors certainly don't have the same links as us or comparably useful content. I'd consider myself pretty knowledgeable about SEO, yet this makes no sense.
It's difficult to include everything we've done in these posts without making them stupidly long, but we've done a lot of the "standard" stuff. We've been working on getting this right for over a year, even tossing aside an extremely powerful previous domain (with some success).
If Google were a person, I'd put a bag over their head and see how they like it.
-
Great response Chris.
Darren,
The issue for Google, especially when dealing solely with the algorithm, is that they don't want to display duplicate content, and they don't always get the subtlety of a situation like a phone unlocking process. From your perspective (and the user's), you know that each page is necessary because there are subtle differences in the unlocking process for different phones: maybe just one different step or a different menu title. It's a small change, but for a user who's never done it before, it's significant.
For Google, though, if the content on the 6000 pages is essentially the same with just a few words changed in each, it looks like spam. There have simply been too many low-quality sites that put up a page for every keyword in their industry but use essentially the same content on all of them to try to rank for everything. Google has to play the percentages. While it may make sense in your situation to have different pages with only subtle differences in content, it doesn't for most niches, so they penalize everyone who does it. That's just the world of Google that we live in.
So, the question becomes, "How do I create unique content for each of these pages that I want indexed to appease Google?" Chris has some great suggestions for this. I know that it seems daunting when you are talking about 6000 pages, but that's just the reality. Also, keep in mind that, as you are doing now, you don't need to deal with all your pages at once. This is what has to be done for each page you want indexed and ranked in Google. So, you can take it a bit at a time if you need to... or outsource some or all of the work to someone on oDesk, or find a way to get your customers to create the content for you.
Kurt Steinbrueck
OurChurch.Com -
purple,
5 pages like that isn't enough to get you a penalty, although 6000 thin/duplicate pages like that would be enough to put your site in a class with spam sites. So the question is: how do you deal with those other 5995 pages when the time comes to put them back up, if it comes? I'd think about breaking them down first by manufacturer (with a strong page on your unlocking service for each one), then by model type (with a strong page for each one), and then you might start breaking them down by specific model number. With so many new model numbers coming out each year, it seems like you'd keep your hands full creating content for just the new ones, let alone the old ones, but you could work your way back, prioritizing those that bring you the most business.
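That manufacturer → model type → model number breakdown maps naturally onto a URL hierarchy. A hypothetical sketch (all paths are illustrative, not from the actual site):

```text
/unlock/                        hub page for the unlocking service
/unlock/samsung/                manufacturer page: strong, unique unlocking content
/unlock/samsung/galaxy/         model-type page: unlocking info for that line
/unlock/samsung/galaxy-s3/      specific model: only opened to indexing once it
                                has unique instructions, video, images, etc.
```

The deeper pages can stay noindexed until they've earned unique content, so each tier you open up adds indexable pages only as fast as you can make them genuinely distinct.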
As far as the content itself, it could be videos showing how the consumer could unlock each specific model themselves, interviews with owners of unlocked phones, or information about the phones themselves (development history, carriers that sell them, sales specs, technical specs, OSes used...). There's a whole ocean of information you could be giving your audience that pertains to the world of cell phones, unlocking, carriers, mobile devices, manufacturers, OSes, etc. Standardize on a number of specific data points from those areas that you think best give the audience a picture of your brand's philosophy, and include them in the content for each new page you create. Remember, you've got to think like a publisher if you're going to pull yourself out of that penalty.