All Thin Content removed and duplicate content replaced. But still no success?
-
Good morning,
Over the last three months I have been removing and replacing all the duplicate content (1,000+ pages) from our site, top4office.co.uk.
It has now been just under two months since we made all the changes, and we are still not seeing any improvement in the SERPs.
Can anyone tell me why we aren't making any progress, or spot something we are not doing correctly?
Another problem: although we have removed 3,000+ pages using the removal tool, searching site:top4office.co.uk still shows 2,800 pages indexed (previously there were 3,500).
Look forward to your responses!
-
Thanks for your responses. We are talking about over 3,000 pages of duplicate content, which we have now removed and replaced with relevant, unique, and engaging content.
We completed all the content changes on 06/06/2013. I'm thinking of leaving it for a while and seeing whether our rankings improve within the next month or so. We may consider moving the site to another domain, since it features lots of high-quality content.
Thoughts?
-
I've had two sites with Panda problems. One had two copies of hundreds of pages in both .html and .pdf format (to control printing format). The other had a few hundred pages of .edu press releases republished verbatim at their request or with their permission.
Both of these sites had site-wide drops on Panda dates.
We used rel=canonical on the .pdf documents on one site using .htaccess. On the site with the .edu press releases we used noindex/follow.
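For anyone wanting to replicate those two fixes, here are minimal sketches. The file names and domain below are illustrative placeholders, not the actual sites involved. The PDF case uses an HTTP `Link` header set in .htaccess (requires Apache's mod_headers), since you can't put a `<link rel="canonical">` tag inside a PDF:

```apache
# Send a rel="canonical" HTTP header pointing the PDF copy at its HTML original.
# Requires mod_headers; file name and domain are illustrative placeholders.
<Files "whitepaper.pdf">
  Header add Link '<http://www.example.com/whitepaper.html>; rel="canonical"'
</Files>
```

The republished press-release pages instead got a robots meta tag in the page `<head>`, which keeps them out of the index while still letting their links be followed:

```html
<!-- Keep the page out of the index but let crawlers follow its links. -->
<meta name="robots" content="noindex, follow">
```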
Both sites recovered to former rankings a few weeks after the changes were made.
If you had a genuine Panda problem and only a Panda problem then a couple months might be about the amount of time needed to see a recovery.
-
That's hard to say. With a recent history and a link profile like yours, your site won't have the authority it needs for the index to update as frequently as you would like. It's also possible that a hole has been dug that you cannot climb out of simply by reversing the actions of your past SEO.
You really need a thorough survey of your site, its history, and its analytics to determine the extent of the current problem and the best path out of it. Absent that, shed what bad backlinks you can and develop a strategy to build visitor engagement with your brand.
-
The site has not received a manual penalty from Google.
However, traffic and generic keyword rankings fell when the previous developer decided to copy all of the products directly from our other site, top4office.com.
The site was ranking pretty well in the past. Do you have any kind of ETA for when the updates will take effect?
-
Hi Apogee
It can certainly take several months for pages to drop from the index, so if you've removed the pages from the site and requested URL removal in GWT, they'll eventually fall out of the index.
Was the site penalized, and that's why you removed/replaced the dupe content--meaning, were you ranking well and then, all of a sudden, your rankings tumbled? Or are you just now working to build up your rankings? This is an important distinction, because there are few examples of sites that received a Panda penalty (thin/duplicate content) coming back to life.
If you don't think you've been penalized and you're just working to optimize your site and pull it up in the rankings for the first time, consider how unique your content is and how you're communicating your unique value proposition to the visitor. Keep focusing on those things.
Also, your backlink profile looks a bit seedy--in fact, your problem could well be Penguin-related. If you were penalized and it was a Penguin penalty, you should be looking to clean up some of those links and working to build new ones from more thematically relevant sites.
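If cleanup does come down to disavowing links you can't get removed, Google's disavow tool accepts a plain-text file uploaded through Webmaster Tools. A minimal sketch of the format (the domains and URL below are made-up placeholders):

```text
# Contacted the owner of spamdomain1.example on 01/06/2013 to ask for
# link removal; no response received.
domain:spamdomain1.example
http://spamdomain2.example/page-with-paid-link.html
```

`domain:` lines disavow every link from that domain; bare URLs disavow a single page. Use it as a last resort after genuine removal attempts.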
-
Removing duplicate content won't necessarily improve your search positioning. It will, however, give your site the foundation needed to start a (relevant, natural, and organic) link-building campaign, which, if done correctly, should improve your SERP positions.
You should see content as part of the foundation. Good-quality, unique content is usually needed in order to be rankable, but it doesn't necessarily make you rank.
Having good-quality, unique content will also minimise the chances of being hit by an algorithm update.