Help with Duplicate Content Issue for pages...
-
I have pages with duplicate content, and I want to put them on hold while I write unique content, as I do not want to get marked down for it. I also want to keep the URLs and use them again.
There are about 300 pages affected by duplicate content currently. Am I best doing 302 redirects to the original source of the content, since it's temporary, or canonical tags / noindex?
The pages are currently indexed and cached by Google, and I want to use the URLs in the future for unique content so Google values it.
Any advice much appreciated.
Kind Regards,
-
Can this work as a temporary thing?
So if I put a canonical tag in and then take it out, once it's removed and the page is crawled, will Google then value it as unique content?
I just don't want to do anything that makes Google think it's permanent.
-
I would suggest using the canonical tag on the duplicate pages until you come up with new content.
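For reference, a canonical tag is a single `<link>` element in the `<head>` of each duplicate page, pointing at the page you want Google to treat as the original. A minimal sketch (the example.com URLs are placeholders for your actual pages):

```html
<!-- In the <head> of the duplicate page, e.g. /destination/keyword -->
<!-- href points at the original source of the content -->
<link rel="canonical" href="https://www.example.com/destination" />
```

When you later publish unique content on the page, you'd remove the tag (or make it self-referencing) and let Google recrawl.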
-
Thanks for the advice so far, everyone.
Some of the pages are getting traffic, and all seem to be indexed - up to 1,000 it seems.
It's 3-4 paragraphs of text, so a little more than a product description.
It's a difficult one to call, as I need to save the URLs but not get penalised for duplicate content.
-
Perhaps 302 to the folder (/destination/) until you're ready to use the content on a page. Implementing the canonical tag here probably won't make a difference as there will only be a small number of 'live' pages.
Good luck Paul
Rob
-
I would just focus your efforts on creating unique, optimised content for these pages. If they are just duplicate product descriptions, Google will know that, and I doubt it is having a major impact on the rest of the site.
What percentage of your site do these 300 pages account for? How quickly can you rewrite the content on the pages?
You could noindex them, but unless there is a proven impact on your unique pages, I think you are likely wasting your time.
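If you do go the noindex route while rewriting, it's one meta tag per page, which is easy to strip out once the unique content is live. A sketch (exact markup depends on your platform):

```html
<!-- In the <head> of each duplicate page; remove once unique content is published -->
<meta name="robots" content="noindex, follow" />
```

Using `follow` rather than `nofollow` lets crawlers continue to follow links on the page while it is kept out of the index.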
-
Thanks for the advice.
On further looking, there are more than 100 pages which have duplicate content.
The URLs are similar, for example:
/destination
and then
/destination/keyword
I have been told to keep the URLs and put unique content on them as I go on progressing the site. I just don't know what the best way to do it is, whether it is a 302, canonical, etc.
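If the 302 option wins out and the site is on Apache, the URL pattern above could be redirected with a couple of lines in .htaccess. A hedged sketch: `/destination` stands in for your actual parent path, and this assumes mod_rewrite is enabled:

```apache
# Temporarily (302) redirect each keyword sub-page to its parent page
# Remove the rule per-URL as unique content becomes ready
RewriteEngine On
RewriteRule ^destination/([^/]+)/?$ /destination [R=302,L]
```

Because the redirect is a 302, the old URLs should stay eligible to be reused once the rule is removed.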
-
Sounds like a lot of work!
If there's a lot of dupe content are the URLs also quite closely matched in terms of keyword use? Are you going to end up just consolidating a lot of the content on a smaller number of pages? If that's going to be the case then perhaps a 301 to the root domain, or to the best of the current pages, would be better.
Cheers
Rob
-
Well, I might need the redirect to be there for up to 12 months. There's quite a lot of content to do.
-
A 302 redirect does indicate a temporary move, but how temporary is the move? If you're going to have this content sorted quite quickly, then you might leave the pages as they are for now.
Are you falling short on ranking currently because of dupe issues?
Don't forget about using internal anchor text to inform Google which pages are relevant for certain keywords.
Cheers
Rob
-
Well, yeah, I have a few pages with duplicated content, and I don't want to get penalised for it. However, I want to eventually write unique content for them and use the URLs. As they are cached and indexed already, I am wondering what the best solution is. I don't want a permanent redirect, as I want to use the URLs again.
-
Are there 300 pages that you're considering a 302 for?