How should we deal with press release content?
-
We work with a company that creates content and distributes it as press releases. We would like to use this content on our blog, so we pull it in via RSS, which means our blog carries the same content. We now have some concerns about duplicate content.
How do you deal with this? Could we be penalized for duplicate content?
-
Are you distributing those press releases to other sites (via RSS)? In other words, do they appear on outside sites in identical form?
If so, you could use a cross-domain rel=canonical or a syndication-source tag and point your copies to the original press releases. That should help offset any duplicate content issues, although it will keep you from ranking directly for those press releases.
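For illustration, the syndicated copy on your blog would carry a canonical link in its `<head>` pointing at the original publisher's URL. Here is a minimal sketch in Python that shows what the tag looks like and checks that a page emits it (the URLs are made-up placeholders, not real press-release addresses):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

# The syndicated copy on your blog, declaring the original as canonical:
page = """<head>
<link rel="canonical" href="http://original-publisher.example.com/press-release-1" />
</head>"""

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # http://original-publisher.example.com/press-release-1
```

Running a quick check like this across a sample of syndicated posts makes it easy to confirm every copy points back at its source.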
It really depends on the goals/scope. It's not uncommon to cross-post press releases, and if you're talking about one a month or so on a site with dozens of pages (or more), it's not a big issue. If you're talking about 100s of press releases, then you could certainly run into trouble.
-
Hi Kauelinden,
You could use the link rel="canonical" tag or the link rel="subsection" tag to help search engines identify the more important of the two. You should look into the various LINK REL values to see which one serves you best.
Regards,
Alsvik
Related Questions
-
Tool to identify duplicated content on other sites
Hi, does anyone know of a tool that could be used to identify whether a site is using our content without permission? Thanks
Content Development | turismodevino1
-
Stolen Content and a Panda Penalty
Hey folks, a question for those of you who have spent some time helping people with the recent penalties and the like. I have a client with a clear Panda penalty: a huge drop in traffic on the initial Panda date, a further drop on the second date, and much smaller incremental drops on subsequent updates as well. From digging in, it seems fairly cut and dried: Copyscape shows another 250 or so sites with content from this site, and there are nearly 2,000 external URLs with duplicate content across those sites. We are talking complete, shameless copies of all of the text, sometimes the images as well. The client claims the content is all 100% unique, that it is his content, and that the other blogs must have stolen it, resulting in the penalty. Which, if true (and I have no reason to suspect otherwise), kind of sucks.
Now, many moons ago, way before Penguin or Panda (maybe around 2006), I had a client that had suddenly lost all traffic and their historical rankings. No funny business: it was a small company, online since around 2000; they were pretty much the first of their kind and always did very well from organic search. As it turned out, the content on the site had not really changed since it was set up, and as lots of companies sprang up offering a similar service, the content was copied wholesale, across many sites, all over the world. We attempted to contact many of these sites and got some results, but many were just old, abandoned copycat sites on advert-supported hosting that had ceased to trade, so we maybe got rid of about 20%. In the end we just decided to rewrite the content, and sure enough, the site bounced back to its previous standing and has been pretty much there ever since.
Now, that was kind of easy: the site had maybe 20 pages and needed a sprucing up anyway. In this case the site has around 500 pages, so a rewrite is not going to be so easy. Problem is, I don't see removal requests being particularly successful either. So, I see the options and steps as being:
1. Contact all the sites and request the removal of the content
2. Use the Google content removal facility: https://www.google.com/webmasters/tools/removals
3. File a DMCA takedown for anything remaining
4. Report scraped pages to Google: https://docs.google.com/spreadsheet/viewform?formkey=dGM4TXhIOFd3c1hZR2NHUDN1NmllU0E6MQ&ndplr=1
5. Submit a spam report for all sites involved (?)
6. Submit a reconsideration request to let Google know what we have been doing (unlikely)
In a nutshell: do everything we can to get this content removed, then document it to Google in the hope we catch hold of someone who hears our plight. Interestingly enough, this is a sensitive one, so no URL, but I would welcome any thoughts or experiences any of you may have had with similar problems. There is a little extra info here from Matt Cutts and Barry Schwartz that kind of tallies with my approach above, but I would really like to hear any feedback: http://www.seroundtable.com/google-stolen-content-13243.html Cheers all, Marcus
Content Development | Marcus_Miller
-
How to produce amazing content
How do I get started and produce some really fantastic content on a regular basis? I am in the weight-loss niche and don't want to produce garbage. Where should I start?
Content Development | Sajiali
-
Content Marketing Strategy for a Solar Panel Shop
Hi everybody. Everybody's talking about content marketing strategies these days, and the videos from MozCon have definitely been a great inspiration for me. I have a customer who sells solar panels (wiki) here in Denmark, and they are more or less hooked on the idea of investing long term. This is where I want to ask you for ideas for the strategy. I'm open to anything from overall guidelines to specific content ideas. I hope you will chip in, thanks a lot 🙂
Content Development | loevgaard
-
How much content is needed?
I have two clients whose websites have landing pages that feature a number of product links. In order to meet SEO/Google best practices, do I need additional content on these specific pages, or will the links suffice? (Getpaper is an ecommerce site; Inpak is not.) Any thoughts would be appreciated. http://www.getpaper.com/find-paper/inkjet-plotter-paper/color-bond-21-lb http://www.inpaksystems.com/bag-closing/bag-sewing
Content Development | TopFloor
-
Duplicate content on the homepage
Hello, SEOmoz is giving me a duplicate content error on my site. When viewing the details, it shows the following as duplicated content:
domain.co.uk/
domain.co.uk
domain.co.uk/index.html
Obviously these are the same page. Why is it seeing them as separate? Does anyone know how I can resolve this issue? Many thanks
Content Development | lcdesign
-
Best way to resolve a duplicate content issue?
Not sure what to do about this. I have a client with a ton of pages (around 1,200) which are all city-specific pages, for long-tail search. These are all written with paragraphs in a format such as: "Order to [City] today." So every page has essentially the same content. The site only has 1,562 pages in total, so with 1,200 of them being city-specific same-content pages, that can't be good. However, these pages still rank very well (usually position 1 or 2) for the terms they're targeting, and bring in enough traffic and revenue to justify their purpose. We also have country-specific pages, and these all have unique content rather than the scripted content on the city pages. So for Italy, for example, we might have: Italy page (unique content); Rome (duplicate content); Milan (duplicate content); Venice (duplicate content); etc. For a low-traffic country (Austria), we tried to 301 the city pages to the country page, but that only resulted in a drop in search results for the city keywords, from (usually) position 1 to more like page 3 or 4, so quite a drop. So, without writing 1,200 pages' worth of unique content, what would your advice be?
Content Development | TME_Digital
-
Displaying archived content articles on a writer's bio page
My site has writers, and each has their own profile page (accessible when you click their name inside an article). We set up the code so that the bios, in addition to the actual writer photo/bio, dynamically generate links to each article the writer produces. We figured that someone reading something by Bob Smith might want to read other stuff by him. Which was fine, initially. Fast forward, and some of these writers have 3, 4, even 15 pages of archives, as the archive system paginates every 10 articles (so www.example.com/bob-smith/archive-page3, etc.). My thinking is that this is a bad thing. The articles are likely already found elsewhere in the site (under the content landing page they were written for, for example), and I visualize spiders getting sucked into these archive black holes, never to return. I also assume that it is just more internal mass linking (yech) and probably doesn't help the overall time-on-site/bounce/exit metrics. Thoughts?
Content Development | EricPacifico