RSS Feed - Dupe Content?
-
OK so yesterday a website agreed to publish my RSS feed and I just wanted to check something.
The site in question is far more established than mine, and I'm worried that, with my content appearing on their website at practically the same time as on mine, Google will index their copy first and therefore treat mine as the duplicate.
They link back to each of my articles with the anchor text "original post", but I'm not sure whether that helps.
Thanks in advance for any responses!
-
I'd personally be cautious about this. Here's what I'd do:
- Publish a few articles on your site only.
- Note how long it takes Google to index them.
- If that's a long time (i.e. more than 24 hours), try a few social bookmarks: drop the link on StumbleUpon, Twitter, Facebook, etc. when you publish. If you're using WordPress, you can have it do this automatically.
- Now measure again how long Google takes to index your pages.
- Set up your RSS feed so it updates 24 or 48 hours after the content is added to your site, or whatever delay it takes for each post to be indexed on your site before it appears on theirs.
- You can even delay this one feed without affecting your others: http://wordpress.stackexchange.com/questions/1397/delaying-one-rss-feed-in-wordpress-but-not-the-others
- This way, you don't need the other site to change anything on their end.
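The delay step above can be sketched in a few lines. This is a hypothetical standalone filter, not actual WordPress code (the linked Stack Exchange answer shows the WordPress-specific approach); the item structure here is invented for illustration:

```python
from datetime import datetime, timedelta, timezone

def delay_feed_items(items, delay_hours=24, now=None):
    """Return only the feed items whose publish time is at least
    `delay_hours` old, so the syndicating site picks them up late."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(hours=delay_hours)
    return [item for item in items if item["published"] <= cutoff]

# Example: one fresh post and one from 30 hours ago
now = datetime(2011, 6, 1, 12, 0, tzinfo=timezone.utc)
items = [
    {"title": "Fresh post", "published": now - timedelta(hours=2)},
    {"title": "Older post", "published": now - timedelta(hours=30)},
]
delayed = delay_feed_items(items, delay_hours=24, now=now)
print([i["title"] for i in delayed])  # only the post older than 24h survives
```

With a 24-hour delay, only the older post makes it into the feed, giving Google time to index the original first.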
-
Personally, I don't think so, but to be on the safe side you can always ask them to add a canonical tag to the syndicated copies, pointing back to your original articles, and the issue is resolved for good. For what it's worth, I've seen many websites that copy others' content, and this problem rarely arises; in my opinion Google is good at identifying the original source.
-
Yes, you are taking on some risk here; there is no real need to feature the same content on an external website. That said, if you can convince the site owners to place a cross-domain rel="canonical" tag on every article that gets published there, pointing to the original article, you would have nothing to fear.