How to change URL of RSS Feed?
-
Hi,
There are some websites that keep scraping my content. I have already blocked them from accessing my website using .htaccess, but they still get my content via my RSS feed.
I have tried delaying the RSS feed, but I think this affected my Google rankings. My question is: is there a way to change the URL of my RSS feed?
From: http://www.mysite.com/feed to http://www.mysite.com/feed2
-
I think you had better consider legal steps. If they have access to your content from a third domain or IP, they can also get it from a fourth or fifth one. So no matter how many IPs you block, if they know your feed address they can subscribe with a completely new one. In my opinion, if this is the case, then a legal solution is the best option for you. Copyright your articles.
-
Hi Zsolt,
Unfortunately, the scrapers are smart enough to remove the links in the content that they copy. I don't know how they do it, but even though my post has links and the h1 title is also a link, they are able to strip the URLs out.
-
Maybe one of the simplest solutions is not to block those sites from accessing your content at all. Make your h1 tag a link to the actual post: if you have a post titled post1 at the URL domain.com/post1, the post1 heading at the top of the page should point to domain.com/post1. Then if anybody steals your content, their copy will point back to the original on your site, so of all the copies, yours will be the strongest one with the most links.
I would also place links in the content body pointing to my other pages, so anybody copying my content would be appreciated, as they are giving a handful of backlinks in return. I would also add a little info panel at the bottom: this article was originally posted on www.domain.com and written by xy.com. Find similar articles here: domain.com/relatedposts.
In the meantime, I would post a legal statement saying that copying my content is all right, but only with the links included.
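The info-panel suggestion above can be automated so every feed item carries the attribution. A minimal sketch in Python, assuming the feed XML can be post-processed before it is served (the function name and example URLs are illustrative, not from this thread):

```python
import xml.etree.ElementTree as ET

def add_attribution(feed_xml, site_url, site_name):
    """Append a source-attribution line to every item's description,
    so any copy of the feed links back to the original site."""
    root = ET.fromstring(feed_xml)
    for item in root.iter("item"):
        desc = item.find("description")
        if desc is None:
            desc = ET.SubElement(item, "description")
            desc.text = ""
        # Fall back to the site URL if an item has no <link>
        link = item.findtext("link", default=site_url)
        desc.text = (desc.text or "") + (
            '<p>This article was originally posted on '
            '<a href="{0}">{1}</a>: <a href="{2}">{2}</a></p>'.format(
                site_url, site_name, link)
        )
    return ET.tostring(root, encoding="unicode")

feed = """<rss version="2.0"><channel><title>Example</title>
<item><title>post1</title><link>http://www.mysite.com/post1</link>
<description>Body text.</description></item>
</channel></rss>"""

print(add_attribution(feed, "http://www.mysite.com", "mysite.com"))
```

Scrapers that strip markup can still remove this, but plain-text attribution with the full URL survives more copies than an anchor tag alone.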
-
Hi Zsolt,
What might be the best solution to prevent the scrapers from stealing my content? Some examples of scrapers of my site are
and many more... I have tried blocking these sites in .htaccess, by both IP and domain name, using the following code format:
Block by IP and domain:
order allow,deny
allow from all
deny from 208.43.239.80
deny from hownews.info
But even though this code is already in the .htaccess, they can still scrape my content. I know they are getting it from my RSS feed, because when I put in some code to delay my RSS feed, they were not able to get my latest posts.
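Blocking by IP alone is easy to sidestep, so it can also help to block on the headers a scraper sends. A hedged sketch in Apache 2.2-style .htaccess; the "BadScraper" pattern is a placeholder you would replace with strings taken from your own access logs:

```apache
# Flag requests whose User-Agent matches a known scraper signature
# ("BadScraper" is a placeholder, not a real bot name)
SetEnvIfNoCase User-Agent "BadScraper" block_me
# Flag requests referred from a scraper's domain
SetEnvIfNoCase Referer "hownews\.info" block_me

order allow,deny
allow from all
deny from env=block_me
deny from 208.43.239.80
```

Many feed scrapers fetch with generic or spoofed user-agents, so header-based rules are best-effort, not a guarantee.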
I hope you can help me find out how to put a stop to these scrapers, as they are really hurting my rankings. They even rank higher than my site, which is the source of the original content.
Thanks in advance....
-
You have to contact your programmer to change the URL for you, or in some CMS systems you can do it yourself in the backend.
What exactly do you mean by scraping? If they steal your content, then using a new URL is not the best solution for you.
RSS and rankings: an RSS feed usually contains the same information that is already available on your site at some URL (not in all cases, of course, but usually). If that is the case, then the feed only has a negative effect on your rankings, as it duplicates the content: the exact same text that you can find at domain.com/xy can also be found at domain.com/feed/xy. So if that is the case, you should not worry about your rankings.
If you change your URL, you should also redirect the old one at the same time; if you do not, then everyone who is already subscribed will lose your feed, and you do not want that. But if you do redirect, then anybody who knew the old URL will get to the new one anyway. I think it is pointless.
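If you do decide to change the feed URL anyway, the redirect can live in the same .htaccess. A sketch using Apache's mod_alias; RedirectMatch is used instead of a plain Redirect because Redirect matches by URL prefix, and /feed is a prefix of /feed2, which would mangle requests for the new path:

```apache
# Permanently send old feed subscribers to the new feed URL
RedirectMatch 301 ^/feed/?$ http://www.mysite.com/feed2
```

Note that a 301 carries the scrapers along to the new address too, so this only shakes off subscribers that ignore redirects.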
I would block the IP addresses I do not want to have access to my content. You could also try the legal route: state that nobody is allowed to use your copy on their own site. It is easy to find out if anybody does.
Related Questions
-
URL has caps, but canonical does not. Now what?
Hi, Just started working with a site that has the occasional URL with a capital letter, while the URL in its canonical tag is lower case. Neither, when entered in a browser, resolves to the other. It's a Shopify site. What do you think I should do?
Technical SEO | 945010
-
Changing URIs
About 2 months ago, my client changed its domain, using a 301 redirect, and took the hit in search results. Here's my question: if I convince the client to revert to their old URI, will that help or hurt their SEO? As a follow-up, can it hurt to change interior URIs to something that better reflects the page content? Thanks.
Technical SEO | mollykathariner_ms
-
Best way to deal with these urls?
Found overly dynamic urls in the crawl report. http://www.trespass.co.uk/camping/festivals-friendly/clothing?Product_sort=PriceDesc&utm_campaign=banner&utm_medium=blog&utm_source=Roslyn Best way to deal with these? Cheers Guys
Technical SEO | Trespass
-
How to change primary language of the website?
Problem: there is a domain.com whose primary language is Lithuanian; we want to switch it to English. The English content is on the website, fully translated, under domain.com/en/english-url. Question: how do I switch the English content to domain.com while moving the Lithuanian content to domain.com/lt/lithuanian-url? The purpose, of course, is not to lose either the English or the Lithuanian organic traffic. Possible solution: the only solution I thought of is to 301 the English /en URLs to domain.com and to 301 the Lithuanian domain.com URLs to /lt. Is that everything I should do, or are there other meta tags, server-side settings, or other things I should be worried about?
Technical SEO | SEO_MediaInno
-
500 Server Error on RSS Feed
Hi there, I am getting multiple 500 errors on my RSS feed. Here is the error:
Title: 500 : Error
Meta Description: Traceback (most recent call last): File "build/bdist.linux-x86_64/egg/downpour/init.py", line 391, in _error failure.raiseException() File "/usr/local/lib/python2.7/site-packages/twisted/python/failure.py", line 370, in raiseException raise self.type, self.value, self.tb Error: 500 Internal Server Error
Meta Robots: Not present/empty
Meta Refresh: Not present/empty
Any ideas as to why this is happening? They are valid feeds.
Technical SEO | mistat2000
-
Category URL Duplicate Content
I've recently been hired as the web developer for a company with an existing web site. Their web architecture includes category names in product urls, and of course we have many products in multiple categories thus generating duplicate content. According to the SEOMoz Site Crawl, we have roughly 1600 pages of duplicate content, I expect primarily from this issue. This is out of roughly 3600 pages crawled. My questions are: 1. Fixing this for the long term will obviously mean restructuring the URLs for the site. Is this worthwhile and what will the ramifications be of performing such a move? 2. How can I determine the level and extent of the effects of this duplicated content? 3. Is it possible the best course of action is to do nothing? The site has many, many other issues, and I'm not sure how highly to prioritize this problem. In addition, the IT man is highly doubtful this is causing an SEO issue, and I'm going to need to be able to back up any action I request. I do feel I will need to strongly justify any possible risks this level of site change could cause. Thanks in advance, and please let me know if any more information is needed.
Technical SEO | MagnetsUSA
-
Ignore Urls with pattern.
I have 7,000 URL warnings because of a 302 redirect. http://imageshack.us/photo/my-images/215/44060409.png/ I want to get rid of those; is it possible to exclude these URLs with robots.txt, for example so that nothing with /product_compare/ in its URL gets crawled? Thank you
Technical SEO | levalencia1
-
URL parameter reduction plug in
Anyone know of a good plug-in that reduces the number of parameters used in URLs? I need one for an ASP-based system and a PHP-based system.
Technical SEO | matmox