RSS feeds: what are the secrets to getting them, and the links inside them, indexed and counted for SEO purposes?
-
RSS feeds, at least on paper, should be a great way to build backlinks and boost rankings. They are also very seductive from a link-builder's point of view: free, easy to create, and they allow you to specify anchor text. There are even several SEO articles, and a few products, extolling the virtues of RSS for SEO purposes.
However, I hear anecdotally that they are extremely ineffective at getting their internal links indexed. And my success rate has been abysmal: perhaps 15% have ever been indexed, and so far I have never seen Google show an RSS feed as the source for a backlink. I have even thrown some token backlinks at RSS feeds to see if that helped get them indexed, but even that has a very low success rate.
I recently read a blog post saying that Google "hates RSS feeds" and "rarely spiders perhaps the first link or two." Yet there are many SEO advocates who claim that RSS feeds are a great untapped resource for SEO. I am rather befuddled.
Has anyone "cracked the code" on how to get them, and the links that they contain, indexed and helping rankings?
-
Actually, RSS feeds are also used as a defensive method of link building. Yoast makes a plugin for WordPress that everyone running WordPress should use; one of its features is inserting text and links into your RSS feed.
Obnoxious scraper sites use RSS feeds to populate their websites; they do not monitor the content, it's all automated. By putting links and a citation in your RSS feed, you at least get a little benefit from their theft of your content.
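The idea can be sketched in a few lines. Below is a minimal Python sketch of the technique (not the Yoast plugin itself, which is PHP; the site name and URLs are hypothetical): append a source citation to each feed item, so a scraper that republishes the feed verbatim still links back to the original.

```python
import xml.etree.ElementTree as ET

SITE = "https://example.com"  # hypothetical; stands in for your own domain

def build_item(title, url, summary):
    """Build one RSS <item>, appending a source citation so scrapers
    that republish the feed verbatim still link back to the original."""
    item = ET.Element("item")
    ET.SubElement(item, "title").text = title
    ET.SubElement(item, "link").text = url
    # The citation HTML is stored as text; ElementTree escapes it, and
    # feed readers (and scrapers) unescape it when rendering.
    citation = (
        f"<p>{summary}</p>"
        f'<p>The post <a href="{url}">{title}</a> '
        f'appeared first on <a href="{SITE}">{SITE}</a>.</p>'
    )
    ET.SubElement(item, "description").text = citation
    return item

item = build_item("Hello World", "https://example.com/hello-world", "A first post.")
xml = ET.tostring(item, encoding="unicode")
```

A real feed would wrap items like this in the usual `rss`/`channel` envelope; the point is only that the citation travels with every item.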
Link Explorer shows FeedBurner and a couple of other RSS aggregator sites as high-value referring sites.
-
Why would anyone need this service? I believe the original question was about getting the site owner's own RSS feeds indexed. RSS feeds should be submitted to Google Webmaster Tools to be indexed by Google, and Bing offers a similar service to webmasters. After the initial submission, the webmaster never has to submit again.
If I wanted to push my content using RSS feeds, I would use Ping.fm to push my content and links to third-party sites and social media.
I am at a loss as to why a webmaster would use the Linklicious site.
-
Really detailed overview. Nice job touching on everything.
-
If I understand the question correctly, you would like your content to be spread to other sites through RSS feeds and then indexed there with a backlink to your site?
Number 1: there must be a reason for the other site to index your content and create a backlink to your site.
Number 2: these links are almost always nofollow, and you therefore need a very high number of them to have any real effect on the SERPs.
E.g.: you submit your site to several "ping" sites of your choosing that index certain content; then, when you publish a new story, these sites get pinged by your CMS and a nofollow backlink is created for you on that site.
Just make certain that the sites you ping actually have good content and fill a purpose for their visitors.
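For what it's worth, most of those ping services speak the old weblogUpdates XML-RPC protocol, which is what the CMS sends under the hood. A minimal Python sketch (the blog name and URL are placeholders; the actual network call is commented out to avoid a live request):

```python
import xmlrpc.client

# Standard weblogUpdates.ping payload accepted by most ping services
# (e.g. Ping-O-Matic). The blog name and URL here are hypothetical.
payload = xmlrpc.client.dumps(
    ("My Example Blog", "https://example.com/"),
    methodname="weblogUpdates.ping",
)

# Actually sending it would look like this (commented out here):
# server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
# result = server.weblogUpdates.ping("My Example Blog", "https://example.com/")
```

The service responds with a flerror/message struct; a CMS typically fires this once per publish rather than on every page view.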
A better way to keep control over the material, though, is to create your own site running WordPress where you write about your primary site as a blog. Just put a news section in a sidebar and put your RSS feed in there. WordPress sites are indexed extremely fast, and since you own the site, you can choose to use follow links in that section of the blog.
This should lead to faster indexing, and you create backlinks that serve a function; furthermore, you own the site linking to your primary site.
A short summary:
RSS feeds are good to spread content and attract visitors. They're not a quick way to get backlinks.
-
We use an RSS feed for new product lists. There may be some lag time before a new product gets put into a category and becomes browsable on our site. The RSS feed gives these new products a few days' head start in getting into the search engines. We redirect all RSS links back to the main site URLs, which include canonical tags for the main product pages.
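The redirect half of that setup can be sketched as follows (the paths are hypothetical; in practice this logic would live in the web server or application routing layer): feed URLs 301 to the canonical product pages, so any equity the feed URL picks up is consolidated onto the page with the canonical tag.

```python
# Hypothetical mapping from temporary feed URLs to the canonical
# product pages they should permanently redirect to.
REDIRECTS = {
    "/feed/products/widget-123": "/products/widget-123",
}

def resolve(path):
    """Return (status, location) for a request path: 301 to the
    canonical product URL for feed links, 200 (serve as-is) otherwise."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 200, path

status, location = resolve("/feed/products/widget-123")
```

The key detail is the 301 (permanent) status, which tells search engines to credit the destination URL rather than the feed URL.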
-
RSS should be designed primarily for your users, and secondly to syndicate parts of your content (headlines and URLs) out through RSS aggregators.
Be careful about how much of the article content you include within the RSS feed itself. Whilst it is good for the user to include the full article in the feed, by doing so you are also giving scrapers an easy time reproducing your content, and you might end up being penalised for duplicate content even though you are the original source (I've seen this happen).
I've used two techniques in the past. The first was to publish a short article stub with a call to action to follow the link to the original article. I then switched to publishing the full content within the feed just for my users, but I am thinking about changing it again: publishing part of the content within the feed, with a call to action for the reader to visit my site for the full article. That will hopefully increase CTR on the feed whilst reducing the content duplication issue.
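The partial-content approach can be sketched as a small helper (the word limit and link text are my own choices, not a standard): truncate the body for the feed and append a call to action pointing back to the full article.

```python
def feed_excerpt(body, url, max_words=55):
    """Truncate an article body for the RSS feed and append a call to
    action linking to the full article on the original site. The
    55-word default mirrors a common excerpt length; adjust to taste."""
    words = body.split()
    if len(words) <= max_words:
        return body  # short posts go out in full
    excerpt = " ".join(words[:max_words])
    return f'{excerpt}… <a href="{url}">Read the full article</a>'

snippet = feed_excerpt("word " * 100, "https://example.com/post")
```

Because only an excerpt is syndicated, scrapers that republish the feed never hold a full duplicate of the article.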
-
The link-building power of RSS feeds is simply in getting other sites to feature and link to your content via RSS. There would be no utility for a bot to crawl your feed standalone; it would rather just look at the content itself. Try submitting your feed to RSS directories, or having other webmasters feature your feed on their sites. I believe several Web 2.0 sites like Squidoo allow for feed publishing as well. Hope that helps.
-
Sorry, I'm a little confused as well. Why would you want people linking to your RSS feed instead of your original posts? Why would you even want the RSS feed to be indexed and returned in search results rather than the original posts? Wouldn't Google want to link people to the original post vs. the RSS feed? Aren't RSS feeds supposed to be a feed of content already on your site? I don't see why Google would have much of an incentive to spider it or return it in search results.
-
You load links into it, and it then creates an RSS feed on their end that gets pinged. You can load any kind of link into it and it'll ping them.
-
Thanks, but Linklicious turns links into RSS feeds; it doesn't help get the RSS feeds, or their internal links, indexed as far as I know. Am I misunderstanding the service?
-
This service works well, I've personally tested it: http://linklicious.me/
Try that or another pinging service, there are a ton of them out there.
Good luck!