Is pulling automated news feeds on my home page a bad thing?
-
I am in charge of a portal that relies on third-party content for its news feeds. The third party in this case is a renowned news agency in the United Kingdom.
After the Panda and Penguin updates, will these feeds end up hurting my search engine rankings? FYI: these feeds occupy only 20 percent of the content on my domain. The rest of the content is original.
-
So what do you suggest I do in this scenario, Brent? What's the right thing to do?
-
Hmm..
In this case, can I say that sites crawled more frequently by Googlebot might have an unfair advantage?
In the sense that, if they were to scrape or syndicate other sites' content, Google might crawl and find the content on their site first (since they are crawled more frequently) and label them as the original, while the actual content creator gets labelled as the duplicate (if Google finds the content on the creator's site afterwards)?
-
The first indexed version means:
1. When you publish an original article and Google crawls it for the first time, that is the "first indexed version". If another site picks up the content after it is already on your site, their copy is treated as duplicate content.
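Not something this thread spells out, but the standard way for a syndication partner to avoid being treated as the "first indexed version" at the originator's expense is a cross-domain canonical tag (the URL here is hypothetical):

```html
<!-- Placed in the <head> of the syndicated copy, pointing back to the original publisher -->
<link rel="canonical" href="https://original-publisher.example/story-slug">
```

Google treats this as a strong hint rather than a directive, so there is no guarantee, but it is the usual thing to ask syndication partners to add.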
-
Could you explain a little bit more about what "first indexed version" means?
-
Ideally you want to have unique content on your website.
That is going to work best all of the time.
With news websites it becomes more complex. If you carry wire content or AAP content, Google will treat the first indexed version as the most trustworthy version of the copy. Google may tolerate syndicated content in the sense that, if it appears on only ten or so high-quality websites, it is going to be OK, but at the end of the day Google is still going to favour original content day in, day out. The only real benefit of syndicated content is that it serves businesses which may not have the time to produce content themselves.
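As a quick, hedged sketch (the function names and URL are made up for illustration, not from this thread): if you syndicate your articles out, you can spot-check whether a partner's copy declares a canonical back to you using only the Python standard library:

```python
from html.parser import HTMLParser


class CanonicalParser(HTMLParser):
    """Collects the href of the first <link rel="canonical"> found in a page."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            a = dict(attrs)
            if a.get("rel", "").lower() == "canonical":
                self.canonical = a.get("href")


def find_canonical(html: str):
    """Return the canonical URL declared in an HTML document, or None."""
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical


# Example: a syndicated copy crediting the original publisher
page = '<html><head><link rel="canonical" href="https://original-site.example/story"></head></html>'
print(find_canonical(page))  # -> https://original-site.example/story
```

If the partner page returns `None`, your own version is competing head-on with theirs for the "first indexed" slot.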
I hope this helps.
Kind Regards,
James.