Is using Outbrain legal in Google's eyes?
-
I have been using Outbrain and it drives about 1K visitors each day for $50. Is this something good?
-
As long as that 1K of traffic is engaging strongly with your site, then yes, it is good. Alan details the nofollow attributes.
-
Whether Outbrain is "legal" according to Google isn't really a single yes-or-no consideration.
If the widget's links are implemented as direct links (no redirect) and are not nofollowed, then that is in violation of Google's policies on paid links. That shouldn't happen with a standard implementation, though.
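As a purely illustrative sketch (the `promoted-widget` container class is hypothetical, not Outbrain's actual markup), a page audit along these lines could flag widget links that are missing a `rel="nofollow"` or `rel="sponsored"` attribute:

```python
from html.parser import HTMLParser

# Illustrative sketch only: "promoted-widget" is a hypothetical container
# class, not Outbrain's real markup. Flags widget links that are direct
# followed links, i.e. missing rel="nofollow" and rel="sponsored".
class WidgetLinkAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.depth = 0     # nesting depth inside the widget container
        self.flagged = []  # hrefs of followed (policy-risky) links

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "div":
            if self.depth:
                self.depth += 1
            elif "promoted-widget" in attrs.get("class", ""):
                self.depth = 1
        elif tag == "a" and self.depth:
            rel = (attrs.get("rel") or "").split()
            if "nofollow" not in rel and "sponsored" not in rel:
                self.flagged.append(attrs.get("href"))

    def handle_endtag(self, tag):
        if tag == "div" and self.depth:
            self.depth -= 1

html = """
<div class="promoted-widget">
  <a href="https://ads.example/a" rel="nofollow">ok</a>
  <a href="https://ads.example/b">followed, gets flagged</a>
</div>
<a href="/internal">regular site link, ignored</a>
"""
auditor = WidgetLinkAuditor()
auditor.feed(html)
print(auditor.flagged)  # → ['https://ads.example/b']
```

Links outside the widget container are left alone, since the policy concern is specifically paid placements passing link equity.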
Where the problem becomes more complex is in how Google's algorithms might process a site that uses Outbrain, Taboola, or other similar services, with the end result that the site's ranking signals decline.
Several scenarios exist that can cause this.
1. Third-party "here's a bunch of links to other places" widgets often add heavy page-processing delays, especially when the code is bloated, or when several server calls go out to that third party's network (often to multiple different servers in that network) and bottlenecks arise across the web ecosystem.
2. Third-party widgets of this type can make it much more difficult for search algorithms to separate the on-site content (both visible content and unseen code) from the third-party, irrelevant, and often garbage-quality content inside those widgets. This doesn't always happen, yet it can, and it sometimes causes topical confusion and dilutes the page's topical focus.
3. Some users click on third-party widget links of this type, yet many others hate them, finding them insulting and downright obnoxious when the quality of the links, and the images they push in the user's face, is grotesque or borderline pornographic. That can weaken overall user experience and, with it, site quality and trust signals.
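To make the latency point in item 1 concrete with a toy sketch (simulated delays standing in for network round trips, not real widget servers): when a widget's calls run serially, their delays compound, whereas calls made concurrently cost roughly only the slowest single call.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Toy simulation: each "server call" into the widget network is stubbed
# with a sleep. Real third-party widgets often chain several such calls.
CALL_DELAYS = [0.05, 0.05, 0.05]  # three calls to the widget network

def widget_call(delay):
    time.sleep(delay)  # stand-in for one network round trip
    return delay

# Serial: delays compound, blocking the page for their sum (~0.15s here).
start = time.perf_counter()
for d in CALL_DELAYS:
    widget_call(d)
serial = time.perf_counter() - start

# Concurrent: roughly the slowest single call (~0.05s here).
start = time.perf_counter()
with ThreadPoolExecutor() as pool:
    list(pool.map(widget_call, CALL_DELAYS))
concurrent = time.perf_counter() - start

print(f"serial ~{serial:.2f}s, concurrent ~{concurrent:.2f}s")
```

The same arithmetic is why bloated widget code that fans out across multiple servers can noticeably delay a page.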
Outbrain and Taboola are among the leading causes of ad blocking becoming a major problem for publishers and their revenue. The lowest-quality ads, especially those disguised as "related content", make geeks, nerds, and intellectual site visitors boiling mad. In some ways they aren't as obnoxious as auto-play video ads or fly-over ads that block reading, yet in quality terms they are much worse. If the advertising industry doesn't clean up its act on quality, and if publishers don't do the same, the battle is only going to grow.
Related Questions
-
Good to use disallow or noindex for these?
Hello everyone, I am reaching out to seek your expert advice on a few technical SEO aspects related to my website. I highly value your expertise in this field and would greatly appreciate your insights.
Technical SEO | williamhuynh
Below are the specific areas I would like to discuss:

a. Double and triple filter pages: I have identified certain URLs on my website that have a canonical tag pointing to the main /quick-ship page. These URLs are as follows:
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black+fabric
Considering the need to optimize my crawl budget, would it be advisable to disallow or noindex these pages? My understanding is that by disallowing or noindexing these URLs, search engines can avoid wasting resources on crawling and indexing duplicate or filtered content.

b. Page URLs with parameters: Some of my page URLs include parameters such as ?variant and ?limit. Although these URLs already have canonical tags in place, is it still recommended to disallow or noindex them to further conserve crawl budget? My understanding is that doing so prevents search engines from spending resources on redundant variations of the same content.

Additionally, I would welcome any suggestions on internal linking strategies tailored to my site's structure and content. Thank you in advance for your time and expertise. If you require any further information or clarification, please let me know. Cheers!
-
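As a hedged sketch of the mechanics being weighed above: a robots.txt `Disallow` blocks crawling by URL-path prefix, while a `<meta name="robots" content="noindex">` tag allows crawling but blocks indexing. Python's standard-library `urllib.robotparser` can show what a given `Disallow` rule would actually block; the rule below is only a hypothetical applied to the URLs from the question, not a recommendation:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rule for the filtered URLs in the question.
# Disallow matches by URL-path prefix.
rules = [
    "User-agent: *",
    "Disallow: /collections/lounge-chairs/quick-ship+",
]
rp = RobotFileParser()
rp.parse(rules)

blocked = "https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black"
allowed = "https://www.interiorsecrets.com.au/collections/lounge-chairs"

print(rp.can_fetch("*", blocked))  # False: crawl blocked by the prefix rule
print(rp.can_fetch("*", allowed))  # True: the category page stays crawlable
```

One caveat worth weighing: a URL disallowed in robots.txt can't be crawled at all, so crawlers can't see any canonical or noindex signal on that page.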
My Website stopped being in the Google Index
Hi there, my website is two weeks old. I published it and it was ranking at about page 10 or 11 for a week, maybe a bit longer. The last few days it dropped off the rankings, which I assumed was the Google algorithm doing its thing, but when I checked Google Search Console it said my domain is not in the index: 'This page is not in the index, but not because of an error. See the details below to learn why it wasn't indexed.' I click Request Indexing, and after a bit it goes green, saying it was successfully indexed. Then when I refresh, it gives me the same message again. Not sure why; any ideas or help are appreciated, cheers.
Technical SEO | sydneygardening
-
Should I use the Google disavow tool?
Hi, I'm a bit new to SEO and am looking for some guidance. Although there is no indication in Webmaster Tools that my site is being penalised for bad links, I have noticed over 200 spam links for "payday loans" pointing to my site (due to a hack on my site several years ago). So my question is twofold. First, is it normal to have spammy links pointing to your site? Second, should I bother to do anything about it? I did some research into the Disavow tool in Webmaster Tools and wonder whether I should use it to block all these links. Thanks
Technical SEO | hotchilidamo
-
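For reference, a disavow file is just a plain-text list uploaded through Google's disavow links tool: one URL per line, or a `domain:` line to disavow a whole domain, with `#` comments allowed. The domains below are placeholders, not real spam sources:

```text
# Spammy payday-loan links pointed at the site after a hack
# (spammy-example.com / spam.example.net are placeholder domains)
domain:spammy-example.com
https://spam.example.net/payday-loans/page1.html
```

Disavowing a whole domain is generally used when every link from it is unwanted, while single-URL lines target individual pages.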
Dev Site Was Indexed By Google
Two of our dev sites (subdomains) were indexed by Google. They have since been made private, now that we've found the problem. Should we take another step to remove the subdomains through robots.txt, or just let it ride out? From what I understand, to remove a subdomain from Google we would verify the subdomain in GWT, then give the subdomain its own robots.txt and disallow everything. Any advice is welcome; I just wanted to discuss this before making a decision.
Technical SEO | ntsupply
-
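The disallow-everything robots.txt described above is only two lines, served at the root of the dev subdomain (the hypothetical dev.example.com/robots.txt):

```text
User-agent: *
Disallow: /
```

Worth noting: robots.txt stops crawling, but URLs already in the index may linger until they are dropped or explicitly removed.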
When do you use 'Fetch as Google' in Google Webmaster?
Hi, I was wondering when and how often you use 'Fetch as Google' in Google Webmaster Tools, and whether you submit individual pages or the main URL only? I've googled it but only got more confused. I'd appreciate any help. Thanks
Technical SEO | Rubix
-
How to know which pages are indexed by Google?
So apparently we have some sites that are just duplicates of our main site, aimed at different markets/cities. They have completely different URLs but the same content as our main site, with the market/city changed. How do I know for sure which ones are indexed? I enter the URL into Google and it's not there, even if I put quotation marks around it. Is there another way to query Google for my site? Is there a website that will tell you which ones are indexed? This is probably a dumb question.
Technical SEO | greenhornet77
-
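One standard way to check is Google's `site:` search operator, shown here with a placeholder domain:

```text
site:example.com            # pages Google has indexed for this domain
site:example.com/some-page  # check whether one specific URL is indexed
```

Running the query for each duplicate site's domain shows which of them Google currently has in its index.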
Change in how Google displays SERPs
Hi all, recently our SERPs in Google results have changed to show product prices from our pages rather than the meta description. This started happening in November with no change (that we know of) on our side. I have attached before-and-after SERP images if that helps. Does anyone have any ideas, as it's starting to affect our rankings? Thanks, Tony.
Technical SEO | tstauntonwri
-
How to disallow Google and Roger?
Hey guys and girls, I have a question. I want to disallow all robots from accessing a certain link pattern:

# Get rid of bots
User-agent: *
Disallow: /index.php?_a=login&redir=/index.php?_a=tellafriend%26productId=*

Will this stop bots from accessing any link that has the prefix shown before the asterisk? And will at least Google and Roger comply, since they read "User-agent: *"? I know this isn't the standard procedure, but if it works for Google and the SEOmoz bot, we're good.
Technical SEO | iFix
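Python's built-in `urllib.robotparser` does simple prefix matching, but Google-style matching, where `*` matches any run of characters, can be sketched by translating the rule into a regex. This is an illustrative sketch of the matching behavior under that assumption, not a guarantee of what any particular crawler does:

```python
import re

def rule_matches(disallow_path: str, url_path: str) -> bool:
    """Google-style robots.txt matching: '*' matches any characters,
    a trailing '$' anchors the end; otherwise the rule is a prefix match."""
    pattern = re.escape(disallow_path).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, url_path) is not None

# The Disallow rule from the question above.
rule = "/index.php?_a=login&redir=/index.php?_a=tellafriend%26productId=*"

# Any URL carrying the full prefix before the '*' is blocked.
print(rule_matches(rule, rule[:-1] + "123"))      # True
# A shorter URL that lacks the full prefix is not.
print(rule_matches(rule, "/index.php?_a=login"))  # False
```

So under wildcard-aware matching, the rule blocks exactly the links that share the prefix before the asterisk, and `User-agent: *` applies it to all compliant bots.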