When clients slap 3rd party banners on their website...
-
Ciao from Latitude 53.92705600 Longitude -1.38481600
I've got a naughty cluster of clients who are slapping third-party banner ads on their home pages for reasons that only marketing executives understand. So here I am on the SEO side: I've added target="_blank" and nofollow on the rogue banners, but I'm looking for a plausible argument to talk a client out of doing this.
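For reference, a banner link neutralized the way described above might look something like this (the advertiser URL, image path, and alt text are made up for illustration):

```html
<!-- Third-party banner: opens in a new tab, passes no link equity -->
<a href="http://example-advertiser.com/" target="_blank" rel="nofollow">
  <img src="/banners/example-advertiser.gif" alt="Example advertiser banner">
</a>
```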
So my question, please, is:
"What is the number 1 reason why a client should not place third-party banners pointing to non-relevant sites (e.g. a website focused on furniture placing a bed fred banner on its home page)?"
Let the games begin!
Ciao,
David -
Looks like nofollow ads to me.
Nothing wrong with it. I show ads on almost every page that I have published.
Rankings are great so I think that it is good for SEO.
Almost every large commercial site on the web shows ads.
-
Hi Guys,
Thanks for the answers thus far. To help out, this is the offending site:
http://www.thirskracecourse.net/
Many thanks,
David -
There might be an underlying business model (e.g. paid advertising or an affiliate scheme) for displaying third-party banners. As long as you nofollow the banner links there is no SEO damage, and you will "only" suffer the churn from customers who click away.
On a technical level you can implement Analytics click-tracking on the banners, show your clients whether the click-through rate is high or low, and then argue for or against the banners accordingly.
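A minimal sketch of what that click-tracking might look like with Google Analytics' gtag.js. The `.banner-ad` selector and the event names are assumptions for illustration, not anything from the original post:

```javascript
// Build the arguments gtag() would receive for a banner click.
// Event name and category are illustrative, not a GA convention.
function bannerClickEvent(href) {
  return ['event', 'banner_click', {
    event_category: 'third_party_banner',
    event_label: href
  }];
}

// Attach a click handler to every link inside a banner container.
// Assumes gtag.js is already loaded; pass window.gtag as `send`.
function trackBannerClicks(doc, send) {
  doc.querySelectorAll('.banner-ad a').forEach(function (link) {
    link.addEventListener('click', function () {
      send.apply(null, bannerClickEvent(link.href));
    });
  });
}
```

Once the events are flowing, the Analytics reports show whether anyone actually clicks the banners, which is exactly the number the client needs to see.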
-
...and possibly draw visitors away from your site and your conversion goals (unless your goal is to earn revenue from advertising....)
The main problem is the damage they can do to credibility/quality, as Chris says.
-
They lower the visitor's perception of the quality of the site and the value they may get out of it.
Related Questions
-
What is the best strategy for dissolving an innocently created link network with over 100 websites?
Hello Moz Community, Over many years 120 websites were created all under a couple different organizations around the globe. The sites are interconnected via anchor text and domain name links and some redirect to larger sites. The teachings have a central theme and many tools, training programs, events, locations and services are offered on many different websites. Attached is a slice of a Majestic Link Graph showing the network. God bless Majestic for this new tool! We are looking for solutions that are efficient and effective in regards to usability, rankings and being achievable. Thank you so much for your help! Donna
White Hat / Black Hat SEO | Awakening-Mind -
Do links from top websites' forums help in terms of backlinks?
If we get backlinks from the discussion forums of top websites, like the WordPress and Joomla forums, do they count as valid, authority-improving backlinks? I mean dofollow links.
White Hat / Black Hat SEO | vtmoz -
Competitor is interlinking between his websites
I have a competitor who ranks on the first page for all his keywords, and I found out in Open Site Explorer that he has been interlinking between his websites. It is obvious because he owns the same domain under different Asian country TLDs, for example: www.example.id (Indonesia), www.example.my (Malaysia), www.example.sg (Singapore). My question here: is this even considered "white hat"? I read one of the blog posts from Moz, and here is the quote: "#7 - Uniqueness of Source + Target. The engines have a number of ways to judge and predict ownership and relationships between websites. These can include (but are certainly not limited to):
- A large number of shared, reciprocated links
- Domain registration data
- Shared hosting IP address or IP address C-blocks
- Public acquisition/relationship information
- Publicized marketing agreements that can be machine-read and interpreted
If the engines determine that a pre-existing relationship of some kind could inhibit the "editorial" quality of a link passing between two sites, they may choose to discount or even ignore these. Anecdotal evidence that links shared between "networks" of websites pass little value (particularly the classic SEO strategy of "sitewide" links) is one point many in the organic search field point to on this topic." Will interlinking between your sites be ignored by Google in the future? Is this a time-bomb method, or is it fine? Because as far as I can tell, my competitor has been ranking on the first page for quite some time.
White Hat / Black Hat SEO | andzon -
Bad for SEO to have two very similar websites on the same server?
Is it bad for SEO to have two very similar sites on the same server? What's the best way to set this up?
White Hat / Black Hat SEO | WebServiceConsulting.com -
How do I report multiple duplicated websites designed to manipulate SERPs?
Ok, so within one of my client's sectors it has become clear that someone is trying to manipulate the SERPs by registering tons of keyword-targeted domains. All of the websites are simply duplications of one another and are merely set up to dominate the SERP listings, which, at the moment, they are beginning to do. None of the sites have any real authority (in some cases 1 PA and DA), and yet they're ranking above much more established websites. The only backlinks they have are dodgy-looking forum ones. It's all a bit crazy and it shouldn't be happening. Anyway, all of the domains have been registered by the same person within a two-month period of each other. What do you guys think is the best way to report these websites to Google?
White Hat / Black Hat SEO | Webrevolve -
Why do websites use different URLs for mobile and desktop?
Although Google and Bing have recommended that the same URL be used to serve desktop and mobile websites, portals like Airbnb use different URLs for mobile and web users. Does anyone know why this is done even though it is not good for SEO?
White Hat / Black Hat SEO | razasaeed -
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO-friendly, because all the content is served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this being "safe". Read about LinkedIn switching over to Dust.js: http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more Explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino." Basically what would be happening on the backend of our site is that we would detect the user-agent of all traffic, and once we found a search bot, serve our pages server-side instead of client-side so the bots can index our site. Server-side and client-side content will be identical, and there will be NO black-hat cloaking going on. But this technique is cloaking, right? From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page.
When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable." Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355 Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side fallbacks are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
White Hat / Black Hat SEO | Bodybuilding.com -
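As a side note on the user-agent detection that question describes, a minimal sketch might look like the following. The bot list is illustrative and far from complete, and user-agent strings are trivially spoofed, so real crawler verification should also use reverse-DNS lookup:

```javascript
// Crude search-engine crawler check by User-Agent substring.
// Illustrative only: the pattern is incomplete and UAs can be faked.
var BOT_PATTERN = /googlebot|bingbot|yandexbot|baiduspider|slurp/i;

function isSearchBot(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

// A server using dust.js could branch on this check: render templates
// server-side for bots, ship client-side templates to everyone else.
```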
HELP! My client got hit by a DDoS attack! Need advice
Here's the setup: the server is hosted in-house. It got hit by a DDoS from 20+ IP addresses spoofed from different countries. Our server overloaded and stopped working. The URL is registered at GoDaddy. We signed up at Dreamhost and pointed DNS to Dreamhost successfully. The attacks kept coming and messed up other sites on the Dreamhost shared server. We didn't know we were being followed at first; we originally thought they were attacking the IP address of our in-house server. Dreamhost noticed the attack, put us on a separate IP, and disabled our URL until the attacks 'stopped'. MY QUESTION IS: What do I do if they don't stop? Close shop? 99% of the business is internet-driven. This has to be the blackest black-hat SEO ever.
White Hat / Black Hat SEO | Francisco_Meza