Linking C blocks strategy - Which hat is this tactic?
-
This relates to a previous question I had about satellite sites, where I questioned the white-hat-ness of their strategy. Basically, to increase the number of linking C blocks, they created 100+ websites on different C blocks that link back to our main domain. The issues I see are that:
- the sites are 98% identical in appearance and content; only a small paragraph differs on each homepage
- the sites only have outbound links to our main domain, and no inbound links
Is this legit? I am not an SEO expert, but I have received awesome advice here. So thank you in advance!
-
Thank you Robert! Let me try your suggestions and then I will report back.
-
C3,
One of the things I would suggest is to start by defining success using KPIs, analytics, etc. Did the engagement spell out what they were to accomplish? Get a baseline of where the site was prior to the new company coming on board. When did the changes take place (were they annotated in GA on the dates they occurred)? What has the result been since then? What else was done during that period? Now you have a starting point.
Next, I would suggest you get the lower-cost Ahrefs membership (even if only for a month) and run your site through it. You will have a near-complete list of links to the site. Where are the 100 sites within this? How do they compare to the other links coming to the site? Also, look at the microsites and see if your site is the only one being linked to. Remember, if a page carries your link and another, it gave half the value of the link away.
If this was the key strategy, when was it implemented and what has changed since then? Remember that data is your friend. With our clients we are careful to get a baseline, talk about the issues they are facing, delineate potential risks, etc. With these sites, run them through Copyscape and see whether even the "unique" content is unique. Did you pay for unique content?
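Robert's Copyscape check boils down to measuring how similar the satellite pages really are. A crude stand-in for a paid duplicate-content tool is Jaccard similarity over word shingles; the sample texts below are hypothetical placeholders for two satellite homepages, not real site content.

```python
# Rough duplicate-content check: Jaccard similarity over word 5-grams
# ("shingles"). A score near 1.0 means the "unique" paragraph is not
# unique at all. The two sample pages are invented for illustration.

def shingles(text, n=5):
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page_a = "we sell quality electronics online with fast shipping and great prices every day"
page_b = "we sell quality electronics online with fast shipping and low prices every day"

score = jaccard(page_a, page_b)
print(round(score, 2))
```

Against a real scrape of all 100+ homepages, pairwise scores like this would show in one table whether the network is the near-duplicate farm it appears to be.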
Next, I would run the site through a Moz campaign and see what I see. I would check GWMT to see whether the linking sites show up there, and how many new pages have been indexed since. If someone is claiming this linking strategy is key while you have duplicate meta descriptions, duplicate title tags, no H1s, etc. (run the site through Xenu and you will have all of that and more), you can find a dozen places where SEOs say that if you do not get the on-page basics right, there is no reason to do the rest.
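The on-page issues Robert lists (duplicate titles, missing H1s) are exactly what a Xenu-style crawl surfaces. A minimal sketch of that audit, using only the standard library and hypothetical HTML snippets standing in for fetched pages:

```python
# Minimal on-page audit in the spirit of a Xenu/Moz crawl: flag pages
# with no H1 and <title> tags duplicated across pages. The two sample
# pages are invented placeholders, not real satellite-site markup.
from collections import Counter
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.h1_count = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self.h1_count += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data.strip()

def audit(pages):
    parsed = []
    for html in pages:
        p = PageAudit()
        p.feed(html)
        parsed.append(p)
    titles = Counter(p.title for p in parsed)
    issues = []
    for p in parsed:
        if p.h1_count == 0:
            issues.append(("missing-h1", p.title))
        if titles[p.title] > 1:
            issues.append(("duplicate-title", p.title))
    return issues

pages = [
    "<html><head><title>Buy Widgets</title></head><body><p>no heading</p></body></html>",
    "<html><head><title>Buy Widgets</title></head><body><h1>Widgets</h1></body></html>",
]
print(audit(pages))
```

A real crawler adds fetching and link-following, but the point stands: if a five-minute check like this turns up sitewide duplicates, the "key strategy" argument falls apart before you even get to the links.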
So, the data will be your friend if you want to show whether or not this is working. Hey, if it is working, let us know how, and maybe we will all say they were right and I was wrong.
Best,
Robert
-
Don't worry about any "major damage to our domain authority". Those sites/links as you described aren't helping any and in light of a potential penalty, you're better off removing them.
-
Hi Robert,
I appreciate you getting involved! According to our SEO provider, this tactic is a major part of their strategy and the reason for the site's success. When I asked them to disable the sites, they said we would for sure see "major damage to our domain authority".
The other issue is that they don't actually spend any time on these sites. They haven't been updated or touched in 7 months. The blog posts and the single "unique" paragraph per site have remained the same. In fact, the blog posts are exactly the same on all sites, basically scraped. However, they bill us for these sites because they are supposedly required for our SEO success.
My challenge has been questioning their strategy when I am not an expert and they are supposed to be. And yes, they speak as if this tactic is unicorn dust.
-
If you haven't done any link building to those sites, they are pretty much worthless. G knows about this strategy and, best-case scenario, ignores them. DA is irrelevant to rankings. I can show you many sites with amazing DA but shit rankings because of penalties and crappy links.
Opportunity cost: 100 domains @ $10/yr + 100 IPs @ $20/yr = $3k in yearly savings. You can easily put that money to better use.
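Oleg's back-of-envelope math, spelled out. The $10/domain and $20/IP figures are his assumptions, not quoted prices:

```python
# Yearly carrying cost of the satellite network, using Oleg's assumed
# per-unit prices (hypothetical, not actual registrar/host quotes).
domains = 100
domain_cost = 10   # $/yr per domain registration
ip_cost = 20       # $/yr per dedicated IP on a separate C block

yearly = domains * domain_cost + domains * ip_cost
print(yearly)  # 3000, i.e. ~$3k/yr freed up for real content and outreach
```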
-
Heh, heh. Does ring a bell, doesn't it, Robert?
I'd de-link stat, before Google banishes my site and ignores my reconsideration requests.
-
C3
You have some good responses, but this is another of those threads where it is hard to sit on the sidelines. I have to ask a few different questions in a situation like this. First, forget what they did re the C blocks: what was the desired result they were seeking? What was the plan (with rationale) to achieve that result? And, no matter the answers, what percentage of optimization/ranking do they or their client believe is related to linking?
So, do they really spend this much effort on a 20-30% factor? And remember, this is not effort around bringing in quality links; it is effort around linking as if that were the Holy Grail of SEO. Given the time spent, the opportunity cost, the actual cost to the client, etc., is this 80%-plus of the SEO effort? I would be surprised if it wasn't. Usually when I come across this kind of thing, the "SEO" firm doing it is selling it as some sort of silver-bullet SEO: they have discovered a secret way to sprinkle unicorn dust on the algorithm, etc.
To me and in my opinion, it is not white hat, grey hat, or black hat with sequins. It is just a waste of time and energy; it is highly inefficient. Are they saying they can do more with this strategy than, say, the people on this forum with an actual strategy? If you are worrying about whether linking via multiple C blocks from EMDs you own can somehow benefit some site, I think you are looking at SEO from a very odd perspective (not you specifically; I mean the global "you", anyone who does this). Interesting approach.
Best
-
C3,
Let's see... if those sites have no inbound links, what value are they to the main domain? If they have no inbound links, how is Google going to find them? If you submit the URLs to Google, Google will see 100 new sites that were all registered at the same time (and maybe to the same owner), all with the same content, and all with links only to your site.
This attempt at manipulation is very easy for google to recognize and you're putting your main site in jeopardy by following this tactic.
-
Sorry, I just re-read my response. I wasn't trying to be condescending with the first line. I was actually trying to clarify who initiated the tactic. Thanks!
-
SEJunkie,
To clarify, the SEO provider did this. But yes: 100+ exact-match URLs, all on different C block IPs, but with mostly the same content. Navigational links from these sites point to sections of our main site, e.g. "Electronics" on a satellite site links to "Electronics" on our main site.
There is a paragraph below the fold on each homepage that is unique to each site, but that is the only differing piece of content. The rest of the content, including the blog posts, is exactly the same.
-
Hi Eric,
Just to clarify: you have purchased 100+ domain names and created 100+ near-duplicate websites, hosted on 100+ different C block IPs? I would lean towards thinking it's a little bit on the black-hat side of the fence. With no backlinks, these sites are offering no Domain Authority to your site. They may still, however, be passing some link juice. You need to test the effectiveness of the links in order to decide whether to keep or remove them. If you find the links are passing some value, I wouldn't remove them. I suggest developing the sites into something more over time. You don't need to update them regularly; just develop something decent as a content centerpiece and move on to the next, and before you know it you'll have your own network.
-
Oleg,
So what's the best course of action? Building strong content for each of these sites (100+) would be an enormous task, but disabling them would kill the number of linking domains, which I assume would lower our DA in a hurry.
We actually didn't ask for or want the sites, because we don't have the resources to develop content for so many of them. The SEO insisted and put the sites up for "free" as part of their strategy. Yet they haven't developed any new content for these sites in over 7 months.
Seems like it was a mistake from the beginning to do this.
Thanks,
Eric
-
This used to work; now it's a waste of time that will most likely get you penalized.
You are better off using that time and those resources to develop a strong piece of content and build links to it from authoritative sites.
Cheers,
Oleg