DA vs Relevancy - Trade Off Question
-
Hey Guys
We all know that relevancy largely trumps DA nowadays.
What I am wondering is if there is a DA 'level' at which relevancy doesn't really matter - you probably still want a backlink from that site...
For example, sites with DA of 100 we probably want backlinks from.
So where do you draw the line? In other words, for a high-DA 'non-relevant' site, what DA is 'acceptable' enough that you start to disregard relevancy? I'm thinking something like 70 and above, but I'd like some other thoughts...
Obviously you would still be building relevant links too, developing content to do so and all that good stuff. I am just wondering what DA I should focus on for building non-relevant links ALONGSIDE relevant links.
Thanks
-
I submitted a request through The Guild to find out why my original answer above was not delivered to you. Sorry about that.
As to your question about the average Moz Domain Authority: DA is made up of MozRank, MozTrust, and your link profile. Of those three, the one we can most easily quantify is MozRank. Moz actually says the average MozRank (on a 1-10 scale) for a page is about a 3.
Taking just that information, it's easier to see why the average Domain Authority sits on the lower end (30-40) rather than 50 or higher. It also explains why it is easier to move from 30 to 40 than from 70 to 80: DA is scored on a logarithmic scale, so each additional point gets harder to earn as you climb.
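To make that concrete, here's a toy model of a logarithmic 0-100 score. The numbers are purely illustrative (this is NOT Moz's actual DA formula), but it shows why the same ten-point jump costs far more underlying link growth at 70-80 than at 30-40:

```python
import math

def toy_da(raw_links: float) -> float:
    """Map a raw link metric onto a 0-100 logarithmic scale.
    Illustrative only -- not Moz's real DA formula."""
    return min(100.0, 20.0 * math.log10(max(raw_links, 1.0)))

def raw_needed(da: float) -> float:
    """Inverse of toy_da: raw link units needed to reach a given DA."""
    return 10 ** (da / 20.0)

# Cost of a ten-point jump at the bottom vs. the top of the scale:
low_jump = raw_needed(40) - raw_needed(30)    # ~68 raw units
high_jump = raw_needed(80) - raw_needed(70)   # ~6,838 raw units
```

On this toy scale, each successive ten-point jump costs roughly 3x the raw growth of the one before it, so moving 70-80 takes about 100x the growth of moving 30-40 — which matches the intuition above.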
Good luck with the above, let me know if I can provide any other help.
-
Weird - I didn't get anything over there. I just checked and it's still not on the thread or anywhere I can see...
I posted it on a few forums because I wanted to see what the response was like, as a sort of 'test' of which forum I should spend time on.
Anyway, thanks for the response. I was not aware that so many sites sat below 40 - I really had no reference point for what counts as an above-average DA (I would naturally have assumed above 50!).
I will check out your articles - thanks again!
-
Hi Michael,
You submitted this same exact question over at The Guild (SEO Chat). Did you receive that answer?
Here's that answer again, including links to the subscriber-only materials you have access to. Let me know if you can't access anything or if you have any clarifying questions:
First, if you haven't already seen our article this month on Domain Authority, please review it here:
https://guild.seochat.com/se-news/content/looking-to-boost-your-domain-authority-here-are-some-tips
As for your question, honestly, there is no right answer. Speaking as a link builder myself, I'm ALWAYS going to choose a lower-DA, niche-related link target over a higher-DA site that is not relevant. That's a qualified linking target. Likewise, I would always choose a link on a page that is actually going to get clicked over a link on a higher-DA page that sends me NO traffic.
So there really isn't a quantifiable answer to your question. If you want to use DA 70 as your make-or-break number, awesome. But considering that the VAST majority of sites have a DA score of 40 or less, 70 seems awfully high to use as a preconceived standard for evaluation.
Make sure to read my detailed article on link evaluation here:
When I'm evaluating a link, I honestly don't review DA until the end. And even then, if a link isn't going to get clicked or send traffic, to me that is a HUGE deal and I devalue the link immediately.
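One way to encode that prioritization is a rough scoring sketch. The weights below are made up for illustration (not any real Moz or industry formula): relevance and real traffic dominate, DA acts only as a tie-breaker, and a zero-traffic link is heavily discounted no matter its DA:

```python
def link_value(da: int, niche_relevant: bool, monthly_referral_clicks: int) -> float:
    """Toy link-prioritization heuristic with made-up weights.
    Relevance and traffic dominate; DA is a minor tie-breaker."""
    score = da / 100.0                      # DA contributes at most 1 point
    if niche_relevant:
        score += 2.0                        # relevance outweighs any DA gap
    if monthly_referral_clicks > 0:
        score += min(monthly_referral_clicks / 50.0, 2.0)  # capped traffic bonus
    else:
        score *= 0.25                       # zero-traffic link: heavy discount
    return score

# A relevant DA-35 page that sends clicks vs. an irrelevant DA-90 page that doesn't:
relevant_low_da = link_value(35, True, 40)      # 0.35 + 2.0 + 0.8 = 3.15
irrelevant_high_da = link_value(90, False, 0)   # 0.9 * 0.25 = 0.225
```

Under this sketch, the relevant lower-DA page that actually sends visitors scores far above the irrelevant high-DA page that sends none, which is exactly the evaluation order described above.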
Hope that helps. Thanks for the question, and definitely respond (either here or at The Guild) if you need more information.