When to stop link building because page authority is low - Open Site Explorer
-
Hi,
I'm link building with Open Site Explorer. I'm really picky in get links from only high quality sites.
When do you stop going down the list of possible backlink providers because the page authority is too low.
I usually stop at 40, but what do you do, why, and what does it depend on?
-
Lots of people come to these forums looking for numbers and formulas.
If the link is going to be on what looks like a quality site, then it doesn't matter what the page authority is...
...and if the site has a high page authority but looks like a crappy, manipulative site, then why would you want a link on it?
-
Disregard page strength, acquire links.
...as long as the page is relevant. The fact that cheaper forms of SEO like article marketing are still effective proves that poor page strength shouldn't sap your efforts so quickly. I use page strength more as an indicator of where on my priority list to put acquiring a link, if possible, and assuming the page is relevant.
I understand not wanting to be associated with bad neighborhoods, but page strength is not a strong indicator of a bad neighborhood. One common SEO tactic is to create contests, correct? One recent contest I developed required participants to write a review of a product, and naturally that meant linking to the client's website. Most of these reviews came from personal blogs and general websites that did not have a lot of domain authority, yet it was still a boost to rankings and domain authority.
Use your own judgement by looking at a website and determining whether you want to be associated with it. Don't pick and choose based on website popularity alone.
-
Someone else may be able to give you an 'absolute stopping point', but I really play it by ear: if something looks relevant and good, then I'll try to get a link.
I obviously wouldn't put as much effort into getting a link with a lower PA/DA as into one with a higher PA/DA. If I were to set a lower limit, I'd always use domain authority rather than page authority, as to me it is a much better indicator, especially if you're looking for a link anywhere on the site, not just on the same page as your competitor's link.
If I were to use a lower limit, I'd probably stick around the 40 mark for domain authority to be worth putting time and effort into getting a link, but I would still consider a lower DA if the link was relevant and fairly easy to get.
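The heuristic above could be sketched as a small script. This is only a rough illustration: the prospect list, field names, and threshold are invented for the example, not output from Moz or OSE.

```python
# Sketch of the prioritization described above: filter link prospects by
# domain authority with a soft threshold, keeping lower-DA sites only when
# they are relevant and easy to reach, then work the strongest first.
# All data below is hypothetical.

DA_THRESHOLD = 40

prospects = [
    {"domain": "example-blog.com", "da": 55, "relevant": True,  "easy": False},
    {"domain": "niche-forum.net",  "da": 28, "relevant": True,  "easy": True},
    {"domain": "link-farm.biz",    "da": 62, "relevant": False, "easy": True},
    {"domain": "tiny-site.org",    "da": 15, "relevant": False, "easy": True},
]

def worth_pursuing(p):
    if not p["relevant"]:           # never chase irrelevant links
        return False
    if p["da"] >= DA_THRESHOLD:     # strong domains are always worth effort
        return True
    return p["easy"]                # below the bar, only if cheap to get

shortlist = sorted(
    (p for p in prospects if worth_pursuing(p)),
    key=lambda p: p["da"],
    reverse=True,                   # spend effort on the strongest first
)

for p in shortlist:
    print(p["domain"], p["da"])
```

Under this sketch the irrelevant high-DA site is dropped entirely, while the relevant low-DA site survives only because it's easy to get.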
-
Good response. That helps. Yes, I will use gut instinct. There's one thing, though: I don't want any new sites. One of my requirements is age. Does that change your response? Also, where do you absolutely stop? PA 25? PA 20? I'd like to have a stopping place to rely on.
-
A link can still be good regardless of page authority, simply because many other ranking factors determine the quality of a site, and those simply aren't available to OSE. A spammy directory could easily sit much higher in the OSE listings than a good-quality, content-rich site further down that's just lacking links.
I would use a combination of page authority and gut instinct. 40 is probably a good level to use as a marker; however, there may be some newish sites or good content sites further down that you can make a judgement on, it's just that they may be fewer and farther between.