Estimating the number of LRDs I need to outrank a competitor
-
I just ran a SERP/keyword difficulty report for a keyword I want one of my pages to rank for.
Also, I just finished the on-page optimization and now I am going to start building links.
=> I would like to estimate how many linking root domains I need to outrank one of my competitors. These are the Moz data:
1. My page:
Page Linking Root domains: 0
Root Domain Linking Root Domains: 151
2. Competitor:
Page Linking Root Domains: 1
Root Domain Linking Root Domains: 5,786
I don't really know which metric (page or domain LRD) to rely on in order to make an estimation, and I would be glad for some help!
To simplify the problem, assume that all other factors (code, on-page keyword use, social, etc.) are equal for both sites. Can I just get 2 LRDs to that page in order to likely outrank my competitor, or do I need around 5,000 more links pointing to my site?
I think an answer to this question could help a lot of users here, since I saw similar questions/difficulties regarding the use of page LRD vs. root domain LRD
P.S. None of the pages of my website currently ranks in the top 100 for that keyword.
-
**I am trying to get a rough estimation anyway.... **
That's good. Lots of people are selling the $1000/month package to people who need $5000/month to be effective and don't know it. You are trying to figure it out and that is very important.
Economists call this "ceteris paribus" analysis.
There's an awful lot of variables here. That's what makes it interesting.
-
Thanks EGOL for the detailed answer, and sorry for my late reply - I'm just coming back from vacation.
It is clear to me that the question cannot simply be answered with a number. I understand this, and I wish my clients/supervisors would too - but that's a different story...
I am trying to get a rough estimation anyway, assuming that all other factors (on-the-page factors, words on the page, quality of the content, social signals, age of the domain, etc.) of the competitor's page/site are exactly the same as on my page/site and that these factors would not move. Economists call this "ceteris paribus" analysis.
P.S. I've been in the business for a while now, but I'm just starting to use SEOMoz (we used other tools before) - the resources on this website, and especially this question section, are simply awesome!!
-
Thank you Robert. I put a little extra effort into it because I figured that you would like it.
-
Ayup - and to accomplish that means being willing/able to do some business analysis, not just site analysis. Which moves into the realm of web marketing optimization, not just SEO. Which is where the real value in this whole process lies IMO
-
EGOL
This may be the best answer I have seen to any question. The beauty of it goes on and on in that anyone in SEO dealing with clients must always ask themselves: Can the client handle the truth and/or understand the explanation?
The explanation is what you just gave and the truth is that no one knows for sure and it will often cost you a lot to get there. The other difficulty for SEO clients is that we live in a world of instant gratification. I have clients who spend a lot with me and regularly ask what they are getting (yes, they get reports, etc.). Interestingly, these same clients will mention they have just authorized a $250,000 TV ad campaign. (Trust me we are a bit cheaper.)
They understand the TV ad no matter what the results; even with excellent results (more and more customers), they have a hard time understanding the SEO.
Egol,
Thanks a bunch for a truly excellent, thought provoking answer.
Robert
-
We compete against pages on about.com and amazon.com that have thin content and zero off-page assets. About.com has skimpy information about our topic and nothing to sell. Amazon has a couple of things to sell and zero informative content.
They outrank us for the short tail.
We have the best and deepest informative content on the web in our niche, informative video, youtube channel, lots of articles, printable resources, one of the best selections of products on the web, an email address that answers questions almost 24/7/365 and a phone where you can talk to someone who uses and tests most of the products regularly and has handled, inspected, photographed, described, every product that we sell.
This is one of the problems with google giving huge authority to the brands. Mom and pop know an awful lot more and give immediate, helpful, informative, caring service... but they get no respect in the SERPs.
I understand our position in the rankings. Praise be that I don't have to explain it to a client!
-
Or to determine whether the costs of attempting success are just too high to be worth it.
I would really respect an SEO who told me something like this.
-
Egol's explained the true scope of what you're asking, but to give a specific response to your specific question:
No, just having one more incoming link to your page than your competitor has to his will never guarantee you'll outrank him.
Search engine algorithms use hundreds of signals when ranking a page, and the number of incoming links to that page is only one measure. (And it's about more than just what else is on the page, like code, keyword use, etc.)
Even if we restrict ourselves to considering the links to a single page, additional factors like the authority of the sources of those incoming links play a huge part. Not all links are of equal value.
But more importantly, things like the authority of the whole domain have a huge impact as well. That's why SEOMoz goes out of its way to compute scores for domain authority as well as page authority. A weak page on a strong site will frequently outrank a strong page on a weaker site. That's one of the big frustrations/challenges for nearly all small-site owners.
Bottom line, neither Page nor Domain LRD metrics are sufficient in themselves for assessing the work ahead of you.
Make sense?
Paul
-
What Egol has just described is why SEO is only part science and the rest - a significant portion - is still an art. It's also why automated implementation tools have never been successful.
It takes juggling a wide range of constantly changing factors, many with very subjective values, to deliver success. Or to determine whether the costs of attempting success are just too high to be worth it.
Website owners always HATE to hear the accurate response, but the honest response to so many (most?) SEO questions is:
It Depends.
And the good, ethical SEOs will tell you that right up front, as Egol has done. It would be so much easier if SEO, conversion rate optimization, etc. were linear pursuits - do X and the result will be Y. But they're not, which is why good SEOs will beat those who simply follow (and promise results from) formulas.
Frustrating huh?
Paul
-
Anyone who gives you a number for this question is full of beans.
This is one of the most difficult questions in SEO, and most of the people who are charging clients for SEO cannot answer it.
Lots of SEOs have clients on a $500/month plan when the target that they are attacking needs many times that much to become competitive.
Let me give just a couple reasons why this question is so difficult and why nobody here can give you an answer with the information that you provided.
====================
Your question is making a straight comparison.... it assumes that you are racing a stationary target. The target is not stationary. This question is really like this...
Two cars are driving on a road.... Car B leaves when Car A is already 100 miles down the road and traveling at a rate of 50 miles per hour. How fast will Car B have to drive to overtake Car A?
The answer is a velocity AND a time. And the question as I stated it assumes no acceleration.
If Car B drives 51 miles per hour... will you have the patience and budget for that long of a wait? Do you have the resources needed to drive 70 and not be stopped by Google?
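The car analogy above reduces to simple arithmetic. Here is a quick sketch using the numbers from the analogy (real rankings, of course, don't move at constant, knowable speeds):

```python
# Catch-up time for the car analogy: Car A has a head start and keeps
# moving; Car B must close the gap. Assumes constant speeds (no
# acceleration), exactly as the analogy states.
def catch_up_time(head_start_miles, leader_mph, chaser_mph):
    closing_speed = chaser_mph - leader_mph  # miles of gap closed per hour
    if closing_speed <= 0:
        return float("inf")  # the chaser never catches up
    return head_start_miles / closing_speed

# Car A is 100 miles ahead, traveling at 50 mph.
print(catch_up_time(100, 50, 51))  # 100.0 hours of driving at 51 mph
print(catch_up_time(100, 50, 70))  # 5.0 hours at 70 mph
```

The answer is always a velocity AND a time: drive barely faster and the wait is enormous; drive much faster and you need the budget (and tolerance from Google) to sustain it.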
=====================
Another example...
George Pickett was a southern general at the Battle of Gettysburg who was ordered by Robert E. Lee to run the Union Army off of Cemetery Ridge. To do that Pickett's troops would need to cross a mile-wide open field in broad daylight under an absolute hail of enemy fire.
The field was one mile wide, it was up a slight incline - if they were slow in crossing they would be mowed down...
The Union had hundreds of troops dug in and positioned on the ridge and ready to open fire - if Pickett had only a few troops they would be mowed down....
So Pickett needed to order an enormous number of troops across that field and order them to run their asses off to engage the Union Army quickly - or die in the middle of the field.
If they failed in getting enough troops... if they failed in getting enough speed... if they failed in getting enough determination and courage... then they would be mowed down.
Your job is similar to Pickett's... you must get an enormous number of links... you must get them quickly... and you must get them before your budget runs out... and you better hope that your budget is big enough.
Pickett knew that his field was a mile wide... he knew how many troops he had. He could see Union troops on the other side of the field at the top of the ridge.
Pickett's generals complained when they were told the battle plan... but Pickett pointed at the Ridge.
This battle was lost on bad math.
But your problem is even more difficult... the field is getting wider as you cross it because the target is moving away from you.
==================
And your problem is also more difficult because links have different values. One link from the Pope's site is worth a thousand from pedestrian sites.
Furthermore... different pages compete with different amounts of vigor. I might have a fifty word page but my competitor might have 2000 words, ten images, a video, and tables of data. Big difference.
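The "links have different values" point can be sketched with a toy model. The weighting function and authority numbers below are completely invented for illustration - they are not Moz's or Google's actual scoring - but they show why raw LRD counts alone can mislead:

```python
# Invented scoring: pretend a link's value grows faster than linearly
# with the source's authority. Under this toy model, one link from the
# "Pope's site" outweighs a dozen links from pedestrian sites.
def toy_link_score(source_authorities):
    return sum(a ** 1.5 for a in source_authorities)

my_links = [95, 10]           # one very strong link, one weak one
competitor_links = [10] * 12  # twelve weak links

print(toy_link_score(my_links) > toy_link_score(competitor_links))  # True
```

Under any weighting like this, a page with fewer linking root domains can still hold the stronger link profile, which is exactly why "get N more links" is never the whole answer.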
===================
So, how is an SEO to know what is needed?
-
You run the keyword difficulty tool and get a feel for the numbers (which contain much confusion).
-
You go out to those SERPs and visit the sites and get a feel for their authority and content quality.
-
Then you use Open Site Explorer to see if the Pope is on their side.
-
Then you decide if you are up to beating their content, beating their numbers, beating their quality.... and if you can do that quickly enough before the conditions of engagement change or your budget runs out.
After all of that you decide on taking the gamble or not.
-