Estimating the number of LRDs I need to outrank a competitor
-
I just ran a SERP/keyword difficulty report for a keyword I want one of my pages to rank for.
I have also just completed the on-page optimization, and now I am going to start building links.
=> I would like to estimate how many linking root domains I need to outrank one of my competitors. These are the Moz data:
1. My page:
Page Linking Root domains: 0
Root Domain Linking Root Domains: 151
2. Competitor:
Page Linking Root Domains: 1
Root Domain Linking Root Domains: 5,786
I don't really know which metric (page or domain LRD) to rely on in order to make an estimate, and I would be glad for some help!
To simplify the problem, assume that all other factors (code, on-page keyword use, social signals, etc.) are equal for both sites. Can I just get 2 LRDs to that page in order to likely outrank my competitor, or do I need around 5,000 more links pointing to my site?
I think an answer to this question could help a lot of users here, since I have seen similar questions/difficulties regarding the use of page LRDs vs. root domain LRDs.
P.S. None of my website's pages currently rank in the top 100 for that keyword.
-
**I am trying to get a rough estimate anyway....**
That's good. Lots of people are selling the $1000/month package to people who need $5000/month to be effective and don't know it. You are trying to figure it out and that is very important.
Economists call this "ceteris paribus" analysis.
There's an awful lot of variables here. That's what makes it interesting.
-
Thanks, EGOL, for the detailed answer, and sorry for my late reply - I'm just coming back from vacation.
It is clear to me that the question cannot simply be answered with a number. I understand this, and I wish my clients/supervisors would too - but that's a different story...
I am trying to get a rough estimate anyway, assuming that all other factors (on-page factors, words on the page, quality of the content, social signals, age of the domain, etc.) of the competitor's page/site are exactly the same as on my page/site and that these factors would not move. Economists call this "ceteris paribus" analysis.
P.S. I've been in the business for a while now, but I'm just starting to use SEOmoz (we used other tools before) - the resources on this website, and especially this Q&A section, are simply awesome!!
-
Thank you Robert. I put a little extra effort into it because I figured that you would like it.
-
Ayup - and accomplishing that means being willing/able to do some business analysis, not just site analysis. Which moves into the realm of web marketing optimization, not just SEO. Which is where the real value in this whole process lies, IMO.
-
EGOL
This may be the best answer I have seen to any question. The beauty of it goes on and on in that anyone in SEO dealing with clients must always ask themselves: Can the client handle the truth and/or understand the explanation?
The explanation is what you just gave and the truth is that no one knows for sure and it will often cost you a lot to get there. The other difficulty for SEO clients is that we live in a world of instant gratification. I have clients who spend a lot with me and regularly ask what they are getting (yes, they get reports, etc.). Interestingly, these same clients will mention they have just authorized a $250,000 TV ad campaign. (Trust me we are a bit cheaper.)
They understand the TV ad no matter what the results; even with excellent results (more and more customers), they have a hard time understanding the SEO.
Egol,
Thanks a bunch for a truly excellent, thought provoking answer.
Robert
-
We compete against pages on about.com and amazon.com that have thin content and zero off-page assets. About.com has skimpy information about our topic and nothing to sell. Amazon has a couple of things to sell and zero useful information.
They outrank us for the short tail.
We have the best and deepest informative content on the web in our niche, informative video, a YouTube channel, lots of articles, printable resources, one of the best selections of products on the web, an email address that answers questions almost 24/7/365, and a phone where you can talk to someone who uses and tests most of the products regularly and has handled, inspected, photographed, and described every product that we sell.
This is one of the problems with Google giving huge authority to the brands. Mom and pop know an awful lot more and give immediate, helpful, informative, caring service... but they get no respect in the SERPs.
I understand our position in the rankings. Praise be that I don't have to explain it to a client!
-
Or to determine whether the costs of attempting success are just too high to be worth it.
I would really respect an SEO who told me something like this.
-
Egol's explained the true scope of what you're asking, but to give a specific response to your specific question:
No, just having one more incoming link to your page than your competitor has to his will never guarantee you'll outrank him.
Search engine algorithms use hundreds of signals when ranking a page, and the number of incoming links to that page is only one measure. (And it's about more than just what else is on the page, like code, keyword use, etc.)
Even if we just restrict ourselves to considering the links to a single page, additional issues like the authority of the sources of the incoming links play a huge part. Not all links are of equal value.
But more importantly, things like the authority of the whole domain have a huge impact as well. That's why SEOmoz goes out of its way to compute scores for domain authority as well as page authority. A weak page on a strong site will frequently outrank a strong page on a weaker site. That's one of the big frustrations/challenges of nearly all small site owners.
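To make that concrete, here is a deliberately oversimplified toy model - not how Google or Moz actually scores anything, and the weights and numbers are invented purely for illustration - of how a weak page on a strong domain can come out ahead of a strong page on a weak domain once domain-level signals are blended in:

```python
# Toy illustration only: real ranking uses hundreds of signals with unknown weights.
def toy_score(page_strength, domain_strength, page_weight=0.4, domain_weight=0.6):
    """Blend page-level and domain-level strength into one illustrative number."""
    return page_weight * page_strength + domain_weight * domain_strength

strong_page_weak_domain = toy_score(page_strength=60, domain_strength=20)  # 36.0
weak_page_strong_domain = toy_score(page_strength=20, domain_strength=70)  # 50.0

# The weaker page "wins" here purely because of the domain behind it.
print(strong_page_weak_domain, weak_page_strong_domain)
```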
Bottom line, neither Page nor Domain LRD metrics are sufficient in themselves for assessing the work ahead of you.
Make sense?
Paul
-
What Egol has just described is why SEO is only part science and the rest - a significant portion - is still an art. It's also why automated implementation tools have never been successful.
It takes juggling a wide range of constantly changing factors, many with very subjective values, to deliver success. Or to determine whether the costs of attempting success are just too high to be worth it.
Website owners always HATE to hear the accurate response, but the honest response to so many (most?) SEO questions is:
It Depends.
And the good, ethical SEOs will tell you that right up front, as Egol has done. It would be so much easier if SEO, Conversion Rate Optimization, etc. were linear pursuits - do X and the result will be Y. But they're not, which is why good SEOs will beat those who simply follow (and promise results from) formulas.
Frustrating huh?
Paul
-
Anyone who gives you a number for this question is full of beans.
This is one of the most difficult questions in SEO, and most of the people who are charging clients for SEO cannot answer it.
Lots of SEOs have clients on a $500/month plan when the target that they are attacking needs many times that much to become competitive.
Let me give just a couple reasons why this question is so difficult and why nobody here can give you an answer with the information that you provided.
====================
Your question is making a straight comparison.... it assumes that you are racing a stationary target. The target is not stationary. This question is really like this...
Two cars are driving on a road.... Car B leaves when Car A is already 100 miles down the road and traveling at a rate of 50 miles per hour. How fast will Car B have to drive to overtake Car A?
The answer is a velocity AND a time. And the question as I stated it assumes no acceleration.
If Car B drives 51 miles per hour... will you have the patience and budget for that long of a wait? Do you have the resources needed to drive 70 and not be stopped by Google?
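To put rough numbers on it: at 51 miles per hour, Car B closes the 100-mile head start at only 1 mile per hour, so the overtake takes 100 ÷ 1 = 100 hours; at 70 it closes the gap at 20 miles per hour and gets there in 5 hours. The link-building version is the same division - the competitor's head start in links divided by how much faster you acquire links than they do - and, as stated above, that assumes no acceleration on either side.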
=====================
Another example...
George Pickett was a southern general at the Battle of Gettysburg who was ordered by Robert E. Lee to run the Union Army off of Cemetery Ridge. To do that Pickett's troops would need to cross a mile-wide open field in broad daylight under an absolute hail of enemy fire.
The field was one mile wide, it was up a slight incline - if they were slow in crossing they would be mowed down...
The Union had hundreds of troops dug in and positioned on the ridge and ready to open fire - if Pickett had only a few troops they would be mowed down....
So Pickett needed to order an enormous number of troops across that field and order them to run their asses off to engage the Union Army quickly - or die in the middle of the field.
If they failed in getting enough troops... if they failed in getting enough speed... if they failed in getting enough determination and courage... then they would be mowed down.
Your job is similar to Pickett's... you must get an enormous number of links... you must get them quickly... and you must get them before your budget runs out... and you better hope that your budget is big enough.
Pickett knew that his field was a mile wide.. he knew how many troops that he had. He could see Union troops on the other side of the field at the top of the ridge.
Pickett's generals complained when they were told the battle plan... but Pickett pointed at the Ridge.
This battle was lost on bad math.
But your problem is even more difficult... the field is getting wider as you cross it because the target is moving away from you.
==================
And your problem is also more difficult because links have different values. One link from the Pope's site is worth a thousand from pedestrian sites.
Furthermore... different pages compete with different amounts of vigor. I might have a fifty-word page but my competitor might have 2,000 words, ten images, a video, and tables of data. Big difference.
===================
So, how is an SEO to know what is needed?
-
You run the keyword difficulty tool and get a feel for the numbers (which contain much confusion).
-
You go out to those SERPs and visit the sites and get a feel for their authority and content quality.
-
Then you use Open Site Explorer to see if the Pope is on their side.
-
Then you decide if you are up to beating their content, beating their numbers, beating their quality.... and if you can do that quickly enough before the conditions of engagement change or your budget runs out.
After all of that you decide on taking the gamble or not.
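None of this reduces to a formula, but if you want a rough back-of-the-envelope sanity check before taking the gamble, something like the sketch below can at least force the velocity-and-budget question into the open. Every number in it is a placeholder to be replaced with what the keyword difficulty report, the SERPs, and Open Site Explorer actually show you, and it deliberately ignores link quality - which, as noted above, can matter more than the count:

```python
def months_to_close_gap(competitor_lrds, my_lrds,
                        my_links_per_month, competitor_links_per_month):
    """Rough months needed to match a competitor's linking root domains,
    treating the competitor as a moving target. Ignores link quality entirely."""
    closing_rate = my_links_per_month - competitor_links_per_month
    if closing_rate <= 0:
        return float("inf")  # the field keeps getting wider; you never cross it
    return (competitor_lrds - my_lrds) / closing_rate

# Placeholder figures - substitute your own data and budget.
months = months_to_close_gap(competitor_lrds=5000, my_lrds=150,
                             my_links_per_month=60, competitor_links_per_month=20)
budget_horizon_months = 18

print(round(months, 1))                 # 121.2 months at these rates
print(months <= budget_horizon_months)  # False: at this pace, the gamble looks bad
```

If the answer comes back "never" or "a decade," that is the moment to either raise your velocity, pick a different target, or walk away.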
-