Page Authority vs Domain Authority
-
I'm using the site explorer to compare a potential client's site against 4 others, in an incredibly competitive market.
Each of their competitors has a higher page authority (on the home page) than their domain authority. This is not true of the client's site, which has much lower metrics all round.
Any input as to what this means/says about their competitors, who I would guess (looking at some of their backlink profiles) have done some fairly widespread grey hat stuff in the past? (Though haven't we all.)
-
The majority of their links could be pointing to their home page, and they may not have many deep links, which would explain why the page authority of their home page is higher than their domain authority.
-
Hi, just today Rand Fishkin released another Whiteboard Friday, which tackles almost exactly that topic. Okay, the Whiteboard Friday is on Google's PageRank, but he also explains an awful lot about SEOmoz's own ranking parameters and tackles the question of what a good domain authority combined with a bad page authority means. Definitely worth watching:
http://itunes.apple.com/ch/podcast/whiteboard-friday/id411307102 (2011-08-12 PageRank.mov)
Sorry for the iTunes link, but the podcast hasn't appeared on seomoz.org yet (I'm sure it will a bit later...)
PS: In a nutshell: high domain authority is much more important than high page authority. A good domain can give you much more "link juice" than a good article. Or as Rand puts it in the video: a link from an article on CNN with a low PageRank / page authority? Go grab that stuff! It's on CNN! Hope that helped.
Related Questions
-
Duplicate Content: Marketing Page / Content Page
So I am getting duplicate content warnings on my website for my white paper and webinar video pages. Each white paper / webinar video page sits behind a marketing form page that must be filled out. I am getting a lot of warnings that the marketing page and the content page are being picked up as duplicate content. In the past, both the marketing page and the content page were given the same title and URL, though the body content is not similar. My question: is the URL / title similarity enough to set off the duplicate content warnings, and would changing one or the other solve the issue?
Moz Pro | AllMedSeo
-
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder? I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites. I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication. I am in the process of re-setting up my campaigns to get better data than I have been getting -- I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better, such as all on the same day, making sure I've set whether www or non-www is what ranks, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what the options are for getting good data in light of the 50,000-page crawl limit. Here is an example of what I mean: to see how the EPA is doing in searches related to air quality, ideally I'd track all of the EPA's web presence. www.epa.gov has 560,000 pages -- if I put www.epa.gov into a campaign, what happens, given the site has so many more pages than the 50,000 crawl limit? What do I miss out on? Can I "trust" what I get? www.epa.gov/air has only 1,450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely, and I'll get a complete picture of this air-focused sub-folder... but (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true -- I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air-quality sites, and maybe those competitors count towards the 50,000-page crawl limit and would get me up to the limit? How do the competitors you choose figure into the crawl limit?) Any opinions on what I should do in general in this kind of situation? The small sub-folder vs. the full humongous site, or is there some other way to go here that I'm not thinking of?
Moz Pro | scienceisrad
-
Duplicate page content and title
Hi, I have a serious issue with my site. My website contains 21 pages, but during my weekly report, Moz found 84 errors: 42 duplicate page content errors and 42 duplicate page title errors. When I look at the errors in detail, all 21 of my links are displayed twice. For example:
http://domain.com/
http://domain.com/page1.html
http://domain.com/page2.html
and
http://www.domain.com/
http://www.domain.com/page1.html
http://www.domain.com/page2.html
So the same link is repeated twice, with www and without www. How do I resolve this error? Please, can anyone help me?
Moz Pro | solutionforweb
-
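The usual fix for this kind of www/non-www duplication is a host-level 301 redirect, so that every URL resolves under a single host. As a minimal sketch, assuming the site runs on Apache with mod_rewrite enabled, and that the www version is the one you want to keep (swap domain.com for the real host):

```
# .htaccess: permanently (301) redirect all non-www requests to the www host.
# Assumes Apache with mod_rewrite enabled; domain.com is a placeholder.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```

Once crawlers (including Rogerbot) recrawl and see the redirect, the non-www URLs should stop being reported as duplicates of the www ones.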
Concerned About Individual Pages
Okay. I've set up a campaign for www.site.com and given it a list of keywords. So after the initial crawl we'll have some results. What I'm looking for, though, is how individual pages on my site rank for the list of keywords given, and then to be able to go to a screen in SEOmoz with data for that particular page, with recommendations and things like that. Is this what's going to happen, or do I need to create a campaign for each URL I want to track? If it all works as I'd like in the example above, should I then add a second list of keywords that some other pages should rank for? Will it get to be a big mess, or can I relate the keywords to pages in some way? It seems like what I'm looking for is what this program should be... Thanks!
Moz Pro | martJ
-
Duplicate page errors
I have 102 duplicate page title errors and 64 duplicate page content errors. They are almost all from the email-a-friend forms that are on each product of my online store. I looked, and the pages are identical except for the product name. Is this a real problem, and if so, is there a workaround, or should I see if I can turn off the email-a-friend option? Thanks for any information you can give me. Cingin Gifts
Moz Pro | cingingifts
-
Too many on-page links
One of my SEOmoz Pro campaigns has given me the warning "Too many on-page links", and the page in question is my HTML sitemap. How do I resolve this? I obviously need my sitemap, so how do I get around the warning?
Moz Pro | CompleteOffice
-
Confused on www vs non-www
Hey everyone... Really new to the SEO world and have learned tons each day. When I joined SEOmoz, I went to my host and set up a 301 redirect to have frogfanreport.com go to www.frogfanreport.com. After a couple of days I noticed that Rogerbot only crawled 1 page on www.frogfanreport.com. I looked into the community posts to try to find an answer. So, I went in and took the 301 redirect off and set up a new campaign just for frogfanreport.com. It has now crawled over 300 pages. Not sure what I need to do, or whether I just did not set up the 301 redirect correctly. Looking at the link stats, the root domain stats are obviously the same. The subdomain stats are where there is a big difference: www: ext f links 1, total ext links 5, total links 5, f root domains 1, total linking root domains 4; non-www: ext f links 76, total ext links 109, total links 7,962, f root domains 11, total linking root domains 19. I am guessing that I should go back in and put the 301 redirect from www to non-www? Is this going to affect Rogerbot going in? Or did I just not set it up correctly? Zach
Moz Pro | TCUFrogFanReport
-
Tracking a British (.co.uk) domain
I tried to track a British domain, and it looks like SEOmoz interprets it as a subdomain, as it has an XX.co.uk address. When I list it as a subdomain, it says that the domain is not responding (I guess because it's pinging co.uk). What do I do to create a profile?
Moz Pro | jwainstain