Domain authority and keyword difficulty
-
I know there are too many variables for a definitive answer, but do people take their domain authority into account when using the keyword difficulty tool?
I have a new domain which only has a score of seven at the moment. When using the keyword research tool, what is the maximum difficulty level people would target initially? Obviously I would look to increase the difficulty of the target keywords over time, but to start off it's a hard choice between keywords that can be ranked for in a reasonable period of time and keywords that get enough traffic to make the effort worthwhile.
-
I have a new domain and I am beating pages with much higher PageRank / DA, etc.
Typically, in the niche I am targeting, the Keyword Difficulty tool gives me an average of about 50% - I believe this tool mainly uses DA / PA to work out the difficulty percentage.
After that, I do a Google search and check the ranking pages for mentions of the keywords I am going to target with my pages / posts.
Often I find there are only a few mentions, or one exact-match mention in the content, with the page title being something different / not an exact match.
I will then build a page targeted specifically at the keyword and optimise for it. I don't overdo the optimisation - if the other pages only have 2 mentions of the keyword in the content, I would normally build a post with, say, 3 - 5 mentions. I have noticed that when going over roughly 5 mentions, pages tend to rank poorly, or rank badly at first and then crawl back up the SERPs more slowly - though this could be due to the site's domain being 1 month old.
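If you wanted to script that competitor check rather than eyeball it, a rough sketch might look like this (the URLs and keyword are hypothetical placeholders; assumes the requests and beautifulsoup4 packages are installed):

# Rough sketch: count exact-match keyword mentions on competing pages.
import requests
from bs4 import BeautifulSoup

KEYWORD = "blue widgets"          # hypothetical target keyword
COMPETITOR_URLS = [               # hypothetical page-1 results
    "http://example.com/widgets",
    "http://example.org/blue-widgets",
]

for url in COMPETITOR_URLS:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    title = soup.title.get_text() if soup.title else ""
    text = soup.get_text(" ").lower()
    print(url,
          "mentions:", text.count(KEYWORD.lower()),
          "exact match in title:", KEYWORD.lower() in title.lower())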
I also only build quality content that is relevant to the search term, which should prevent the pages from dropping out of the SERPs (I hope!).
Obviously, if your niche has highly optimised pages with a bunch of links pointing at each page, then this method is not going to work.
Hope that helps.
-
I do this a lot (on a daily basis), so first off, you're not alone.
And you're right that it does need to be weighed up.
It's very hard to give a definitive answer, but in general: if I see PA/DA 40+, a lot of work could be involved; if I see PA/DA 20 or under, it should be easy - I'd expect first-page rankings in a few weeks to a month, if that.
I don't go off this alone, but it's my starting point. I will check out all of page 1, page 2, and sometimes page 3. You might find page 1 is 40+ while page 2 is 25+.
I also look at lots of other elements - for example, how much their content is shared socially. This gives me an idea of what kind of sharing potential is available from this audience if I produced the same sort of content and pushed PPC to it (but this is just a method I use; I haven't seen anyone else doing it).
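As a rough sketch of how that first-pass rule of thumb could be scripted (the SERP numbers below are hypothetical; a real check would pull PA/DA from an API such as Moz's):

# First-pass keyword filter based on the PA/DA rule of thumb above.
serp_page_one = [
    ("http://example.com/a", 45, 52),  # (url, page authority, domain authority)
    ("http://example.org/b", 18, 22),
    ("http://example.net/c", 33, 30),
]

def difficulty_verdict(results):
    avg_pa = sum(pa for _, pa, _ in results) / len(results)
    avg_da = sum(da for _, _, da in results) / len(results)
    if avg_pa >= 40 or avg_da >= 40:
        return "40+: a lot of work could be involved"
    if avg_pa <= 20 and avg_da <= 20:
        return "20 or under: first-page rankings within weeks is plausible"
    return "in between: check pages 2-3 and other signals too"

print(difficulty_verdict(serp_page_one))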
Related Questions
-
Do Meta Keywords matter?
I am a firm believer in the fundamentals of SEO, but is there any data to support their impact, positive or negative, on a site's rank?
Technical SEO | Brandonp
-
Does a subdomain benefit from being on a high authority domain?
I think the title sums up the question, but does a new subdomain get any ranking benefit from being on a pre-existing high-authority domain? Or does the new subdomain have to fend for itself in the SERPs?
Technical SEO | RG_SEO
-
Are .clinic domains effective?
We acquired a .clinic domain for a client; they are currently running under a .ca, and I was wondering whether there are any cons to making the switch. On the flip side, are there any pros? I've tried to search for the answer but couldn't come across anything. Thank you if you have any knowledge or can point me to a resource.
Technical SEO | webignite
-
Block Domain in robots.txt
Hi. We had some URLs from a www1 subdomain indexed in Google. We have now disabled the URLs (they return a 404 - for other reasons we cannot redirect from www1 to www) and blocked them via robots.txt. But the number of indexed pages keeps increasing (for 2 weeks now). Unfortunately, I cannot set up Webmaster Tools for this subdomain to tell Google to back off... Any ideas why this could be, and whether it's normal? I can send you more domain info by personal message if you want to have a look.
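For reference, robots.txt rules apply per host, so the block described here would be a file served at the root of the www1 host itself (hypothetical hostname: www1.yourdomain.com/robots.txt) containing:

User-agent: *
Disallow: /

One caveat worth flagging: Disallow blocks crawling, not indexing, so already-indexed URLs can linger in the index, and a blocked crawler never gets to see the 404s.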
Technical SEO | zeepartner
-
Correct linking to the index of a site and subfolders: what's the best practice? Link to domain.com/ or domain.com/index.html?
Dear all, starting with my .htaccess file:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.inlinear\.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^.*/index\.html
RewriteRule ^(.*)index\.html$ http://inlinear.com/ [R=301,L]
1. I redirect all URL requests with www. to the non-www version...
2. All requests for "index.html" are redirected to "domain.com/".
My questions are:
A) When linking from a page to my front page (home), is the best practice to link to "http://domain.com/" and NOT to "http://domain.com/index.php"?
B) When linking to the index of a subfolder, "http://domain.com/products/index.php", I should also link to "http://domain.com/products/" and not include the index.php, right?
C) When I define the canonical URL, should I also define it as just "http://domain.com/products/", or should I in this case link to the actual file, "http://domain.com/products/index.php"?
Are A) and B) best practice? And C)? Thanks for all replies! 🙂 Holger
Technical SEO | inlinear
-
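For illustration, the canonical element for the directory form in question C would look like this (using the poster's own example URL):

<link rel="canonical" href="http://domain.com/products/" />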
What is the best method to block a sub-domain, e.g. staging.domain.com/, from getting indexed?
Now that Google considers subdomains part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:
User-agent: *
Disallow: /
for fear it might get www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | fthead9
-
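For reference, the meta robots tag the poster mentions would be the standard noindex directive placed in the head of each staging page:

<meta name="robots" content="noindex">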
Block a sub-domain from being indexed
This is a pretty quick and simple (I'm hoping) question. What is the best way to completely block a subdomain from getting indexed by all search engines? One option I cannot use is the meta "nofollow" tag. Thanks! - Kyle
Technical SEO | kchandler
-
Using hyphenated sub-domains or non-hyphenated sub-domains? That is the question! Any takers?
For our corporate business-level domain, we are exploring using a hyphenated sub-domain for a project - something like www.go-figure.extreme.com. I thought from a user perspective it seems cluttered. The domain length might also be an issue with the new algorithm big G has launched recently. I know from past experience that hyphenated domains usually take longer to index, as they are used more frequently by spammers and can take longer to get out of the supplementary index. Our company site has over 90 million viewers / year, so our brand is well established and traffic isn't an issue. This is for a corporate-level project and I didn't have the answer! Will this work? Anyone have any experience testing this? Any thoughts will help! Thanks, Rob
Technical SEO | RobMay