Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Domain authority and keyword difficulty
-
I know there are too many variables for a definitive answer, but do people take their domain authority into account when using the keyword difficulty tool?
I have a new domain, which only has a score of seven at the moment. When using the keyword research tool, what is the maximum difficulty level people would target initially? Obviously I would seek to increase the difficulty of the target keywords over time, but to start off it's a hard choice between keywords that can be ranked for in a reasonable period of time and keywords that get enough traffic to make the effort worthwhile.
-
I have a new domain and I am beating pages with much higher PageRank / DA, etc.
Typically, in the niche I am targeting, if I use the Keyword Difficulty tool I get an average of about 50% - I think this tool mainly uses DA/PA to work out the difficulty percentage.
Afterwards, I do a Google search and look at the ranking pages for mentions of the keywords I am going to target with my pages/posts.
Often I find there are few mentions, or one exact-match mention in the content, with the page title being something different / not an exact match.
I then build a page targeted specifically at the keyword and optimise for it. I don't overdo the optimisation - if the other pages only have two mentions of the keyword in the content, I would normally build a post with, say, 3-5 mentions. I have noticed that when going over about five mentions, pages tend to rank poorly, or rank badly and then crawl back up the SERPs more slowly - though this could be due to the site's domain being only a month old.
I also only build quality content that is relevant to the search term; this should prevent the pages dropping from the SERPs (I hope!).
Obviously, if your niche has highly optimised pages with a bunch of links pointing at each page, then this method is not going to work.
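The mention-counting step described above can be sketched as a small script. This is just an illustration of the idea, not a Moz tool: the function names (`count_mentions`, `target_mentions`) and the example pages are hypothetical, and the "aim one above the strongest competitor, cap at five" rule is simply the rough heuristic from this answer.

```python
import re

def count_mentions(text: str, keyword: str) -> int:
    """Count whole-phrase, case-insensitive occurrences of a keyword in page text."""
    pattern = re.compile(r"\b" + re.escape(keyword) + r"\b", re.IGNORECASE)
    return len(pattern.findall(text))

def target_mentions(competitor_counts: list[int], cap: int = 5) -> int:
    """Aim slightly above the strongest competitor, but never past the cap
    (the answer suggests rankings suffer beyond roughly five mentions)."""
    return min(max(competitor_counts, default=0) + 1, cap)

# Hypothetical competitor pages mentioning "blue widgets" 2, 1 and 0 times:
counts = [count_mentions(page, "blue widgets") for page in (
    "Our blue widgets are the best blue widgets around.",
    "We sell a range of widgets, including blue widgets.",
    "A general page about gadgets.",
)]
print(counts)                   # [2, 1, 0]
print(target_mentions(counts))  # 3
```

In practice you would pull the text from the live ranking pages rather than hard-coded strings, but the counting and capping logic stays the same.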
Hope that helps.
-
I do this a lot (on a daily basis), so first off, you're not alone.
And you're right that it does need to be weighed up.
It's very hard to give a definitive answer, but in general: if I see PA/DA 40+, a lot of work could be involved; if I see PA/DA under 20, it should be easy - I'd expect first-page rankings in a few weeks to a month, if that.
I don't go off this alone, but it's my starting point. I will check out all of page 1 and page 2, and sometimes page 3. You might find page 1 is 40+ while page 2 is 25+.
But I do look at lots of other elements - for example, how much their content is shared socially. This gives me an idea of what kind of sharing potential is available from this audience if I produced the same sort of content and pushed PPC to it (but this is just a method I use; I haven't seen anyone else doing it).
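The PA/DA triage in this answer can be written out as a tiny helper, if that makes the thresholds clearer. To be clear, this is only a sketch of the rule of thumb above (40+ means hard, under 20 means easy), not anything official; the `triage` function and its messages are made up for illustration.

```python
def triage(scores: list[int]) -> str:
    """Rough difficulty triage from the PA/DA scores of ranking pages,
    using the thresholds in the answer above (40+ hard, under 20 easy)."""
    avg = sum(scores) / len(scores)
    if avg >= 40:
        return "hard: a lot of work likely involved"
    if avg < 20:
        return "easy: first-page rankings plausible within weeks"
    return "moderate: inspect pages 1-3 more closely"

# Page 1 of the SERP averaging ~45, versus a much weaker set of pages:
print(triage([44, 48, 43]))  # hard: a lot of work likely involved
print(triage([12, 18, 15]))  # easy: first-page rankings plausible within weeks
```

As the answer says, this is only a starting point - you would still look at pages 2 and 3 and at signals like social sharing before committing to a keyword.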