How to outrank a directory listing with high DA but low PA?
-
My site is in 4th place, and three places above it is a Gumtree (similar to Yell or Yelp) listing. How can you figure out how difficult it would be to outrank those pages? Obviously the pages would have low PA, and they rank at the top based on the high DA of the site.
This also goes back to keyword research and difficulty: when I'm doing keyword research, I often see a Wikipedia page in the top 5, or a yell.com listing, or perhaps a forbes.com article outranking my site. Typically the problem seems to be Google giving these pages a lot of ranking credit based on the high DA of the domain rather than the PA of the pages. How would you gauge the difficulty of that keyword, then, if the competition is pages with very high DA (which is impossible to compete with) but low PA?
Thanks
-
Most of my work is writing articles that take between three days and a week to author. I also have employees who assist with these articles by taking photos, making graphics, doing research, collecting data and posting them to websites. Some of these articles attack very difficult keywords.
After doing this for about 12 years on the same website, I still don't know how these articles are going to rank. A year or two after posting, some are on the first page of Google, defeating popular websites in ways that surprise me. Others perplex me because I am being beaten by pissants in SERPs that I would judge to be much easier. I suspect that semantics, keyword diversity and titles that elicit clicks help the pissants beat me, but I don't know for sure.
I can't predict how my own rankings will turn out on a website that I know well, in an industry where I have worked for 40 years, against competitors who are often people I know by name or who are even my own customers. The SERPs can be very difficult to understand. One thing I will say with confidence is that DA and PA explain nothing and give zero guidance in winning a fight. They count for zero in my decisions. I can't even tell you those numbers for my own websites unless I go look. That's how little attention I give them.
-
Sorry I couldn't help.
From the answer above: DA is a factor and one that is very difficult for you to catch up to or surpass - BUT certainly not a massive % of the ranking algorithm.
-
Actually I don't, which is why I came here to ask the question. I think EGOL answered it above: they are beatable, as small businesses beat them every day. That's what I needed to know - whether they are beatable, and how easy or hard it is to beat them, since that is a deciding factor in whether to invest more time and money into SEO (the money could be better spent on ads, for example).
You gave me a philosophical answer that basically said, "you can't change what they do or have, so work on what you can change in yourself," which is all fine and dandy, but it's a loose, vague, cookie-cutter answer. I mean, could I theoretically outrank the "British Cancer Research" website for the keyword "cancer research"? Obviously the answer is yes: using your advice, I can just keep working on my cancer research site, throw a million £ into its SEO plus a couple of years of work, and I'll outrank them. We know that; everyone knows that with enough time, money and effort you can achieve anything. That is not the question. The question was "how hard or easy is it?", as that is obviously a big factor when deciding whether to continue with that strategy.
I mean no disrespect; I think you just misunderstood my question from the start as a complaining type of question. Perhaps you interpreted it as me whinging about the high-DA competition, and you were trying to encourage me not to focus on that DA. I wasn't complaining. It was a straight-up question about how much that high DA is a factor in their site outranking mine, given that I have built a lot of backlinks to my page while they have none to theirs and therefore must rely on their site's DA and traffic.
-
If you did not want constructive suggestions, why even ask? You obviously already knew the answer you wanted.
Best
-
You came here asking how to beat a directory and you got good answers and an action plan.
Unfortunately you look at metrics that Google does not use, metrics that are based upon a domain, metrics that have nothing to do with the methods of winning a SERP. Don't allow rubbish metrics to frighten you away.
These are directory sites that you are trying to defeat. Directories.
They are not the Library of Congress or the Pope. Pages on these sites are defeated by small businesses every day. Pages on Amazon are defeated by small businesses every day. These small businesses didn't run because they faced competition. They got to work.
If you are willing to work hard, you should not fear competition, because where there is competition there is usually a lot of search volume across a lot of diverse keywords, and usually a lot of money changing hands. Attack there with long content, diverse keywords and excellent quality; there is a good chance you will earn traffic. Attack that keyword with multiple pages, each of excellent quality and targeting the long tail. One or more of those pages might eventually gain rankings for the short-tail keyword.
Maybe you will not win if you fight. But you will never win if you run.
-
PA is built with inbound links to that page, and that page has 0 backlinks. If it has a PA of 29 (which I had already checked), it is probably boosted by traffic and internal links to that page, all of which is a direct result of Gumtree's massive traffic and DA.
I am not taking it personally; I'm just looking for a reasonable answer as to how much DA weighs in as a ranking factor. I have a pragmatic approach to things. For example:
Site A has DA 30 / PA 29; my site has DA 25 / PA 28. This gives me a good idea of what I need to do: study site A's backlink profile and on-site SEO, and try to raise my DA to match or beat theirs. It's a clear, pragmatic approach.
Now example B:
Site A has DA 90 and PA 29; my site has DA 30 and PA 45. This is much harder to approach pragmatically, mainly because we know we can never achieve a DA of 90, and my higher PA isn't overcoming that DA gap. So the question is: how much of a factor is that huge DA? This matters from a business perspective, because if six months of high-quality backlinks could overcome it, then maybe it's a go; but if I would need to reach DA 60 and PA 60 to outrank that site, there would be no point. And yes, I know DA/PA doesn't equal ranking, but it's the closest measuring stick we have for how successful a site is likely to be in ranking.
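For what it's worth, the comparison I'm describing can be framed in a few lines. This is purely an illustrative sketch: the DA/PA numbers would come from a tool like Moz Link Explorer, and the 30-point cutoff for "directory-style" competitors is my own arbitrary assumption, not anything Moz publishes.

```python
# Hypothetical helper that frames the two scenarios above.
# DA/PA values are assumed to come from a tool such as Moz Link Explorer;
# the threshold of 30 is an illustrative assumption, not Moz guidance.

def authority_gap(mine: dict, competitor: dict) -> dict:
    """Summarize the DA and PA gaps between my page and a competitor's."""
    da_gap = competitor["da"] - mine["da"]
    pa_gap = competitor["pa"] - mine["pa"]
    return {
        "da_gap": da_gap,
        "pa_gap": pa_gap,
        # A large DA gap paired with a flat or negative PA gap matches the
        # "directory listing with no backlinks" scenario in this thread.
        "directory_pattern": da_gap >= 30 and pa_gap <= 0,
    }

# Example B from above: DA 90 / PA 29 directory vs my DA 30 / PA 45 page.
print(authority_gap({"da": 30, "pa": 45}, {"da": 90, "pa": 29}))
```

In example A the gap is small and closable through normal link building; in example B the helper flags the directory pattern, which is exactly the case where I can't tell whether more SEO spend is worthwhile.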
And I don't mean to be rude, but the idea of just "overcoming" something by doing better at the things you can change is very vague. If I want to be the best boxer in the world and I'm 50 years old, you could follow some rah-rah and claim that since you can't change your age, you might as well just train and work on what you can change. But the fact of the matter is, it's nearly impossible to be the best boxer in the world at 50; although there's a 0.01% chance it could be done, it's not a worthwhile investment. That is why I asked the simple question of how much of a factor that DA is: to figure out whether it's worth investing more into SEO.
And you're right that keyword difficulty doesn't change based on who's above you. However, if you are aiming for the no. 1 position and you are no. 2, but no. 1 is almost impossible to take over, then yes, that keyword is hard. When building niche sites, part of keyword research is researching your competition for that keyword before embarking on the project.
But thank you for taking your time to answer.
-
Magusara,
I think you are taking this a bit personally. Yes, the pages above you are ultimately owned by someone, even if that someone is a stockholder in the company that owns the page. The link you provide (https://www.gumtree.com/removal-services/bournemouth) has a page authority of 29.
When you complain about the authority reference with the Facebook example, you are missing the point. The issue is not 100% DA, and you are fixated on that. Sorry, it is simply my opinion that you cannot fixate on the thing you say is insurmountable (increasing your DA above theirs) and then say that any other way to deal with the roadblock is somehow not relevant, or that whoever supplied the suggestion is just wrong.
You are wrong about keyword difficulty. You say: "And I disagree: who ranks above you is a reflection of keyword difficulty. Obviously, if you are trying to rank #1 for the keyword 'dog training' and you are currently 2nd, but no. 1 is occupied by Facebook deciding to have a specific dog training page targeted at your area, it would be next to impossible to overtake them."
Who ranks above you is NOT a reflection of keyword difficulty. If today I erect a page about purple dog jellybeans and add content to it weekly, and in three months you erect a page about purple dog jellybeans and index it, your page will most likely rank below mine initially. That doesn't mean the term purple dog jellybeans is a competitive keyword.
Difficulty is determined by the keyword within the context of all the competition in a given vertical for that term. It is not determined by the site above you having a lot of DA.
Everyone involved in SEO on these Q&A pages has faced the same hurdles you are experiencing, and all we can give you is our experience. Yes, DA is a factor, and one that is very difficult for you to catch up to or surpass - BUT certainly not a massive % of the ranking algorithm.
The points we were making were to suggest you quit looking at that one roadblock (DA) and go about developing everything else that your competitor cannot. You KNOW the area the directory operates in. Use your advantage against them, and quit being argumentative with those who only want to assist you. Focus is where we place our gaze and apply our energy. You are focusing on the roadblock of DA. We are suggesting you focus on everything else and make the roadblock cease to exist.
I will apply a golf analogy and then say goodbye:
If you are faced with water you must cross to get to the green, you can focus on the water or on the green. Ask any golfer where the ball goes if you are focused on (caught up with avoiding) the water. Just a fact of life for me.
We wish you the very best,
Robert
-
I don't hate directories or envy their DA. The pages above me are NOT owned by a person. They are a directory listing.
I say "obviously they have a low PA" because these pages are not owned by anyone; therefore they don't have any SEO, or an owner building traffic or backlinks to them. They are simply the result of the directory listing. Example: https://www.gumtree.com/removal-services/bournemouth
That URL has 0-1 backlinks according to Ahrefs and Moz's Site Explorer. The page ranks highly literally because the DA of the site is 70. As for researching what they are doing, that's kind of like saying research what Facebook or yell.com is doing to see if you can achieve the same DA as their site.
And I disagree: who ranks above you is a reflection of keyword difficulty. Obviously, if you are trying to rank #1 for the keyword "dog training" and you are currently 2nd, but no. 1 is occupied by Facebook deciding to have a specific dog training page targeted at your area, it would be next to impossible to overtake them. Hypothetical situation, but you get what I mean.
The purpose of the question is not to hate on the sites above me; it's to estimate how difficult directory results are to outrank in search engines, because they can't be judged by the same ranking factors as personally owned sites, given the huge DA they hold. The same goes for, say, a Forbes article: you can build more links to your page than there are to the Forbes article, but the Forbes DA will also weigh in. I guess the question is how much of a factor the DA is over the PA.
-
I really enjoyed Robert's answer because we all see so many of these DA and PA envy questions.
To build on Robert's theme of "leverage your advantages": these directories are usually built by people who know very little about your local area or industry. They are generally cookie-cutter sites built by factory workers who live 1,000 miles away. For those reasons, it is often very easy for a local or industry expert to build content that is vastly superior, and to provide a landing-page experience that impresses visitors enough that they will share it with others. You probably also know the visitor better than the factories that build these sites.
Put some time into your content and presentation. Winning there can significantly improve the success of your page in the SERPs. It can also significantly improve your chances of attracting the searcher to your business instead of the business of your competitors. Show your expertise!
-
Magusara
In my opinion, if you wish to outrank anyone, you should first take a step back and not draw conclusions without researching. You state, "...obviously the pages would have low PA and they are top based on the high DA of the site." If it is obvious, then why would you bother researching to see what exactly they are doing?
If you have spent any time on Moz's Q&A, you will have seen tons of "How is this page outranking me?" questions. The key to combating those pages is to learn more about them than they know about themselves. You cannot do that if you assume or believe anything to be obvious. Yes, I hate directories (when it serves me not to work with them) as much as any other SEO professional, but they do get many things correct. One thing you can do is grow your DA, and yes, it will take a bit of time. But look closer: is everything as it really appears?
My point is this: knowing they have an advantage is not going to assist you in combating them; knowing what advantages and disadvantages they have will help you fight them, if you choose to exploit them. So, is it worth the time to fight? If it is, then know your opponent and leverage your own advantages.
Hope that helps,
Robert
PS: Keyword difficulty does not change based on who is above you.