How valuable is a link with a DA 82 but a PA of 1?
-
Our county's website has a news blog, and they want to do an article about an award we won. We're definitely going to do it, and we're happy about the link. However, all the other news articles they have only have a PA of 1. The DA is 82, and the link is completely white hat. It's a govt site in our locale; however, with such a terrible PA, I don't think the link is really all that great from an SEO standpoint. Am I right or wrong (or is it some dreadful murky grey area, like everything else in this industry (which I'm thankful to be a part of))?
Thanks so much for any insights!
- Ruben
-
Those are very good points. Thanks Lewis!
-
The Page Authority will be 1 because it's a brand new page. You don't create a page with an instantly high PA; it has to be earned. Take the BBC, for example: if they publish a news story today, the page will have a PA of 1 on a domain with a DA of 100, yet most SEOs would love a link from the BBC!
News websites are constantly adding new pages as new stories break. It's unlikely these pages will ever build huge PAs because, let's face it, yesterday's news stops attracting many backlinks a day or two after the story breaks.
Keep up the good work! You should certainly see some positive results if you keep building links like that!
Cheers,
Lewis
-
As always, thanks everyone!
- Ruben
-
Thanks for the excellent answer, Travis. It was very insightful. I appreciate it.
- Ruben
-
I'll chime in to wholeheartedly agree with Ryan and Travis. This is a particularly valuable link from the standpoint of local SEO, given that it's coming from a local news source.
-
I agree with Travis. In short, yes, it's an excellent link. As Travis mentions, getting caught up in the numbers can be misleading at times; rather than treating the metrics as shorthand for the sites and people you want to work with, it's better to think of them as relationships. In this case, being connected to an official site that's reputable, spam-free, and exclusive is an excellent connection.
-
I would generally dispense with the concern over metrics, considering the source. It sounds like a great citation source, regardless. Plus, it may do what links were intended to do in the first place: drive traffic.
OSE, Ahrefs, Majestic, and the like are just keyhole views into what's really going on. Important keyhole views, admittedly, but still limited insights into the big picture.
I would argue that if one focuses less on granular metrics and puts more attention into traffic and general relevancy, one will be happier with the results and have more time for generating similar ones.
Related Questions
-
Any crawl issues with TLS 1.3?
Not a techie here...maybe this is to be expected, but ever since one of my client sites has switched to TLS 1.3, I've had a couple of crawl issues and other hiccups. First, I noticed that I can't use HTTPSTATUS.io any more...it renders an error message for URLs on the site in question. I wrote to their support desk and they said they haven't updated to 1.3 yet. Bummer, because I loved httpstatus.io's functionality, esp. getting bulk reports. Also, my Moz campaign crawls were failing. We are setting up a robots.txt directive to allow rogerbot (and the other bot), and will see if that works. These fails are consistent with the date we switched to 1.3, and some testing confirmed it. Anyone else seeing these types of issues, and can suggest any workarounds, solves, hacks to make my life easier? (including an alternative to httpstatus.io...I have and use screaming frog...not as slick, I'm afraid!) Do you think there was a configuration error with the client's TLS 1.3 upgrade, or maybe they're using a problematic/older version of 1.3?? Thanks -
Technical SEO | | TimDickey0 -
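One way to narrow down whether TLS 1.3 itself is the problem is to check which protocol version the server actually negotiates, and whether it still accepts TLS 1.2 at all. A minimal Python sketch using only the standard library, with client-site.example as a hypothetical stand-in for the real domain:

```python
# Report the TLS version the server negotiates by default, then test whether
# a client capped at TLS 1.2 can still connect. "client-site.example" is a
# hypothetical placeholder, not the actual site from the question.
import socket
import ssl

HOST = "client-site.example"

def negotiated_version(context: ssl.SSLContext) -> str:
    with socket.create_connection((HOST, 443), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            return tls.version()

print("default:", negotiated_version(ssl.create_default_context()))  # e.g. "TLSv1.3"

legacy = ssl.create_default_context()
legacy.maximum_version = ssl.TLSVersion.TLSv1_2  # pretend to be an older client
try:
    print("capped at 1.2:", negotiated_version(legacy))
except ssl.SSLError as exc:
    print("TLS 1.2 handshake refused:", exc)
```

If the capped connection is refused while the default one succeeds, the server has likely disabled older protocol versions entirely, which would explain why tools without TLS 1.3 support started failing at the switchover date.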
Updating inbound links vs. 301 redirecting the page they link to
Hi everyone, I'm preparing myself for a website redesign and finding conflicting information about inbound links and 301 redirects. If I have a URL (we'll say website.com/website) that is linked to by outside sources, should I get those outside sources to update their links when I change the URL to website.com/webpage? Or is it just as effective from a link juice perspective to simply 301 redirect the old page to the new page? Are there any other implications to this choice that I may want to consider? Thanks!
Technical SEO | | Liggins0 -
How to fix broken links?
Hi, I use WordPress CMS with the Yoast SEO plugin. I have just found out that my 403 errors increased dramatically. It seems that all the tags below each post are broken for some reason. When I click on the tags I get the following message: **403 Forbidden: Request forbidden by administrative rules.** I assume it has something to do with the configuration within the Yoast SEO plugin. Does anyone know how I should fix that? Thanks, Raviv
Technical SEO | | Indiatravelz0 -
Footer Links with same anchor text on all pages
We have different websites targeted at the different services our company provides (e.g. for our document storage services we have www.ukdocumentstorage.com, and for document management we have www.document-management-solutions.co.uk). If we take the storage site as an example, every single page has a link in the footer to our document management site with the anchor text 'Cleardata Document Management'. SEOmoz is telling me that these are seen as external links (as they are on a different domain), and I'm wondering whether this could be a major factor in the website not ranking highly? How should I rectify this issue?
Technical SEO | | janc0 -
How to find and fix 404 and broken links?
Hi, My campaign is showing me many 404 problems and other tools are also showing me broken links, but the links they show me do work and I can't seem to find the broken links or the cause of the 404s. Can you help?
Technical SEO | | Joseph-Green-SEO0 -
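For spot-checking the URLs a report flags, a short script can confirm what status code each link on a page actually returns. A minimal sketch assuming the third-party requests and beautifulsoup4 packages, with https://www.example.com/ as a hypothetical page to audit:

```python
# Collect the links on one page and print anything that doesn't answer 200.
# The page URL is a hypothetical placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page = "https://www.example.com/"
html = requests.get(page, timeout=10).text
links = {urljoin(page, a["href"]) for a in BeautifulSoup(html, "html.parser").find_all("a", href=True)}

for link in sorted(links):
    try:
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException as exc:
        status = exc.__class__.__name__  # connection errors, timeouts, etc.
    if status != 200:
        print(status, link)
```

A link that comes back 200 here but 404 in a crawl report is worth re-checking before changing anything on the site; the discrepancy is often a redirect, a user-agent block, or an outdated crawl.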
Having www. and non www. links indexed
Hey guys, As the title states, both versions of the website are indexed in Google. How should I proceed, knowing that the client prefers to have the www. version indexed? Please also note that the links on the website are without the www. Here are the steps I have in mind right now: 1) I set the preferred domain in GWMT as the one with www. 2) I 301 redirect any non-www URL to the www. version. What are your thoughts? Should I 301 redirect the URLs, or is setting the preference in GWMT enough? Thanks.
Technical SEO | | BruLee0 -
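Whichever way the preference question gets answered, it's easy to verify that the redirect in step 2 behaves as intended once it's live: each non-www URL should answer with a single 301 pointing at its www counterpart. A minimal sketch using the requests package, with example.com as a hypothetical stand-in for the real domain:

```python
# Check that non-www URLs return a 301 with a Location header on the www host.
# The domain and paths are hypothetical placeholders.
import requests

for url in ("http://example.com/", "https://example.com/", "http://example.com/some-page"):
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(url, resp.status_code, resp.headers.get("Location"))
    # Expected: 301 and a Location such as https://www.example.com/some-page
```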
Drop Down Menu - Link Juice Depletion
Hi, We have a site with 7 top-level sections, all of which contain a large number of subsections, which may then contain further subsections. To try and ensure the best user experience we have a top navigation with the 7 top-level sections and, when hovered, a selection of the key subsections. Although I like this format for the user, as it makes it easier for them to find the most important sections and subsections, it does lead to a lot of links on every page of the site. In general each top section has a drop-down with approx 10-15 subsections. This has therefore led to SEOmoz's tools issuing their 'too many internal links' warning. Alongside this, I am left wondering whether I have too many links to my subsections and whether I would be better off being more selective about when I link to them. For instance, I could choose the top 5 subsections and place a link to them from our homepage, and by doing so I would be passing a greater amount of link juice down the line. So I guess my dilemma is between ensuring the user has as easy a time traversing the site as possible whilst keeping a close watch on where, and how, our link juice is distributed. One solution I am considering is whether nofollow links could be utilised within the drop-down menus? This way I could have the desired user navigation and I would be in greater control of which pages link to which subsections. Would that even work? Any advice would be greatly appreciated, Regards, Guy
Technical SEO | | guycampbell1 -
Add to Cart Link
We have shopping cart links (a href's, not input buttons) that link to a URL along the lines of /cart/add/123&return=/product/123. The SEOMoz site crawls are flagging these as a massive number of 302 redirects, and I also wonder what sort of effect this is having on link juice flowing around the site. I can see several possible solutions: make the links nofollow, make the links input buttons, block /cart/add with robots.txt, make the links 301 instead of 302, or make the links JavaScript (probably worst case). All of these would result in an identical outcome for the UX, but are very different solutions. What would you suggest?
Technical SEO | | Aspedia0
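If the robots.txt option from that list is considered, it's worth confirming that the disallow rule really covers the add-to-cart URLs, query string and all, before relying on it. A minimal sketch using Python's standard urllib.robotparser, with shop.example as a hypothetical stand-in for the store's domain:

```python
# Verify that a "Disallow: /cart/add" rule blocks the add-to-cart URLs.
# The domain is a hypothetical placeholder.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://shop.example/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# Assuming the file contains:
#   User-agent: *
#   Disallow: /cart/add
print(rp.can_fetch("*", "https://shop.example/cart/add/123&return=/product/123"))
# False means compliant crawlers will skip these URLs entirely
```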