Are DA and PA still important?
I'm fairly unclear about this concept. A few years ago these metrics seemed quite reliable and I used them to evaluate the quality of a website. But lately I've been puzzled, because some pages with lower DA and PA than my website (the keyword I'm doing SEO for: "Intel Core i9") still rank higher.
I hope someone can answer.
Thanks!
Yenu, I also find that occasionally websites with lower PA and DA rank higher than sites with stronger PA and DA. This certainly backs the point that these are Moz metrics and not Google metrics. That said, Moz professionals (and good SEOs in general) would encourage you not to set goals to increase PA and DA, but rather to set goals that bring the highest value to the website. In the SEO world, that could equate to more pages in the top 3 (or 5, or 10) positions, which is not a Moz metric but comes straight from Google.
I hope this helps!
Zack
Related Questions
A large number of high-spam-score links are negatively affecting my DA. How do I remove them?
I have identified a large number of very high spam score links to "free wallpaper" coming into my site.
Technical SEO | beckygh
I am running a WordPress blog and would like some advice on the best course of action. There are thousands of spam domains linking to various images on my site with the anchor text "get free high quality hd wallpaper". The webmasters for these domains are not contactable, so I am planning to submit a disavow file to Google. I am aware these links have negatively affected my DA, so I would like to do more to remove them. My questions are: will deleting the images they link to help?
As this is a WordPress site, deleting the images will result in a soft 404. Should I force a hard 404 to properly break the link?
Will this positively improve my DA?
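If you do go for a hard break at the server level, one option is to answer those image URLs with an explicit 410 Gone instead of letting WordPress serve a soft 404. A minimal sketch for Apache's mod_rewrite in `.htaccess`; the upload path below is a placeholder, not taken from the question:

```apache
# Return a hard 410 Gone for the deleted wallpaper images.
# /wp-content/uploads/wallpapers/ is a hypothetical path -
# substitute the directory the spam links actually point at.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteRule ^wp-content/uploads/wallpapers/ - [G,L]
</IfModule>
```

The `[G]` flag sends 410 Gone, which tells crawlers the resource was removed deliberately. Note, though, that the disavow file is the part Google actually acts on for the links themselves.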
Why is seomoz.org still in Google's index?
I searched in Google for the number of URLs still indexed on the seomoz.org domain since it changed to moz.com. I am surprised that after all this time more than 15,000 URLs are still indexed: https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=site%3Aseomoz.org%20inurl%3Aseomoz.org If I click on any of the results it redirects (301) to the new domain, so the redirect is working, but Google still keeps these URLs in the index.
Technical SEO | Yosef
What could be the reason? Won't this cause a duplicate content issue on moz.com?
If Google AdWords is implemented via Tag Manager, is it still required to paste the script on the thank-you page?
Hi all experts, I have implemented Google AdWords with Tag Manager. My question now is: is it still required to place the Google AdWords script on the thank-you page?
Technical SEO | varo
Top 10 keywords are still going strong but the rest just got smashed!
Hi SEOmoz and its users! I've been trying to find my answer online, but now, after three weeks of reading blog posts, I'm going to try this. 🙂 My website was ranking really well on 10 important keywords but not so well on the long tail, ranking between 11 and 50 on maybe 30 other, less important keywords.
Technical SEO | Drillo
So I began doing some work (I'm a newbie), and this is what I did:
1. Changed the top navigation structure to get 4 other pages (with keywords as links) into it. (Used a dropdown.)
2. Wrote plenty of text that was a good fit for the page. (The text is OK and not too spammy-looking.)
3. Added three links from high-quality sites, with keywords as anchor text, to these pages. I added them from my own site, which is on the same server, same IP. 😉 I know, not a good look.
4. Changed URL structure on a couple of pages to get a keyword in it. (did a correct 301)
5. Changed to better titles and headings on the pages, with keywords in both but not the same ones. The result:
1. On my 10 most important keywords I began ranking even better. I now rank no. 1 on 9 out of 10.
2. Almost all the other pages went from ranking ~15-50 to below 50. It has now been 4 weeks since I made most of the changes and 3 weeks since all those pages dropped below 50. So now I'm wondering what to do:
1. Should I clean up my text and titles so they don't look over-optimized?
2. Should I remove the links from my own pages? (My link profile in general is actually pretty good.)
3. Or should I just wait, because changing more will just indicate to Google that something fishy is going on 😉? In the beginning I hoped Google had killed my rankings simply because of the big changes. But now, after 3 weeks, I'm more sceptical and think I've been hit by an over-optimization filter. According to Webmaster Tools I've not been hit by a manual penalty. Please help me. I would really appreciate ideas from people here with more experience.
Is the If-Modified-Since HTTP Header still relevant?
I'm relatively new to the technical side of SEO and have been trying to brush up my skills by going through Google's online Webmaster Academy, which suggests that your site should support the If-Modified-Since HTTP header. I checked, and apparently our web server doesn't support it. I've been told by a good colleague that the If-Modified-Since header is no longer relevant, as the spiders will frequently revisit a site as long as you regularly update and refresh the content (which we do). However, our site doesn't seem to have been reindexed for a while, as the cached versions are still showing pages from over a month ago. So two questions really: is the If-Modified-Since HTTP header still relevant, and should I make sure it is supported? And is there anything else I should be doing to make sure the spiders crawl our pages (apart from keeping them nice, fresh and useful)?
Technical SEO | annieplaskett
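For context, If-Modified-Since is the server side of an HTTP conditional GET: the client sends back the Last-Modified date it last saw, and if the resource hasn't changed since then, the server answers 304 Not Modified with no body. A minimal sketch of that server-side logic in Python's standard library (the function name and shape are illustrative, not from any particular framework):

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

def conditional_response(resource_mtime, if_modified_since=None):
    """Return (status, headers) implementing If-Modified-Since handling.

    resource_mtime: timezone-aware UTC datetime of the resource's last change.
    if_modified_since: raw header value from the request, or None.
    """
    headers = {"Last-Modified": format_datetime(resource_mtime, usegmt=True)}
    if if_modified_since:
        try:
            since = parsedate_to_datetime(if_modified_since)
        except (TypeError, ValueError):
            return 200, headers  # unparseable date: serve the full response
        # HTTP dates have whole-second precision, so drop microseconds first.
        if resource_mtime.replace(microsecond=0) <= since:
            return 304, headers  # not modified: client may reuse its copy
    return 200, headers
```

The practical upshot for crawling: honouring If-Modified-Since mainly saves bandwidth on recrawls; it is not what decides whether your pages get reindexed.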
Why Do Transparent Link Networks Still Work?
Hi Mozzers, my client has a major competitor that dominates several industry head terms. A check of their link profile reveals that they have 50 low-DA domains that are identical to the main site, the only difference being that they all link to the main domain for these terms. They're not even attempting to disguise the network, but it works. Can anyone tell me why? See: www.omega.com/vhpc/
Technical SEO | waynekolenchuk
Advice please on importing content to keep a page fresh
Hi, I am working on a site at the moment, http://www.cheapflightsgatwick.com, which is a travel news and holiday news site, and I am trying to work out the best approach for the following section: http://www.cheapflightsgatwick.com/humberside-airport. What I am thinking of doing, to keep Google coming back to this section of the site, is to import content: that is, to pull in stories for keywords such as "Humberside Airport" and have them appear on this page alongside my own writing. I am considering this because it would help keep the page fresh without my original content becoming too old, but I am not sure it is the right thing to do. I am concerned that my rankings might drop, both because I am importing content and because people will have to leave my site to read the rest of each story. I am also worried that Google will penalize the page for duplicate content. Can anyone let me know what I should do: stick to original content only, or import as well? And if I should import via Google News, how would I do this? Many thanks.
Technical SEO | ClaireH-184886
301ed Pages Still Showing as Duplicate Content in GWMT
I thank anyone reading this for their consideration and time. We are a large site with millions of URLs for our product pages. We are also a textbook company, so by nature our products have two separate ISBNs: a 10-digit and a 13-digit form. Thus, every one of our books has at least two pages (a 10-digit and a 13-digit ISBN page). My issue is that we have established a 301 for all the 10-digit URLs so they automatically redirect to the 13-digit page. This fix has been in place for months. However, Google still reports that it is detecting thousands of pages with duplicate title and meta tags, and it is referring to the very URLs that I already 301ed to the canonical version many months ago! Is there anything I can do to fix this issue? I don't understand what I am doing wrong. Example:
Technical SEO | dfinn
http://www.bookbyte.com/product.aspx?isbn=9780321676672
http://www.bookbyte.com/product.aspx?isbn=032167667X
As you can see, the 10-digit ISBN page 301s to the 13-digit canonical version. Google reports that it has detected duplicate title and meta tags between the two pages, and there are thousands of these duplicate pairs listed. To add some further context: the ISBN is just a parameter that lets us serve content when someone searches for a product by its 10- or 13-digit ISBN. The 13-digit version of the page is the only physical page that exists; the 10-digit one is only part of the virtual URL structure of the website. This is why I cannot simply change the title and meta tags of the 10-digit pages: they only exist in the sense that the URL redirects to the 13-digit version. Also, we submit a sitemap every day of all the 13-digit pages, so Google knows exactly what our physical URL structure is. I have submitted this question to the GWMT forums and received no replies.
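As a side note on the two URL forms: a 10-digit ISBN maps deterministically to its 13-digit form (prefix 978, drop the old check digit, recompute a new one), which is presumably what drives the 301 rule. A small sketch of that conversion in Python (the function name is illustrative), using the ISBNs from the example URLs above:

```python
def isbn10_to_isbn13(isbn10: str) -> str:
    """Convert a 10-digit ISBN to its 13-digit form.

    Prefix '978' and drop the ISBN-10 check digit (which may be 'X');
    the ISBN-13 check digit weights the 12 digits 1,3,1,3,... and
    brings the total to a multiple of 10.
    """
    core = "978" + isbn10[:9]  # the old check digit is discarded
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(core))
    return core + str((10 - total % 10) % 10)
```

For the example pair above, `isbn10_to_isbn13("032167667X")` returns `"9780321676672"`, matching the canonical 13-digit URL's parameter.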