Why is our PageRank still only 3/10?
-
Hi,
Our site https://soundbetter.com has been live for 2 years now, and we still haven't been able to get our PageRank above 3/10.
We have thousands of unique pages and plenty of original contextual content. We avoid duplicate content as best we can, follow Google's best practices for site structure, deal with any issues that come up in Webmaster Tools, have schema.org markup, avoid link spamming, have inbound links from authority sites (though OSE doesn't show most of them for some reason), and get lots of social shares to our pages. The domain has been owned by us for 12 years.
Any thoughts on why we would still have a PR of 3?
Thanks for helping
-
PageRank won't update again; what you have now is what you'll have (probably forever). The good thing is that PageRank doesn't count anymore. Your low DA/PA is down to having few links, or low-quality ones. If you want to increase it, you'll need more links (from high-PA pages, of course).
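For intuition on why links from well-linked pages matter more than raw link counts, here is a minimal toy sketch of the classic PageRank iteration in Python. The link graph is made up for illustration, and this is neither Google's nor Moz's actual implementation.

damping = 0.85

# Hypothetical link graph: each key links out to the pages in its list.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # iterate until the scores settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)  # link equity split across outlinks
        for target in outlinks:
            new_rank[target] += damping * share
    rank = new_rank

print(rank)

C ends up highest (it is linked from several pages), A is close behind on a single link from the strongest page, and D, with no inbound links, stays at the baseline. That is the point above: both the number of links and the strength of the pages they come from feed the score.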
-
Shaq, the reason it is not higher is that it hasn't been updated at all since 2013.
Regarding DA, check again the http://moz.com/learn/seo/domain-authority link that I have provided in my previous answer.
Moz clearly states that you cannot directly influence this metric, or that it is at least a hard task to do so (check the SEO best practices section at the bottom of the article).
Quote:
"Unlike other SEO metrics, Domain Authority is difficult to influence directly. It is made up of an aggregate of metrics (MozRank, MozTrust, link profile, and more) that each have an impact on this score. This was done intentionally; this metric is meant to approximate how competitive a given site is in Google.com. Since Google takes a lot of factors into account, a metric that tries to calculate it must incorporate a lot of factors, as well.The best way to influence this metric is to improve your overall SEO. In particular, you should focus on your link profile—which influences MozRank and MozTrust—by getting more links from other well-linked-to pages."
-
Thanks Alick, Moosa and Keszi,
We realize PR is no longer as important a factor as it used to be, but it still is a factor and represents something, and we can't figure out why it's not higher.
We look at domain authority in OSE as well and it's 29, also lower than we'd expect it to be.
-
As Alick300 has mentioned, PageRank is a metric that we do not use anymore.
I would advise you to check the metrics that Moz uses: PA, DA, mR and DmR, and in case you are a Pro member, you can also check mT and DmT:
- PA: Page Authority: http://moz.com/learn/seo/page-authority
- DA: Domain Authority: http://moz.com/learn/seo/domain-authority
- mR & DmR: MozRank and Domain MozRank: http://moz.com/learn/seo/mozrank
- mT & DmT: MozTrust and domain-level MozTrust: http://moz.com/learn/seo/moztrust
- Spam Score: http://moz.com/blog/spam-score-mozs-new-metric-to-measure-penalization-risk
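If you want to pull these metrics programmatically rather than through OSE, a rough sketch against the legacy Mozscape URL Metrics endpoint might look like the following. This is a sketch only: the credentials are placeholders, and the Cols bit flags and response field names are from memory, so verify them against the current Moz API documentation.

import base64, hashlib, hmac, time, urllib.parse, urllib.request

ACCESS_ID = "member-xxxxxxxx"    # placeholder Moz API credentials
SECRET_KEY = "your-secret-key"

def url_metrics(target_url):
    # Signed request per the legacy Mozscape scheme: HMAC-SHA1 over "AccessID\nExpires"
    expires = int(time.time()) + 300
    to_sign = f"{ACCESS_ID}\n{expires}".encode()
    signature = base64.b64encode(
        hmac.new(SECRET_KEY.encode(), to_sign, hashlib.sha1).digest()
    ).decode()

    # Cols is a bitmask of requested metrics; 34359738368 (Page Authority) and
    # 68719476736 (Domain Authority) are the flags as I recall them.
    params = urllib.parse.urlencode({
        "Cols": 34359738368 + 68719476736,
        "AccessID": ACCESS_ID,
        "Expires": expires,
        "Signature": signature,
    })
    endpoint = "https://lsapi.seomoz.com/linkscape/url-metrics/"
    target = urllib.parse.quote(target_url, safe="")
    with urllib.request.urlopen(f"{endpoint}{target}?{params}") as resp:
        return resp.read().decode()  # JSON with fields such as "upa" (PA) and "pda" (DA)

print(url_metrics("https://soundbetter.com"))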
Be aware that these are only metrics, and in some cases increasing them doesn't necessarily mean that you will rank better: correlation <> causation (check the Spam Score article by Rand that I linked above).
You could also check this Whiteboard Friday: http://moz.com/blog/understanding-and-applying-mozs-spam-score-metric-whiteboard-friday
I hope this will be helpful to you.
Keszi
-
As far as my understanding goes, Domain Authority counts the number of links coming to your website and not really the quality of those links, so if you want to increase your DA you have to look not only at quality but at quantity as well.
Hope this helps!
-
Hi,
PageRank was last updated in December 2013, and John Mueller says "we're probably not going to be updating it going forward".
So I would suggest that you check MozRank (Moz's version of PR) instead of PR, along with PA and DA.
Thanks
Related Questions
-
Organization Schema/Microformat for a content/brand website | Travel
Hi, one of our clients has a website specific to a place (e.g. California tourism) on which they publish local information related to tourism, blogs and other useful content. I want to understand how useful it is to publish Organization schema on such a website mentioning the actual organization, which in this case is a travel agency. Or would some other schema fit such websites better?
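For reference, organization markup of this kind is usually published as JSON-LD, and schema.org has a TravelAgency type (a subtype of LocalBusiness, which is itself an Organization). A minimal sketch follows; the name and URLs are made up for illustration.

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "TravelAgency",
  "name": "Example Travel Co.",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png",
  "sameAs": [
    "https://www.facebook.com/exampletravel",
    "https://twitter.com/exampletravel"
  ]
}
</script>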
Intermediate & Advanced SEO | ds9.tech
-
What does Disallow: /french-wines/?* actually do - robots.txt
Hello Mozzers - Just wondering what this robots.txt instruction means: Disallow: /french-wines/?* Does it stop Googlebot crawling and indexing URLs in that "French Wines" folder - specifically the URLs that include a question mark? Would it stop the crawling of deeper folders - e.g. /french-wines/rhone-region/ that include a question mark in their URL? I think this has been done to block URLs containing query strings. Thanks, Luke
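For context, Googlebot treats * in robots.txt as a wildcard and matches each Disallow rule as a URL-path prefix, so the two patterns in this illustrative robots.txt behave differently. Note also that robots.txt controls crawling, not indexing as such.

User-agent: *
# Blocks any URL that begins with /french-wines/? (a query string directly on
# the folder URL), e.g. /french-wines/?page=2 - but NOT /french-wines/rhone-region/?sort=price
Disallow: /french-wines/?*
# Blocks query strings at any depth under the folder,
# e.g. /french-wines/rhone-region/?sort=price
Disallow: /french-wines/*?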
Intermediate & Advanced SEO | McTaggart
-
Question about Indexing of /?limit=all
Hi, I've got your SEO Suite Ultimate installed on my site (www.customlogocases.com). I've got a relatively new Magento site (around 1 year old). We have recently been doing some PR/SEO for the category pages, for example /custom-ipad-cases/. But when I search on Google, it seems that Google has indexed /custom-ipad-cases/?limit=all. This /?limit=all page is one without any links and only has a PA of 1, whereas the standard /custom-ipad-cases/ without the /? query has a much higher PA of 20 and a couple of links pointing towards it. So I would want the latter to be the page that Google indexes, and by the same logic it really should be able to achieve higher rankings than the /?limit=all page. Is my thinking here correct? Should I disallow all the /? URLs now, even though those are the ones that are indexed and the others currently are not? I'd be happy to take the hit while Google figures it out, because the higher-PA pages are what I'm ultimately getting links to... Thoughts?
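One common way to handle this kind of parameter duplicate is a canonical tag on the parameter variants rather than a blanket robots.txt disallow, since a disallow stops Google from recrawling those URLs and seeing any signals on them. A sketch, assuming /custom-ipad-cases/ is the version you want indexed:

<!-- On /custom-ipad-cases/?limit=all (and other parameter variants),
     point search engines at the preferred URL -->
<link rel="canonical" href="https://www.customlogocases.com/custom-ipad-cases/" />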
Intermediate & Advanced SEO | RobAus
-
How much does dirty HTML/CSS etc. impact SEO?
Good morning! I have been trying to clean up this website, and half the time I can't even edit our content without breaking the WYSIWYG editor. Which leads me to the next question: how much, if at all, is this impacting our SEO? To my knowledge this isn't directly causing any broken pages for the viewer, but it certainly concerns me. I found this post on Moz from last year: http://moz.com/community/q/how-much-impact-does-bad-html-coding-really-have-on-seo We have a slightly different set of code problems, but I still wanted to revisit this question and see if anything has changed. I also can't imagine that all this broken/extra code is helping our pages load properly. Thanks everybody!
Intermediate & Advanced SEO | HashtagHustler
-
301 redirect with /? in URL
For a WordPress site that has the trailing / in the URL with a ? after it, how can you do a 301 redirect that strips off anything after the /? For example, how do you take this URL: domain.com/article-name/?utm_source=feedburner and 301 it to this URL: domain.com/article-name/ Thank you for the help
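One way to do this, assuming the site runs on Apache and you can edit .htaccess (on nginx or with a WordPress redirect plugin the mechanics differ), is a mod_rewrite rule like this sketch. Note it strips every query string site-wide, including ones WordPress itself uses (such as ?s= for search), so scope it carefully.

# If the request carries any query string, 301 to the same path without it.
# The trailing "?" on the target clears the query string; on Apache 2.4+
# you can use the QSD flag instead.
RewriteEngine On
RewriteCond %{QUERY_STRING} .
RewriteRule ^(.*)$ /$1? [R=301,L]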
Intermediate & Advanced SEO | COEDMediaGroup
-
Article Marketing / Article Posting
I am working on the SEO of a few different websites and I have built out an article marketing campaign so that I can get high-quality backlinks for my website. I have been writing the content myself and manually building out the top Web 2.0, article directory, and doc-sharing sites. Today I was creating an account on Squidoo and I wondered if it mattered whether I had the username be one of two things: (1) my keyword as a username, like [keyword+geotag], for example roofinghouston, or (2) just my first and last name as the username (or just a username I always use). (The reason behind #1 would be to have the optimized keyword and location I am trying to rank for inside the username. The reason for #2 would be that I don't want to get into trouble by having "too much" optimization.) I know a bit about optimization and that getting your keyword out there is great in a lot of areas, but I am not sure if it looks "suspicious" if I have my username be the keyword+geotag. I am just worried that all of this hard work will be torn down if I look like I'm trying too hard to be optimized. There is no one answer; I am mainly looking for shared experiences. If you do have a definite answer, then I would like that too 🙂 Thanks SEOmoz!
Intermediate & Advanced SEO | SEOWizards
-
Robots.txt: Can you put a /* wildcard in the middle of a URL?
We have noticed that Google is indexing the language/country directory versions of directories we have disallowed in our robots.txt. For example: Disallow: /images/ is blocked just fine. However, once you add our /en/uk/ directory in front of it, there are dozens of pages indexed. The question is: can I put a wildcard in the middle of the string, e.g. /en/*/images/, or do I need to list out every single country for every language in the robots file? Anyone know of any workarounds?
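For what it's worth, Google's robots.txt handling supports * anywhere in the path, so a single pattern like the one in this illustrative robots.txt should cover every language/country prefix. Test it with the robots.txt tester in Search Console before relying on it.

User-agent: *
# Blocks /images/ at the root...
Disallow: /images/
# ...and /images/ under any language/country prefix, e.g. /en/uk/images/ or /fr/ca/images/
Disallow: /*/images/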
Intermediate & Advanced SEO | IHSwebsite
-
Disallowed Pages Still Showing Up in Google Index. What do we do?
We recently disallowed a wide variety of pages for www.udemy.com which we do not want Google indexing (e.g., /tags or /lectures). Basically we don't want to spread our link juice around to all these pages that are never going to rank. We want to keep it focused on our core pages, which are for our courses. We've added them as disallows in robots.txt, but after 2-3 weeks Google is still showing them in its index. When we look up "site:udemy.com", for example, Google currently shows ~650,000 pages indexed, when really it should only be showing ~5,000 pages indexed. As another example, if you search for "site:udemy.com/tag", Google shows 129,000 results. We've definitely added "/tag" into our robots.txt properly, so this should not be happening; Google should be showing 0 results. Any ideas re: how we get Google to pay attention and re-index our site properly?
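For context on why this happens: robots.txt only blocks crawling. URLs Google already knows about can stay in the index, and a disallowed page can't even be recrawled to pick up a noindex. A common pattern (a sketch, not specific advice for udemy.com) is to temporarily allow crawling of those sections and serve a noindex instead, then reinstate the disallow once they have dropped out of the index. For non-HTML responses the equivalent is an X-Robots-Tag: noindex HTTP header.

<!-- On each /tag or /lectures page you want removed from the index
     (crawling must be allowed so Googlebot can actually see this tag): -->
<meta name="robots" content="noindex, follow">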
Intermediate & Advanced SEO | udemy