Site not passing page authority
-
Hi,
This site, powertoolworld.co.uk, is not passing page authority. In fact, every page shows no links unless it has a link from an external source.
Originally this site blocked Roger from crawling it, but that block was lifted over six months ago. I also ran a crawl test last night and it shows the same thing: a PA of 1 and no links.
I would like to point out that the problem seems to be the same for all sites on the same platform, which points me in the direction of code. For example, there is a `display: none` rule in the CSS used to style the area where the sidebar links sit. It's a Bluepark platform.
What could be causing the problem?
Thanks in advance.
EDIT
Turns out that blocking the ezooms crawler was what stopped the site from being included in the index.
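For anyone hitting the same thing: you can check whether a robots.txt rule is what's keeping a crawler out with Python's standard-library robot parser. The rules below are a hypothetical example of the kind of block described above, not Power Tool World's actual robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules of the kind that would block ezooms
# (the bot that fed SEOmoz's link index at the time) while leaving
# other crawlers alone.
rules = [
    "User-agent: ezooms",
    "Disallow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# ezooms is shut out of every page...
print(rp.can_fetch("ezooms", "http://www.powertoolworld.co.uk/"))        # False
# ...while a crawler not named in the rules is still allowed in.
print(rp.can_fetch("othercrawler", "http://www.powertoolworld.co.uk/"))  # True
```

Running that against your live file (`rp.set_url(...)` plus `rp.read()`) is a quick sanity check that an old block really has been lifted.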
-
Hi there Kyle,
Thanks for writing in! Sorry for the delay. I was able to take a look at the data from the tests I ran and at your crawl. From what I can tell, these are the things I noticed:
- Our crawler is pulling up your internal links. If you export your crawl diagnostics to CSV from the "Crawl Diagnostics" tab, you will find that we are pulling your internal links; they are under "link count."
- The internal links count displays 0 because that data is based on our Mozscape index, so pages we have not indexed yet will display "0" links. That can be misleading: if we have no data for a page, then technically we didn't crawl its links.
- Our Mozscape index did find some of your top pages; you can check them out here: www.opensiteexplorer.org/pages?site=www.powertoolworld.co.uk%2F
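A quick way to see which pages fall into that "0 links because not yet indexed" bucket is to tally the CSV export mentioned above. This is a sketch against an invented, in-memory sample; the header names (`url`, `link count`) are assumptions, so adjust them to match your actual export:

```python
import csv
import io

# A stand-in for the Crawl Diagnostics CSV export described above.
# The real export has more columns; "url" and "link count" are the
# two the reply refers to (header names here are assumed, not
# copied from an actual export).
sample_export = """url,link count
http://www.powertoolworld.co.uk/,57
http://www.powertoolworld.co.uk/drills,43
http://www.powertoolworld.co.uk/new-page,0
"""

rows = list(csv.DictReader(io.StringIO(sample_export)))

# Pages showing 0 links are the ones the Mozscape index hasn't
# picked up yet, per the explanation above.
unindexed = [r["url"] for r in rows if int(r["link count"]) == 0]
print(unindexed)  # ['http://www.powertoolworld.co.uk/new-page']
```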
In terms of why your pages aren't passing authority, I don't have a straight answer. Since Google can index more of your pages than we can, it is looking at a broader picture than OSE, so the metrics in OSE should be used as a secondary stat rather than as your primary source.
I hope that helps! Let me know if you have any questions here or in our help ticket.
Best,
Peter
SEOmoz Help Team
-
I've got it in the support queue but don't have a firm answer yet. I suspect it would require a data update - those are getting faster (about 2-3 weeks).
-
Great.
I'm hoping it's a Moz issue rather than a site issue.
If it is a Moz issue, I'm assuming this won't update until the next Linkscape update?
Thanks
-
Ah, got it - you've got PA on a few pages, but that's it. Yeah, that definitely seems wrong. Let me ping support and see if we can get any answers.
-
Hey Peter,
Thanks for looking into this.
I'm checking the www version.
There is page authority on around 6 pages in total. All have external links.
I've checked all of that too and everything looks normal. The CSS `display: none` thing is maybe just clutching at straws.
-
I'm not seeing any issues in the source code, and Xenu (desktop crawler) is seeing the internal links. You've got 22K+ pages indexed in Google, and the cached version looks normal (no cloaking or other oddities).
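That matches how crawlers work: a desktop crawler like Xenu only parses the HTML, so links inside a `display: none` container are collected exactly like visible ones; the CSS never enters into it. A minimal stdlib sketch of that behaviour (the markup is invented for illustration):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href, exactly as a simple crawler would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Invented markup: a sidebar hidden with display:none, like the
# Bluepark styling mentioned earlier in the thread.
html = """
<div class="sidebar" style="display: none">
  <a href="/drills">Drills</a>
  <a href="/sanders">Sanders</a>
</div>
<a href="/contact">Contact</a>
"""

collector = LinkCollector()
collector.feed(html)
print(collector.links)  # ['/drills', '/sanders', '/contact'] - hidden links are still found
```

So the `display: none` rule really can be ruled out as the cause here.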
Are you checking the "www" or non-www version? I notice you redirect to "www", so some of our tools may give you odd stats on the non-canonical version. I'm seeing a PA of 39 in Open Site Explorer, though.
Let me know where you're looking, and maybe I can get Support to take a peek. It is possible something happened with blocking our bots in the past (I'm not sure how often we re-check that).