Site not passing page authority
-
Hi,
This site, powertoolworld.co.uk, is not passing page authority. In fact, every page shows no links unless it has a link from an external source.
Originally this site blocked Roger from crawling it, but that block was lifted over six months ago. I also ran a crawl test last night and it shows the same thing: a PA of 1 and no links.
I would like to point out that the problem seems to be the same for all sites on the same platform, which points me in the direction of code. For example, there is a display: none rule in the CSS that is used to style the area where the sidebar links are. It's a Blue Park platform.
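For what it's worth, one quick way to test whether hidden containers are actually wrapping links is to scan the page source for anchors inside inline-hidden elements. This is only a rough sketch using Python's stdlib html.parser: it only catches inline style attributes (not class-based CSS rules like the ones in question), and the sample markup is made up for illustration:

```python
from html.parser import HTMLParser

class HiddenLinkFinder(HTMLParser):
    """Collect hrefs that appear inside an inline display:none container.

    Sketch only: ignores class-based CSS and void elements, which would
    need a full stylesheet resolver to handle properly.
    """
    def __init__(self):
        super().__init__()
        self.hidden_depth = 0   # > 0 while inside a hidden element
        self.hidden_links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        style = attrs.get("style", "").replace(" ", "")
        # Enter (or stay inside) a hidden subtree
        if "display:none" in style or self.hidden_depth:
            self.hidden_depth += 1
        if tag == "a" and self.hidden_depth and "href" in attrs:
            self.hidden_links.append(attrs["href"])

    def handle_endtag(self, tag):
        if self.hidden_depth:
            self.hidden_depth -= 1

# Hypothetical markup resembling a CSS-hidden sidebar
html = ('<div style="display: none"><a href="/sidebar-link">Hidden</a></div>'
        '<a href="/visible">OK</a>')
finder = HiddenLinkFinder()
finder.feed(html)
print(finder.hidden_links)
```

A real check would need to resolve class-based rules against the stylesheet, so treat this as a starting point only.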
What could be causing the problem?
Thanks in advance.
EDIT
Turns out that blocking the ezooms crawler stopped it from being included.
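For anyone hitting the same thing: you can confirm what a given robots.txt actually blocks with Python's stdlib urllib.robotparser. The rules below are illustrative, recreating a sitewide block on the ezooms user agent mentioned above:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: blocks the "ezooms" crawler sitewide,
# allows everything else
robots_txt = """\
User-agent: ezooms
Disallow: /

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("ezooms", "https://www.powertoolworld.co.uk/"))     # False
print(parser.can_fetch("Googlebot", "https://www.powertoolworld.co.uk/"))  # True
```

Running a check like this against the live robots.txt would have surfaced the block before waiting on an index update.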
-
Hi there Kyle,
Thanks for writing in! Sorry for the delay. I was able to take a look at the data from the tests I ran and at your crawl. From what I can tell, these are the things I noticed:
- Our crawler is pulling up your internal links. If you export your crawl diagnostics to CSV from the "Crawl Diagnostics" tab, you will find that we are pulling your internal links; they are listed under "link count."
- The internal links count displays 0 because that data is based on our Mozscape index, so pages that we have not indexed yet will show "0" links. That can be misleading: if we have no data for a page, then technically we didn't crawl its links.
- Our Mozscape index did find some of your higher-level pages; you can check them out here: www.opensiteexplorer.org/pages?site=www.powertoolworld.co.uk%2F
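If it helps, the CSV check from the first bullet can be scripted. This is a minimal sketch: the "link count" column name is from the export described above, but the other columns and the sample rows are made up for illustration:

```python
import csv
import io

# Hypothetical excerpt of a crawl-diagnostics CSV export
csv_data = """\
url,http status,link count
https://www.powertoolworld.co.uk/,200,48
https://www.powertoolworld.co.uk/drills,200,35
https://www.powertoolworld.co.uk/sanders,200,0
"""

rows = list(csv.DictReader(io.StringIO(csv_data)))
# Pages where the crawl recorded no links at all
no_links = [r["url"] for r in rows if int(r["link count"]) == 0]
print(no_links)
```

Filtering on the "link count" column this way quickly separates pages the crawler genuinely saw links on from pages with no data yet.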
As for why your pages aren't passing authority, I don't have a straight answer. Since Google can index more of your pages than we can, it is looking at a broader picture than OSE, so the metrics in OSE should be used as a secondary stat rather than as your primary source.
I hope that helps! Let me know if you have any questions here or in our help ticket.
Best,
Peter
SEOmoz Help Team.
-
I've got it in the support queue but don't have a firm answer yet. I suspect it would require a data update - those are getting faster (about 2-3 weeks).
-
Great.
I'm hoping it's a Moz issue rather than a site issue.
If it is a Moz issue, I'm assuming this won't update until the next Linkscape update?
Thanks
-
Ah, got it - you've got PA on a few pages, but that's it. Yeah, that definitely seems wrong. Let me ping support and see if we can get any answers.
-
Hey Peter,
Thanks for looking into this.
I'm checking the www version.
There is page authority on around six pages in total. All have external links.
I've checked all of that too and everything looks normal. The CSS display: none thing may just be clutching at straws.
-
I'm not seeing any issues in the source code, and Xenu (desktop crawler) is seeing the internal links. You've got 22K+ pages indexed in Google, and the cached version looks normal (no cloaking or other oddities).
Are you checking the "www" or the non-www version? I notice you redirect to "www", so some of our tools may give you odd stats on the non-canonical version. I'm seeing a PA of 39 in Open Site Explorer, though.
Let me know where you're looking, and maybe I can get Support to take a peek. It is possible something happened with blocking our bots in the past (I'm not sure how often we re-check that).
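As a side note, when comparing metrics across tools it can help to normalize every URL to the canonical host first. Here's a minimal sketch with Python's stdlib urllib.parse, assuming the www version is canonical as the redirect suggests; the helper name is just illustrative:

```python
from urllib.parse import urlparse, urlunparse

def canonicalize(url: str, canonical_host: str = "www.powertoolworld.co.uk") -> str:
    """Rewrite a bare-domain URL onto the canonical www host.

    Assumes the site 301-redirects non-www to www, as described above.
    """
    parts = urlparse(url)
    if parts.netloc == canonical_host.removeprefix("www."):
        parts = parts._replace(netloc=canonical_host)
    return urlunparse(parts)

print(canonicalize("http://powertoolworld.co.uk/drills"))
# http://www.powertoolworld.co.uk/drills
print(canonicalize("http://www.powertoolworld.co.uk/drills"))  # unchanged
```

Feeding the canonicalized URL into each tool avoids mixing stats between the www and non-www versions.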