Subpages have a Page Authority of 1 while the home page has a DA of 50
-
How is it that our subpages all have a PA of 1 when the home page has a DA of 50?
Technical specifics:
- The mega menu opens on click only
- Category pages don't exist (home/i-do-not-exist-as-page-category/PA-1-subpage)
- All subpages have a large number of links to resources (over 200)
What would be the most obvious cause of the low PA? Would the external link profile be the main reason?
Thanks in advance. I would be happy to answer your questions.
Kind regards
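One quick sanity check for a click-only mega menu is whether its links exist as plain `<a href>` anchors in the raw HTML, since crawlers that don't execute JavaScript can only follow those. A minimal sketch using Python's standard `html.parser`, run against a hypothetical menu snippet (the markup and paths are invented for illustration):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags in raw (non-JS-rendered) HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Hypothetical click-only mega menu: the links are still plain <a href>
# tags in the HTML source, so crawlers can discover them even though
# they are hidden until the user clicks.
html = """
<nav class="megamenu" hidden>
  <a href="/category/widgets">Widgets</a>
  <a href="/category/gadgets">Gadgets</a>
</nav>
"""

parser = LinkCollector()
parser.feed(html)
print(parser.links)
```

If the anchors only appear after JavaScript runs on click, the menu contributes no crawlable internal links, which by itself can starve subpages of link equity.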
-
Hello,
If anybody has an idea, we would much appreciate some input.
Thank you in advance.
-
Hi,
No problem, I am happy to answer them:
The subpages have been live since the launch.
The subpages have been crawled:
Linking root domains: 0
Status: 200
The main issue is duplicate content, as the different language URLs all point to the same page.
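For reference, the usual remedy for same-content language URLs is to annotate each variant with hreflang alternates plus a canonical, so search engines treat them as language versions rather than duplicates. A sketch that generates those tags (the example.com URLs and language set are placeholders, not the actual site):

```python
def hreflang_tags(lang_urls: dict[str, str], self_lang: str) -> str:
    """Build the <head> annotations that mark language variants as
    alternates of one another, with a self-referencing canonical."""
    lines = [f'<link rel="canonical" href="{lang_urls[self_lang]}" />']
    for lang, url in sorted(lang_urls.items()):
        lines.append(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')
    return "\n".join(lines)

# Hypothetical language variants of one page.
urls = {
    "en": "https://example.com/en/widgets",
    "de": "https://example.com/de/widgets",
    "fr": "https://example.com/fr/widgets",
}
print(hreflang_tags(urls, "de"))
```

Each language page would carry its own set of these tags, with the canonical pointing at itself; if the variants are truly identical rather than translated, a single canonical to one preferred URL is the stronger signal.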
-
I have a couple of questions. How long have these subpages and their backlinks been live? When you use the Moz tools, have your backlinks been indexed?
Related Questions
-
Pages not indexed
Hey everyone, despite doing the necessary checks, we have the problem that only part of the sitemap is indexed. We don't understand why this indexation doesn't take place. For a client we have several projects on the website, each with 5 to 6 subpages, but only a few of these subpages are indexed. They should all be indexed. Project: https://www.brody.be/nl/nieuwbouwprojecten/nieuwbouw-eeklo/te-koop-eeklo/ Mainly sub-elements of the page are indexed: https://www.google.be/search?source=hp&ei=gZT1Wv2ANouX6ASC5K-4Bw&q=site%3Abrody.be%2Fnl%2Fnieuwbouwprojecten%2Fnieuwbouw-eeklo%2F&oq=site%3Abrody.be%2Fnl%2Fnieuwbouwprojecten%2Fnieuwbouw-eeklo%2F&gs_l=psy-ab.3...30.11088.0.11726.16.13.1.0.0.0.170.1112.8j3.11.0....0...1c.1.64.psy-ab..4.6.693.0..0j0i131k1.0.p6DjqM3iJY0 Do you have any idea what is going wrong here? Thanks for your advice! Frederik, digital marketeer at Conversal
Technical SEO | | conversal
-
Any idea why pages are not being indexed?
Hi everyone, one section of our website is not being indexed. The product pages are, but some of the subcategories are not. These are very old pages, so we thought it was strange. Here is an example of one: https://www.moregems.com/loose-cut-gemstones/prasiolite-loose-gemstones.html If you take a chunk of text from it, it is not found in Google. No issues in Bing/Yahoo, only Google. Do you think it takes a submission to Search Console? Jeff
Technical SEO | | vetofunk
-
3,511 Pages Indexed and 3,331 Pages Blocked by Robots
Morning, so I checked our site's index status in WMT, and I'm being told that Google is indexing 3,511 pages and robots are blocking 3,331. This seems slightly odd, as we're only disallowing 24 pages in the robots.txt file. In light of this, I have the following queries: Do these figures mean that Google is indexing 3,511 pages and blocking 3,331 other pages, or that it's blocking 3,331 of the 3,511 indexed? As only 24 URLs are disallowed in robots.txt, why are 3,331 pages being blocked? Will these be variations of the URLs we've submitted? Currently we don't have a sitemap. I know, I know, it's pretty unforgivable, but the old one didn't really work and the developers are working on the new one. Once submitted, will this help? I think I know the answer to this, but is there any way to ascertain which pages are being blocked? Thanks in advance! Lewis
Technical SEO | | PeaSoupDigital
-
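On that last point in the robots question above — working out exactly which URLs the robots rules block — the check can be done offline. A small sketch with Python's standard `urllib.robotparser`; the rules and URLs here are invented for illustration, not the site's real robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Invented robots.txt rules for illustration. In practice, load the live
# file with RobotFileParser("https://example.com/robots.txt") + .read().
rules = """User-agent: *
Disallow: /cart/
Disallow: /search
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Run every URL from a sitemap or crawl export through the parser
# to see exactly which ones are disallowed for a given user agent.
urls = [
    "https://example.com/products/widget",
    "https://example.com/cart/checkout",
    "https://example.com/search?q=widget",
]
blocked = [u for u in urls if not rp.can_fetch("Googlebot", u)]
print(blocked)
```

Feeding the full list of indexed URLs from a crawl export through a check like this would show whether the 3,331 blocked pages really are URL variations of the 24 disallowed paths.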
Best practice for author tags: G+ personal or G+ company page?
I work for a company that has a corporate G+ page. I have a personal G+ page. When I write articles for the company blog there are 2 questions that come up: (1) for the rel="author" tag within the blog posting on the company's blog, should I reference my personal G+ page, or the company's G+ page as the author? (2) which G+ page, mine or my company's, should share the link to the blog posting on the company's site? Or should both share it? My goal is to build up author rank for either me or the company I work for (don't care which) so that after a while the Google organic search listing will include the author thumbnail if the article ranks for the search query. I don't care if the thumbnail is me or my company; just trying to figure out how to best link everything to maximize the chance of getting an author thumbnail in the search rankings. Thanks!
Technical SEO | | scanlin
-
Pros and Cons of Rel Author on Product Pages
I've heard that having rel=author enabled on your pages can be great for increasing click through rate but you should not use it on every page on your site. What are the pros and cons of using rel=author on product pages? Do you use rel=author on your product pages or just on your blog articles?
Technical SEO | | Charlessipe
-
Server vs. Authority
We're deciding whether to go for a subdirectory or ccTLD structure. The tradeoff would be a single server location (which can affect local rankings if the server is outside the country) vs. better passing of link authority. Which factor is more important?
Technical SEO | | Tourman
-
How do I fix duplicate content with the home page?
This is probably SEO 101, but I'm unsure what to do here... Last week my weekly crawl diagnostics were off the chart because http:// was not resolving to http://www. I fixed that, but now it's saying I have duplicate content on: http://www.......com and http://www.......com/index.php. How do I fix this? Thanks in advance!
Technical SEO | | jgower
-
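For the home-page duplicate question above, the usual fix is a 301 redirect at the server level plus a canonical tag, but the mapping itself is easy to state. A sketch of the normalization, using example.com as a placeholder since the real domain is redacted in the post:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Collapse common home-page duplicates onto one canonical URL:
    force the www host and fold a trailing /index.php into the root."""
    scheme, host, path, query, frag = urlsplit(url)
    if not host.startswith("www."):
        host = "www." + host
    if path.endswith("/index.php"):
        path = path[: -len("index.php")]
    if path == "":
        path = "/"
    return urlunsplit((scheme, host, path, query, frag))

# Both duplicate variants map to the same canonical form.
print(canonicalize("http://example.com"))
print(canonicalize("http://www.example.com/index.php"))
```

In practice this mapping belongs in the web server or CMS configuration, so that only the canonical variant returns a 200 and every other variant 301-redirects to it.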
Duplicate Page Content and Title for product pages. Is there a way to fix it?
We were doing pretty well with our SEO until we added product listing pages. The errors are mostly Duplicate Page Content/Title, e.g. Title: Masterpet | New Zealand Products (MasterPet product page 1, MasterPet product page 2). Because the list of products is displayed across several pages, the crawler detects that these URLs have the same title. From 0 errors two weeks ago to 14k+ errors. Is this something we could fix, or should bother fixing? Will our SERP ranking suffer because of this? Hoping someone could shed some light on this issue. Thanks.
Technical SEO | | Peter.Huxley59