Internal Linking Conundrum
-
Hello
I have a website with a three-level menu structure:
Top Level (Single Item) > First Level (Single Item) > Second Level (Multiple Items)
The first-level pages do not rank well, while the top-level and second-level pages rank well.
Search Console acknowledges 600 internal links to the top level.
600 to the second level.
But only 100 to the first level.
This is true across all 6 top-level items, so it isn't down to page-level links; Search Console just isn't acknowledging all the links to the first level.
Does anyone know why this might be?
Thanks for any assistance you can give me.
-
The number of internal links pointing to a page signals the level of importance you place on that page within your domain (see: https://support.google.com/webmasters/answer/138752?hl=en).
That's the lesser point here, though. There's a bigger issue I need to resolve: my most important pages are the ones Google perceives to have far fewer internal links, and those pages are not being shown in the SERPs; the lower-level pages are showing instead.
I will proceed with checking the JavaScript and robots.txt to see if I can get this crawl issue sorted.
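As a starting point for that check, a quick sanity test is whether the first-level links actually appear in the raw (un-rendered) HTML, and whether robots.txt allows them to be fetched. Here's a minimal sketch using only the Python standard library; the HTML snippet, URL, and robots.txt rules are placeholders, not your real site.

```python
# Sketch: (1) are the first-level nav links present in the raw page source,
# or only injected later by JavaScript? (2) does robots.txt allow them?
# All URLs and markup below are hypothetical examples.
from html.parser import HTMLParser
from urllib import robotparser


class LinkCollector(HTMLParser):
    """Collect every href found in anchor tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


# In practice this would be the fetched page source (e.g. via urllib.request).
raw_html = '<nav><a href="/first-level/">First level</a></nav>'
collector = LinkCollector()
collector.feed(raw_html)

# True means the link exists in the raw HTML; False would suggest it is
# JS-injected and may not be seen consistently by crawlers.
print("/first-level/" in collector.links)

# Check a placeholder robots.txt: is the first-level path allowed?
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /private/"])
print(rp.can_fetch("Googlebot", "https://example.com/first-level/"))
```

Comparing the link list from the raw HTML against what a rendering crawler (like Screaming Frog with JavaScript rendering enabled) reports would show whether the discrepancy comes from JS-built navigation.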
-
You need to fact-check the statement that 'Google rates the importance of pages by the number of internal links they have'. That's not true anymore; the raw number of links hasn't been a ranking factor for about six years. What matters is how people are using your site. Search Console is also notorious for getting this stuff wrong.
Perhaps try a crawl with Screaming Frog and you'll have more success. Or maybe someone else can shed some light?
-
I'm going to be checking the JavaScript and robots.txt to see if there's any reason the links aren't being counted, but if anyone else has any insight it would be most welcome.
-
Thanks for the response but that's not really what I'm asking.
Search console is not registering the internal links to the middle level of menu items.
It is saying that the lower pages have many more internal links, which is not true. Google rates the importance of pages by the number of internal links they have.
-
Could it be that the top-level pages are much more competitive than their sub-levels? Top-level pages usually target more generic keywords with massive volume and a whole lot more competition. It could also be worth using visitor-recording software like Hotjar to see how people navigate through your site.
I have tried the (now almost defunct) practice of 'PageRank sculpting', using my power pages to pass authority to my less powerful or newer pages to help them rank. What I've found is that it's not just having the link present; it's people actually using the link and navigating from the powerful page to the newer or less powerful one that gives it the ability to rank.
To give you an example: I run a dental practice and put a link from my very authoritative veneers page to my much newer and less authoritative dental implants page. But when I looked at the stats in Analytics and the recordings in Hotjar, I saw that nobody ever clicked the link, because those customers didn't want that product.
However, when I did the same thing for my teeth whitening page, it shot up the rankings, because people were actually leaving the veneers page to look at whitening, since it's a related product.
So I guess what I'm trying to say is that it's not just the presence of the link but people using it, and that user-driven feedback, that seems to pass ranking power. I took out all the lesser-used links and saw really great results.
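The "keep only the links people actually use" approach above can be sketched as a simple filter over click data. The link labels, click counts, and threshold below are all made up for illustration; real numbers would come from an Analytics or Hotjar export.

```python
# Sketch: decide which internal links to keep based on how often visitors
# actually click them. All data here is hypothetical example data.
click_counts = {
    "/veneers/ -> /teeth-whitening/": 412,   # related product, well used
    "/veneers/ -> /dental-implants/": 3,     # almost never clicked
    "/veneers/ -> /contact/": 178,
}

MIN_CLICKS = 20  # arbitrary threshold chosen for this illustration

links_to_keep = [link for link, clicks in click_counts.items() if clicks >= MIN_CLICKS]
links_to_remove = [link for link, clicks in click_counts.items() if clicks < MIN_CLICKS]

print("keep:", links_to_keep)
print("remove:", links_to_remove)
```

The threshold would need tuning per site; the point is that the pruning decision is driven by observed behaviour rather than by guessing which links look useful.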
Perhaps this is not the answer you're looking for, but if you could give more specifics about the terms, keywords, and topics, then maybe we could help some more. Navigation is something I really took a while to get my head around, and even longer to make work effectively for ranking for my main topics.
Also remember that the longer-tail sub-categories usually carry more commercial intent: ranking for 'Composite Veneers Cost' in my case makes me 10x more money for my practice than ranking for 'Veneers', where people just want to research and look - and sometimes they're not even looking for dental work but wooden floors and kitchen veneers!