Too Many On-Page Links
-
Most of my pages have "Too Many On-Page Links".
If you view the website you will see this is mainly down to the top navigation drop-down menu: http://www.cwesolutions.co.uk
So if I wanted to reduce the number of links, I would have to use category links with landing pages.
How much does having "Too Many On-Page Links" affect my website ranking? Is it really important, and would I notice a difference if I changed it?
-
If a report is alerting that there are too many links on a page, it is going to be warning of authority leakage more than anything else.
Granted, the insertion of 'just' in my previous statement probably misleads...
I agree though that if you were to link every page to every page, it would just cause a royal headache for crawlers - why one would do that, though, is beyond me.
-
Sorry Pete, that was a typo where I meant to say 'internal linkage is classed differently from external linkage' (I've amended my reply).
-
To say "linking just dilutes the authority from the page" is not strictly true.
In this particular instance I do not believe that the menu is too OTT in terms of links, but if you have every page linking to every page, you end up creating a mesh of links that search engines can waste crawl time trying to decipher.
Too many links with poor structure can mean that your crawl allocation is wasted, which results in fewer of your pages being indexed properly.
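To put a rough picture on that crawl-allocation point, here is a toy Python sketch (the site layout and the budget figure are invented purely for illustration, not real crawler behaviour): a crawler with a fixed fetch budget simply never reaches pages beyond that budget, so deeper pages in a large or messy structure miss out.

```python
from collections import deque

def crawl(start, links, budget):
    """Breadth-first crawl that stops after `budget` page fetches."""
    seen, queue, crawled = {start}, deque([start]), []
    while queue and len(crawled) < budget:
        page = queue.popleft()
        crawled.append(page)
        for nxt in links.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return crawled

# A hypothetical site: home -> 10 categories -> 10 products each (111 pages).
tree = {"home": [f"cat{i}" for i in range(10)]}
for i in range(10):
    tree[f"cat{i}"] = [f"cat{i}/prod{j}" for j in range(10)]

# With a budget of 50 fetches, the 61 deepest product pages never get crawled.
print(len(crawl("home", tree, budget=50)))  # 50
```

The same budget spent re-deciphering a dense mesh of repeated links leaves even less room for the pages that matter.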
-
Thanks Geoff, but what did you mean here...
'Internal linkage is classed differently as internal linkage'?
-
What makes you think you have too many on-page links? Internal linkage is classed differently from external linkage. Linking just dilutes the authority from the page (amongst other crawlability and usability factors), but if it's internal linkage, at least the authority is kept on your domain.
Simple, effective and easy-to-use navigation is critical to website usability and will have an effect on performance in search engines. In your example, I wouldn't expect any negative ranking effects to occur as a result of the style of navigation menu your website utilises. If that is deemed most useful for your customers, then that's the best approach.
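To make the dilution point concrete, here is a toy PageRank-style calculation (the damping factor and even split are textbook simplifications, not how any search engine actually scores links): each extra link on a page shrinks the share of authority passed through every other link, but with internal links that share at least lands back on your own domain.

```python
def authority_per_link(page_authority, num_links, damping=0.85):
    """Toy PageRank-style split: each outgoing link passes an
    equal share of the page's damped authority."""
    return page_authority * damping / num_links

# A page with authority 1.0 and 10 links passes 0.085 per link;
# the same page with 100 links passes only 0.0085 per link.
print(authority_per_link(1.0, 10))
print(authority_per_link(1.0, 100))
```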
-
Use a tool such as SEOmoz's Crawl Test or Xenu's Link Sleuth to determine any crawling issues. There are a number of free online tools for this too.
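If you'd rather check a page yourself, a quick count of internal versus external links takes only a few lines with Python's standard library (the sample HTML and domain check below are just for illustration; a real audit would fetch the live page and resolve relative URLs properly):

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a href> links on a page, split into internal and external."""
    def __init__(self, domain):
        super().__init__()
        self.domain, self.internal, self.external = domain, 0, 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        if href.startswith("/") or self.domain in href:
            self.internal += 1
        elif href.startswith("http"):
            self.external += 1

sample_html = """<a href="/about">About</a>
<a href="http://www.cwesolutions.co.uk/contact">Contact</a>
<a href="http://example.com">Elsewhere</a>"""

counter = LinkCounter("cwesolutions.co.uk")
counter.feed(sample_html)
print(counter.internal, counter.external)  # 2 1
```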
-
Thanks for your reply and explanation. One more question... How would I know if the bots are having a problem crawling my website?
-
Too many links on a page can make it difficult for bots to crawl your site. If you have over 100 links on a page, make certain you have a sitemap that lays out a road map for the bots.
If your site is being crawled effectively, don't worry about it. I would worry if the linking structure is impacting the user experience or if deeper pages aren't getting crawled.
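For reference, a minimal XML sitemap follows the sitemaps.org protocol (a `urlset` of `url`/`loc` entries), and can be generated with Python's standard library; the URLs below are just placeholders for your own page list:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Builds a minimal XML sitemap per the sitemaps.org protocol."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

pages = ["http://www.cwesolutions.co.uk/",
         "http://www.cwesolutions.co.uk/services"]
print(build_sitemap(pages))
```

Save the output as sitemap.xml at the site root and submit it via Google Webmaster Tools so the bots have that road map.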