Too Many Links on One Page - What to Do?!
-
Hello Geniuses, Prodigies, and Experts of the Field,
My website pages for www.1099pro.com have too many links per page, something like 150-175, and I understand that each page should ideally have fewer than 100. Most of these links, approximately 105, come from drop-down navigation options in the header toolbar or from the footer links.
My take is that these links make our site easier to navigate, but I suspect they are hurting my PageRank and SERP rankings. Is there a best way to handle a situation like this? I'd really prefer not to alter the header/footer layout of the entire site by removing 50-75 navigational links. The only other idea I have is below, but I have no idea whether it would work.
- For any link that I don't care to pass PageRank through, add a rel="nofollow" attribute. This would be my favorite option if it is viable.
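As a quick illustration of what that option would involve, here is a minimal sketch that counts followed versus nofollowed links in a snippet of markup, using Python's standard-library HTML parser. The two example links and their URLs are hypothetical, not taken from the actual site:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Tally followed vs. nofollowed <a href> links in a page."""
    def __init__(self):
        super().__init__()
        self.followed = 0
        self.nofollowed = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if "href" not in attrs:
            return
        # rel="nofollow" is how a link opts out of passing PageRank
        rel = attrs.get("rel") or ""
        if "nofollow" in rel.split():
            self.nofollowed += 1
        else:
            self.followed += 1

# Hypothetical markup: one normal link, one marked rel="nofollow"
html = '''
<a href="/forms/1099-misc">1099-MISC</a>
<a href="/terms" rel="nofollow">Terms of Use</a>
'''
parser = LinkCounter()
parser.feed(html)
print(parser.followed, parser.nofollowed)  # 1 1
```

A script like this could be pointed at each template to see how many of the 150-175 links would actually remain followed after the change.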
-
That's good to hear and thanks for the input!
The Moz page grader told me that over 100 links was too many, and so did a commenter on a separate post. All clear now, though.
-
The reasoning behind limiting the number of links is that the amount of authority a page passes is divided by the total number of links on that page, regardless of whether they are nofollowed or not. So the fewer the links, the more authority you pass to each of those internal pages. To answer your subsidiary question: there would be no SEO benefit to nofollowing these links.
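The division described above can be sketched with back-of-the-envelope arithmetic, assuming a simplified PageRank model where nofollowed links still count in the denominator but their share simply evaporates (all numbers are illustrative):

```python
def passed_per_link(page_authority, total_links):
    """Equal share per outgoing link; nofollowed links still count
    in the denominator, so their share is lost, not redistributed."""
    return page_authority / total_links

authority = 1.0  # illustrative units of link authority

# 150 total links, 50 of them nofollowed:
share = passed_per_link(authority, 150)
followed_total = share * 100          # authority actually passed on
print(round(share, 4))                # ~0.0067 per link
print(round(followed_total, 3))       # ~0.667; the nofollowed third is lost

# Removing the 50 links entirely instead of nofollowing them:
share_fewer = passed_per_link(authority, 100)
print(round(share_fewer, 3))          # 0.01 per link
```

This is why nofollowing the drop-down links gains nothing: only actually removing links raises the per-link share.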
That being said, usability always trumps this in my book. Go into your Google Analytics and see which of these links people are actually clicking. If they are clicking into your drop-down links, leave them; if they are only clicking the head link, consider chopping the rest.
-
As long as they are all listed in the sitemap, don't worry about it. William is correct that Google lifted the 100-links-per-page limit. The question is: do the links serve a purpose, or are they just there to pad the count? Perhaps you are not seeing the engagement or SEO results you want because of how the site structure is organized?
BTW, who or what told you there was a need to reduce the link count? Unless you were manually penalized, why is there a need to do this?
-
Google updated its guidelines a while ago and no longer suggests 100 or fewer links per page. The guideline now simply states, "Keep the links on a given page to a reasonable number," which is subjective. https://support.google.com/webmasters/answer/35769?hl=en
With a site like yours, full of different kinds of forms and the like, it's reasonable to have 100+ links per page. There are other options for you as well if you believe these links are hurting, but according to Google they likely are not.
If you wanted to try something different, you could consider building out detailed category pages for each section of things you offer on the site, and make those the pages that rank for your terms. That way, the number of links on your main page drops dramatically, and the user experience might improve since things aren't quite as condensed.