Too many links on a single page error
-
I've seen it said a few times that you should have fewer than 100 links per page to help crawling, unless you're a massively authoritative website.
But what happens when you're a large ecommerce website with categories and sub-categories? You could have a category called 'computers' with a drop-down list containing lots of sub-category links.
What's the solution to this?
Cheers
-
I'm not an expert on this, but I know the 100-links thing doesn't matter anywhere near as much as it used to, and I'm not sure it ever did in terms of being penalised. I think it was just a guideline saying that if a page had more than 100 links, Google might not crawl them all, or PageRank might not be passed to anything from the 101st link onwards... something like that. Google's crawling is a lot more sophisticated now, so as long as your site doesn't look like a link farm, I think you'll be fine.
I haven't read it yet, but I've just found this, it looks like it'll help: http://www.seomoz.org/blog/how-many-links-is-too-many
-
Sorry, I can't offer you an answer, but I have the same issue: my report is flagging too many links on a page, and I'm not sure how to reduce this number due to categories, sub-categories, and the number of products listed on the page.
I'd be very interested to hear of a workaround for this!
Thanks
Daniel
Related Questions
-
On page link question, creating an additional 'county' layer between states and zips/cities
Question: We have a large site that has a page for each of the 50 states. Each of these pages has unique content, but following the content is a MASSIVE number of links to every zip AND city in that state. I am also in the process of creating unique content for each of these cities and zips. HOWEVER, I was wondering: would it make sense to create an additional 'county' layer between the states and the zips/cities? Would the additional 'depth' of the links bring down the overall rank of the long-tail city and zip pages, or would the fact that the counties knock the on-page link count down from a thousand or so to a manageable 50-100 substantially improve the overall quality and ranking of the site? To illustrate, currently I have state -> city and zip pages (1200+ links on each state page); what I want to do is state -> county (5-300 counties on each state page) -> city + zip (maybe 50-100 links on each county page). What do you guys think? Am I incurring some kind of automatic penalty for having 1000+ links on a page?
On-Page Optimization | ilyaelbert0
-
Link Building
I have to be doing something wrong. I have been trying to get "homes for sale in Casa Grande AZ" and "Casa Grande Real Estate" to rank well in Google, but I am dropping in rank. What am I doing wrong? http://azbestlistings.com/casa-grande-az-real-estate-homes-for-sale-in-casa-grande-az
On-Page Optimization | sansonj0
-
Would I be safe canonicalizing comments pages on the first page?
We are building comment pages for an article site that live on a separate URL from the article (I know this is not ideal, but it is necessary). Each comments page will have a summary of the article at the top. Would I be safe using the first page of comments as the canonical URL for all subsequent comment pages? Or could I get away with using the actual article page as the canonical URL for all comment pages?
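The mechanics of the tag itself are simple either way; a minimal sketch of the first option (canonicalizing paginated comment pages to page one), with a purely hypothetical URL structure:

```html
<!-- Placed in the <head> of /article-123/comments?page=2, ?page=3, etc. -->
<!-- (the URLs here are hypothetical, for illustration only) -->
<link rel="canonical" href="http://www.example.com/article-123/comments" />
```

Canonicalizing the comment pages to the article itself tells search engines they are duplicates of the article, which they aren't if the comments are substantial unique content, so pointing at the first comments page is arguably the safer of the two.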
On-Page Optimization | BostonWright0
-
Internal Linking
Hi Everyone, Should I be careful of how many internal links I have on one page? For example, if I have a page that contains 700 words, can I have 5, 10, 15 internal links linking to other pages on my website? Is there a best-practice ratio? I know that internal linking is important for spiders to crawl your website, but at the same time I don't want it to look spammy. I hope this makes sense. Thanks for any help.
On-Page Optimization | Paul780
-
Too many On-Page Links on a WP based Website
Hi, I've already browsed through various Q&As on the "too many on-page links" issue, but I could really use some advice concerning a WP site with a dropdown navigation. As outlined in the on-page report, every page has about 180 outgoing links, which is pretty much the number of pages featured in the navigation. Even though the 100-link limit is somewhat outdated, I'm still worried about the distribution of link juice from the home page and how Google perceives the importance of the various pages. Would it make sense to adapt the structure of the navigation, so that the home page only links to the 5 category pages, and the category pages only link to the detail pages they contain? The site has good rankings for several pages, and I assume that Google can tell that the large number of links is caused by the navigation. But with every page having approx. 180 links, it may be difficult for Google to tell which of those pages are the most important in terms of internal link structure... Looking forward to your opinions and insights! Cheers, Chris
On-Page Optimization | adwordize0
-
Too many 301 redirects to home page - is this possible?
If a site has a bunch of 404s from old URLs that no longer work and point to pages or documents that don't exist anymore: can someone clarify whether it's a problem, when fixing a bunch of these 404s, to point them all to the home page? So if there is nowhere else that is really applicable for an old broken URL, is it a problem to 301 old pages to the site's home page? I have read some conflicting things on this recently on different sites, so I just wondered what the latest thinking was... Thanks!
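For what it's worth, the redirect itself is easy to express; on an Apache server, a sketch might look like this (the paths are hypothetical):

```apache
# .htaccess: send retired URLs to the home page with a permanent (301) redirect
Redirect 301 /old-products.html /
Redirect 301 /brochures/catalogue-2009.pdf /
```

One caveat: Google has said that redirecting many unrelated URLs to the home page can be treated as a soft 404, so where a genuinely relevant replacement page exists, redirecting there instead is generally preferable.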
On-Page Optimization | inhouseninja0
-
Page speed tools
Working on reducing page load time, since that is one of the ranking factors that Google uses. I've been using Page Speed FireFox plugin (requires FireBug), which is free. Pretty happy with it but wondering if others have pointers to good tools for this task. Thanks...
On-Page Optimization | scanlin0
-
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc.)
Cooking Method (fry, bake, boil, steam, etc.)
Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30
There can be any combination of these variables, meaning there are hundreds of possible search result URL variations. This all works well on the site; however, it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as:
Setting the canonical tag
Adding these URL variables to Google Webmaster Tools to tell Google to ignore them
Changing the title tag in the head dynamically based on which URL variables are present
However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind Regards
On-Page Optimization | smaavie