Why are the bots still picking up so many links on our page despite us adding nofollow?
-
We have been working to reduce our on-page links issue. On a particular type of page the problem arose because we automatically link out to relevant content. When we added nofollows to these links it resolved the issue for some pages but not all, and we can't figure out why it was not successful for every one. Can you see any issues?
Example of a page where nofollow did not work:
http://www.andor.com/learning-academy/4-5d-microscopy-an-overview-of-andor's-solutions-for-4-5d-microscopy
-
Ahhh, duh! Dr. Pete shed light on what we should be thinking about here. You're not getting warnings for passing out too much PageRank but for having too many links. He's right; nofollow will not stop them from being counted. Nofollow stops PageRank from being passed.
Link equity is a broader concept than PageRank. Link equity considers relevance, authority and trust, link placement, accessibility, any value of relevant outbound links, etc. It sounds as if you need to focus more on how you implement the links on your site.
If you need to reduce links, as mentioned earlier, load them via AJAX from an external file if they're needed on the page. If they don't offer any value, then remove them. I viewed your page earlier but cannot access it now; the links didn't appear to help the user experience anyway. Often what's good for the user is good for Google.
-
The main issue with too many on-page links is just dilution - there's not a hard limit, but the more links you have, the less value each one has. It's an unavoidable reality of internal site architecture and SEO.
Nofollow has no impact on this problem - link equity is still used up, even if the links aren't followed. Google changed this a couple of years back due to abuse of nofollow for PageRank sculpting.
Unfortunately, I'm having a lot of issues loading your site, even from Google's cache, so I'm not able to see the source code first-hand.
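The counting behavior above can be sketched with a toy example - a naive regex-based counter (illustration only, not production HTML parsing) that tallies anchors the way a "too many on-page links" check does, nofollow or not:

```javascript
// Sketch: count links the way a crawler's link check might.
// Every <a ... href=...> counts toward the on-page total,
// including the ones marked rel="nofollow".
function countLinks(html) {
  const total = (html.match(/<a\s[^>]*href=/gi) || []).length;
  const nofollowed = (html.match(/<a\s[^>]*rel=["'][^"']*nofollow[^"']*["']/gi) || []).length;
  return { total, nofollowed };
}

const page =
  '<a href="/one">one</a>' +
  '<a href="/two" rel="nofollow">two</a>' +
  '<a rel="nofollow" href="/three">three</a>';

console.log(countLinks(page)); // { total: 3, nofollowed: 2 } - nofollow does not shrink the total
```

The only way to lower the count is to actually remove links from the served HTML, which is what the AJAX suggestions in this thread do.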
-
I don't see 197 links on that page; I only see 42 external followed links. See the screenshot below:
-
The suggestion to AJAX the tabs would put the content in a separate file, which would be a great way to guarantee a reduction in on-page links!
Also, the suggestions to clean up those meta tags and the massive view state are spot on. A little optimization will go a long way toward ensuring the bots crawl all your pages. If you do have speed issues and crawl errors, it could be that the bots are not getting to subsequent pages to read your nofollows. Just a consideration of the whole pie.
-
Yes, it would nofollow all the links on the page.
To address the mystery, are you sure your other pages have since been crawled? Or is it that you are still getting warnings after subsequent crawls?
-
Whoa! Your view state is HUGE (That's what she said).
I couldn't decode it, but somewhere along the line the programmer didn't turn off session management and, likely, an entire copy of the page is encoded in the view state. This is causing load-speed issues.
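For what it's worth, if the page genuinely doesn't need it, ASP.NET lets you switch view state off at the page level or per control (a sketch - whether that's safe depends on which controls actually rely on view state, so test before deploying; the control name below is just an example):

```aspx
<%@ Page Language="C#" EnableViewState="false" %>

<%-- Or disable it only on heavy controls that don't need postback state: --%>
<asp:Label ID="PageTitle" runat="server" EnableViewState="false" />
```

Either way, the encoded blob in the hidden __VIEWSTATE field shrinks, which is where the load-speed win comes from.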
-
Your meta tags are in more trouble than your link count:
<meta id="MetaDescription" name="DESCRIPTION" content="Page Details" />
AND
<meta name="Description" />
I see you are using DNN: what version and what module are you using? There are a ton of things one can do to DNN to make it SEO enhanced.
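For reference, a single well-formed description tag is all the page needs - something like this (the content value here is only a placeholder, drawn from the page title):

```html
<meta name="description" content="An overview of Andor's solutions for 4-5D microscopy." />
```

One tag, a real description instead of "Page Details", and no duplicate empty Description tag.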
-
My suggestion is to try AJAXing the tabs. If the outbound links are more of a concern than the keywords of the links, AJAX loading of the tab content would remove them from consideration. Google won't index content pulled in from an external source.
However, be careful to put a rel="nofollow" on the link that loads the content as you don't want SEs indexing the source.
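A minimal sketch of that pattern (the fragment URL, class name, and ids are made up for illustration):

```html
<!-- The trigger link points at the tab fragment; rel="nofollow" keeps
     search engines from treating the fragment URL as a page to index. -->
<a href="/fragments/related-content.html" rel="nofollow" class="tab-loader">Related content</a>
<div id="tab-content"></div>

<script>
  // Load the tab content on demand instead of shipping those links
  // in the initial HTML, so they never enter the on-page link count.
  document.querySelector('.tab-loader').addEventListener('click', async function (e) {
    e.preventDefault();
    var res = await fetch(this.href);
    document.getElementById('tab-content').innerHTML = await res.text();
  });
</script>
```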
Do not put a meta nofollow in the head; it will kill all links on the page and seriously mess up your link flow. Your use of rel="nofollow" is correct in the context of the specific link tags.
I wouldn't sweat the sheer number of links - the 100-link guideline is a leftover from the days when spiders only downloaded 100k of a page. Crawl limits have since risen to the point that the practical limitation of over 100 links is more pressing (i.e., do your visitors actually value and use that many links?).
If each link is valuable and usable, no need to worry. If not, perhaps there is a structural way to reduce the count.
Also, load the footer by AJAX onscroll or on demand. Assuming all of the pages can be found in the top navigation, the bottom links are just exacerbating your issues. Primarily, this section is giving far too much weight to secondary or auxiliary pages.
For instance, your Privacy Policy only needs to be linked to where privacy is a concern (i.e. the contact form). It's good to put it on the home or about pages too if you have a cookie policy.
-
Hi Karl,
Would this suggestion not stop crawling of all the links on the page?
Also, the issue is we have seen the rel='nofollow' work on other pages and reduce our warnings but then for some pages it has not. This is where the mystery lies.
-
It may be how the nofollow attribute is formatted? It should be rel="nofollow", and yours is rel='nofollow'.
-
Hi James,
Thanks for responding. The issue is that we are still getting a count of 197 on-page links when there are not that many links on the page.
-
What do you mean the nofollow did not work? I noticed on the example page that some of your external links in the papers section are nofollow while the videos are not nofollowed.
Related Questions
-
Should internal links in my table of contents be tagged as nofollow?
Hi All, I have the LuckyWP Table of Contents plugin installed. I recently noticed that you can tag your internal links with and nofollow. I understand that it's always a good idea to link internally and to pass link juice to my own content. But with detailed posts that have over 20 headings, I'm thinking that internal linking for headings may actually hurt me because it takes my links well above 100. Any ideas what the best practises are in this scenario? Thanks.
Technical SEO | nomad_blogger0
Our protected pages 302 redirect to a login page if not a member. Is that a problem for SEO?
We have a membership site that has links out in our unprotected pages. If a non-member clicks on these links it sends a 302 redirect to the login / join page. Is this an issue for SEO? Thanks!
Technical SEO | rimix1
Moving Some Content From Page A to Page B
Page A has written content, pictures, and videos. The written content from Page A is being moved to Page B. When Google crawls the pages next time around, will Page B receive the content credit? Will there be any issues because this content originally belonged to Page A? Page A is not a page I want to rank for (it just has great pictures and videos for users). Can I 301 redirect from Page A to B since the written content from A has been deleted, or is there no need? Again, I intend to keep Page A live because there is good value for users in seeing the pictures and videos.
Technical SEO | khi50
Too Many On-Page Links
The analysis of my site is showing that I have a problem with too many on-page links. Most of this is due to our menu, and wanting users to be able to quickly get to the shopping category they are looking for. We end up with over 200 links in order to get the menu we want. How are other people dealing with a robust menu, but avoiding getting dinged for too many links? One of our pages in question is: http://www.milosport.com/category/2176-snowboards.aspx
Technical SEO | dantheriver0
Google picking up wrong page title
Hi, When searching for "Tottenham Forum" on google.co.uk (link below) http://www.google.co.uk/search?q=tottenham+forum&ie=utf-8&oe=utf-8&aq=t&rls=org.mozilla:en-GB:official&client=firefox-a The site I manage (THFCTalk.com) is listed as 4th in the search results, but was hacked a few months ago and the search results lists the page title as "Free Shipping. Order Cialis Online. - Online Pharmacy" when the actual page title of THFCTalk is not actually set at that. Any idea how to fix this so Google updates this header on the search results? - as it is surely putting people off from clicking on our search result
Technical SEO | WalesDragon0
Backlinks to home page vs internal page
Hello, What is the point of getting a large amount of backlinks to internal pages of an ecommerce site? Although it would be great to make your articles (for example) strong, isn't it more important to build up the strength of the home page. All of My SEO has had a long term goal of strengthening the home page, with just enough backlinks to internal pages to have balance, which is happening naturally. The home page of our main site is what comes up on tons of our keyword searches since it is so strong. Please let me know why so much effort is put into getting backlinks to internal pages. Thank you,
Technical SEO | BobGW0
We have been hit with the "Doorway Page" Penalty - fixed the issue - Got MSG that will still do not meet guidelines.
I have read the FAQs and checked for similar issues: YES / NO
Technical SEO | LVH
My site's URL (web address) is: www.recoveryconnection.org
Description (including timeline of any changes made): We were hit with the Doorway Pages penalty on 5/26/11. We have a team of copywriters and a fast-working dev dept., so we were able to correct what we thought the problem was: "targeting one keyword per page" and thin content (according to Google). Plan of action: consolidate "like" keywords/content onto the pages that were getting the most traffic, and 404 the pages with thin content that were targeting single keywords per page. We submitted a board-approved reconsideration request on 6/8/11 and received the 2nd message (below) on 6/16/11. ***NOTE: The site was originally designed by the OLD marketing team, who were let go, and we are the NEW team trying to clean up their mess. We are now resorting to going through Google's general guidelines page. Help would be appreciated. Below is the message we received back.
Dear site owner or webmaster of http://www.recoveryconnection.org/, We received a request from a site owner to reconsider http://www.recoveryconnection.org/ for compliance with Google's Webmaster Guidelines. We've reviewed your site and we believe that some or all of your pages still violate our quality guidelines. In order to preserve the quality of our search engine, pages from http://www.recoveryconnection.org/ may not appear or may not rank as highly in Google's search results, or may otherwise be considered to be less trustworthy than sites which follow the quality guidelines. If you wish to be reconsidered again, please correct or remove all pages that are outside our quality guidelines. When such changes have been made, please visit https://www.google.com/webmasters/tools/reconsideration?hl=en and resubmit your site for reconsideration. If you have additional questions about how to resolve this issue, please see our Webmaster Help Forum for support. Sincerely, Google Search Quality Team
Any help is welcome. Thanks.