Why are the bots still picking up so many links on our page despite us adding nofollow?
-
We have been working to reduce our on-page links issue. On a particular type of page the problem arose because we automatically link out to relevant content. When we added nofollows to these links it resolved the issue for some pages but not all, and we can't figure out why it was not successful for every one. Can you see any issues?
Example of a page where the nofollow did not work:
http://www.andor.com/learning-academy/4-5d-microscopy-an-overview-of-andor's-solutions-for-4-5d-microscopy
-
Ahhh, duh! Dr. Pete shed light on what we should be thinking about here. You're not getting warnings for passing out too much PR, but for having too many links. He's right; nofollow will not stop those links from being counted. Nofollow only stops PR from being passed.
Link equity is a broader concept than PageRank. Link equity considers relevance, authority and trust, link placement, accessibility, any value of relevant outbound links, etc. It sounds as if you need to focus more on how you implement the links on your site.
If you need to reduce links, as mentioned earlier, load them via AJAX from an external file if those links are needed on the page. If they don't offer any value, then remove them. I viewed your page earlier but cannot access it now; the links didn't appear to help the user experience anyway. Often what's good for the user is good for Google.
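To show what I mean by the AJAX approach, here's a minimal sketch, assuming jQuery is already on the page (the /related-links.html fragment URL is made up for illustration):

<div id="related-content"></div>
<script>
// Minimal sketch: the related links live in a separate file and are
// pulled in after the page loads, so they are not part of the HTML
// that a link-counting crawler fetches. '/related-links.html' is a
// hypothetical fragment URL.
$(function () {
  $('#related-content').load('/related-links.html');
});
</script>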
-
The main issue with too many on-page links is just dilution - there's not a hard limit, but the more links you have, the less value each one has. It's an unavoidable reality of internal site architecture and SEO.
Nofollow has no impact on this problem - link equity is still used up, even if the links aren't followed. Google changed this a couple of years back due to abuse of nofollow for PageRank sculpting.
Unfortunately, I'm having a lot of issues loading your site, even from Google's cache, so I'm not able to see the source code first-hand.
-
I don't see 197 on that page - I only see 42 external followed links. See the screenshot below:
-
This suggestion for AJAXing the tabs would put the content in a separate file, which would be a great way to guarantee a reduction in on-page links!
Also, the suggestions to clean up those meta tags and the massive VIEW STATE are spot on. A little optimization will go a long way to ensuring the bots crawl all your pages. If you do have speed issues and crawl errors, it could be that the bots are not getting to subsequent pages to read your nofollows. Just a consideration of the whole pie.
-
Yes, that would nofollow all the links.
To address the mystery, are you sure your other pages have since been crawled? Or is it that you are still getting warnings after subsequent crawls?
-
Whoa! Your view state is HUGE (That's what she said).
I couldn't decode it, but somewhere along the line the programmer didn't turn off session management and, likely, an entire copy of the page is encoded in the view state. This is causing load speed issues.
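If your developer has access, view state can usually be switched off per page or site-wide - a rough sketch of the standard ASP.NET settings (test carefully, since some DNN modules depend on view state):

<%@ Page Language="C#" EnableViewState="false" %>

<!-- Or site-wide in web.config: -->
<configuration>
  <system.web>
    <pages enableViewState="false" />
  </system.web>
</configuration>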
-
Your meta tags are in more trouble than your link count:
id="MetaDescription" name="DESCRIPTION" content="Page Details" />
AND
name="Description" />
I see you are using DNN: what version, and what module are you using? There are a ton of things one can do with DNN to make it more SEO-friendly.
-
My suggestion is to try AJAXing the tabs. If the outbound links are more of a concern than the keywords in the links, AJAX loading of the tab content would remove them from consideration. Google won't index content pulled in from an external source.
However, be careful to put a rel="nofollow" on the link that loads the content as you don't want SEs indexing the source.
Do not put a meta nofollow in the head; it will kill all links on the page and seriously mess up your link flow. Your use of rel="nofollow" is correct in the context of the specific link tags.
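To make the difference concrete, a quick sketch (the fragment URL is hypothetical):

<!-- Right: nofollow scoped to the one link that pulls in tab content -->
<a href="/tab-content/papers.html" rel="nofollow">Papers</a>

<!-- Wrong: a meta nofollow in the head nofollows EVERY link on the page -->
<meta name="robots" content="nofollow" />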
I wouldn't sweat the sheer number of links - the 100 count is a leftover from the days when spiders only downloaded 100k of the page. It has since risen to the point that the practical limitations of over 100 links are more pressing (i.e., do your visitors actually value and use that many links?).
If each link is valuable and usable, no need to worry. If not, perhaps there is a structural way to reduce the count.
Also, load the footer by AJAX onscroll or on demand. Assuming all of the pages can be found in the top navigation, the bottom links are just exacerbating your issues. Primarily, this section is giving far too much weight to secondary or auxiliary pages.
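A rough sketch of the on-scroll version, again assuming jQuery is available ('/footer.html' is a made-up fragment URL):

<script>
// Fetch the footer markup only once the visitor nears the bottom of
// the page; crawlers reading the raw HTML never see those links.
var footerLoaded = false;
$(window).on('scroll', function () {
  var nearBottom = $(window).scrollTop() + $(window).height()
                   >= $(document).height() - 400;
  if (nearBottom && !footerLoaded) {
    footerLoaded = true;
    $('#footer').load('/footer.html');
  }
});
</script>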
For instance, your Privacy Policy only needs to be linked to where privacy is a concern (i.e., the contact form). It's good to put it on the home or about pages too if you have a cookie policy.
-
Hi Karl,
Would this suggestion not stop crawling of all links on the page?
Also, the issue is we have seen the rel='nofollow' work on other pages and reduce our warnings but then for some pages it has not. This is where the mystery lies.
-
It may be how the nofollow attribute is formatted? It should be rel="nofollow" (double quotes), and yours is rel='nofollow' (single quotes).
-
Hi James,
Thanks for responding. The issue is that we are still getting a count of 197 on-page links when there aren't that many links on the page.
-
What do you mean the nofollow did not work? I noticed on the example page that some of your external links in the papers section are nofollowed while the videos are not.
Related Questions
-
Can I redirect a link even if the link is still on the site
Hi Folks, I've got a client who has a duplicate content issue because they actually create duplicate content and store the same piece of content in 2 different places. When they generate this duplicate content, it creates a 2nd link on the site going to the duplicate content. Now they want the 2nd link to always redirect to the first link, but for architecture reasons, they can't remove the 2nd link from the site navigation. We can't use rel-canonical because they don't want visitors going to that 2nd page. Here is my question: Are there any adverse SEO implications to maintaining a link on a site that always redirects to a different page? I've already gone down the road of "don't deliberately create duplicate content" with the client. They've heard me, but won't change. So, what are your thoughts? Thanks!
Technical SEO | | Rock330 -
Old domain still being crawled despite 301s to new domain
Hi there, We switched from the domain X.com to Y.com in late 2013 and for the most part, the transition was successful. We were able to 301 most of our content over without too much trouble. But when I do a site:X.com in Google, I still see about 6240 URLs of X listed. But if you click on a link, you get 301d to Y. Maybe Google has not re-crawled those X pages to know of the 301 to Y, right? The home page of X.com is shown in the site:X.com results. But if I look at the cached version, the cached description will say "This is Google's cache of Y.com. It is a snapshot of the page as it appeared on July 31, 2014." So, Google has freshly crawled the page. It does know of the 301 to Y and is showing that page's content. But the X.com home page still shows up on site:X.com. How is the domain for X showing rather than Y when even Google's cache is showing the page content and URL for Y? There are some other similar examples. For instance, you would see a deep URL for X, but just looking at the <title> in the SERP, you can see it has crawled the Y equivalent. Clicking on the link gives you a 301 to the Y equivalent. The cached version of the deep URL to X also shows the content of Y.
Any suggestions on how to fix this, or is it even a problem? I'm concerned that some SEO equity is still being sequestered in the old domain.
Thanks,
Stephen
Technical SEO | | fernandoRiveraZ1 -
Too Many On Page Links Error On Wordpress Blog
I have a wordpress blog. I am getting an error message from SEOmoz: "too many on-page links". However, SEOmoz is counting a full month of blogs as one page. For example, 3 on-page internal links in each blog post times 30 different blog articles in a month is recorded as 90 on-page links. Is there any mechanism to fix this in wordpress?
Technical SEO | | wianno1680 -
Too many on page links
Hi All, As we all know, having too many links on a page is an obstacle for search engine crawlers in terms of the crawl allowance. My category pages are labeled as pages with too many "on page" links by the SEOmoz crawler. This probably comes from the fact that each product on the category page has multiple links (on the image and model number). Now my question is, would it help to set up a text link with a clickable area as big as the product area? This means every product gets just one link. Would this help get the crawlers deeper into these pages and distribute the link juice better? Or is Google smart enough already to figure out that two links to the same product page shouldn't be counted as two? Thanks for your replies guys. Rich
Technical SEO | | Horlogeboetiek0 -
Would nofollowing the footer throw an unnatural balance between followed and nofollowed links?
I have been getting errors for too many on-page links. All the major navigation pages are found in links within the navigation tabs and are identical to the footer links. So my question is, would nofollowing the footer look unnatural, throw off the balance between followed and nofollowed links on the site, and negatively affect SEO?
Technical SEO | | smilingbunny0 -
How many strong tags is too many
Hi everyone, just a quick question: what are your views on the use of strong tags in content? How many is too many? What if you have strong tags around every keyword in a sentence, etc.?
Technical SEO | | pauledwards1 -
301ed Pages Still Showing as Duplicate Content in GWMT
I thank anyone reading this for their consideration and time. We are a large site with millions of URLs for our product pages. We are also a textbook company, so by nature, our products have two separate ISBNs: a 10 digit and a 13 digit form. Thus, every one of our books has at least two pages (10 digit and 13 digit ISBN page). My issue is that we have established a 301 for all the 10 digit URLs so they automatically redirect to the 13 digit page. This fix has been in place for months. However, Google still reports that they are detecting thousands of pages with duplicate title and meta tags. Google is referring to these page URLs that I already have 301ed to the canonical version many months ago! Is there anything that I can do to fix this issue? I don't understand what I am doing wrong. Example:
http://www.bookbyte.com/product.aspx?isbn=9780321676672
http://www.bookbyte.com/product.aspx?isbn=032167667X
As you can see, the 10 digit ISBN page 301s to the 13 digit canonical version. Google reports that they have detected duplicate title and meta tags between the two pages and there are thousands of these duplicate pages listed. To add some further context: the ISBN is just a parameter that allows us to provide content when someone searches for a product with the 10 or 13 digit ISBN. The 13 digit version of the page is the only physical page that exists; the 10 digit is only a part of the virtual URL structure of the website. This is why I cannot simply change the title and meta tags of the 10 digit pages, because they only exist in the sense that the URL redirects to the 13 digit version. Also, we submit a sitemap every day of all the 13 digit pages so Google knows exactly what our physical URL structure is. I have submitted this question to GWMT forums and received no replies.
Technical SEO | | dfinn0
On Page 301 redirect for html pages
For php pages you've got:

<?php
Header( "HTTP/1.1 301 Moved Permanently" );
Header( "Location: http://www.example.com" );
?>

Is there anything for html pages, other than a meta refresh tag? Or is placing this code

redirect 301 /old/old.htm http://www.you.com/new.php

in the .htaccess the only way to properly 301 redirect html pages? Thanks!
Technical SEO | | shupester0