No-follow tags on links in the footer...do it or don't do it?
-
With some of the great reports SEOMoz has provided, I've been able to start taking the right steps toward fixing crawl issues, on-page issues, etc.
One of my websites allows a customer to drill down to their specific state and then their city to apply for an auto loan. The SEOMoz reports told me these pages specifically have too many links. One of my ideas to remedy this would be to add rel="nofollow" to the links in the footer as well as the links to the cities.
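For reference, the change being considered is just the `rel` attribute on each anchor (the URL and anchor text here are made up for illustration):

```html
<!-- A normal, followed footer link -->
<a href="/states/ohio/">Ohio Auto Loans</a>

<!-- The same link with nofollow added -->
<a href="/states/ohio/" rel="nofollow">Ohio Auto Loans</a>
```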
Am I steering myself in the right/wrong direction? Should I be approaching this problem from a different perspective?
Any help is greatly appreciated!
-
This is more of a question than a discussion.
If you nofollow the footer links, that will apply to the entire site (the same footer appears on all pages, which means all footer links on all pages are nofollowed), which is probably not what you want. Could you code the page to not show the footer links, or to show a reduced set, on those deep pages?
Footer links should not be a site map at the bottom of each page, but rather a list of key resource pages: contact, company info, high-level product categories, FAQ, email sign-up, social networking links, etc.
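A minimal sketch of the reduced-footer idea, assuming you render the footer server-side; all page types and link names below are illustrative, not taken from the site in question:

```python
# Deep state/city pages get only the core resource links, while
# top-level pages keep the full footer. This holds down the total
# link count on exactly the pages the crawl report flagged.

CORE_LINKS = ["Contact", "About Us", "FAQ", "Privacy Policy"]
DIRECTORY_LINKS = ["All States", "All Cities", "Auto Loan Rates"]

def footer_links(page_type: str) -> list[str]:
    """Return the footer links to render for a given page type."""
    if page_type in ("state", "city"):
        # Deep directory pages: lean footer only.
        return CORE_LINKS
    # Home and other top-level pages: full footer.
    return CORE_LINKS + DIRECTORY_LINKS
```

The same branching works in any template language (a simple `if` on the page type in PHP, Jinja, ERB, etc.).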
If you want more details, please provide a link.
-
It depends... are these links the only direct links to those particular pages? I mean, would a crawler otherwise have to take an indirect route through the main menu or something (e.g. via a category)? If they're the only direct ones, keep them. You want the pages linked from the homepage so they sit as shallow in the site as possible and get crawled.
Also, if they are not the only direct links, do the other ones have the appropriate anchor text?
It might be a good idea to keep them for the anchor text alone.
Are there any other links you could get rid of... how many links are there on this page?
-
I would not place nofollow on the links, for two reasons: 1) PageRank that would have flowed through those links will evaporate with the nofollow (at least that is the latest word I've heard from Google on this, although on this very issue they have had a tendency to change their mind without tellin' anyone), and 2) nofollow is often read as a signal of "we don't trust this."
How many links are you talking about? If it's just a few, or even a few dozen, I would leave them as is. If you have a ton of links down there, then maybe a different link structure would be useful... you might consider that for other reasons as well. Not saying that what you have is bad, just an opportunity to consider new things.