Do Follow and No Follow Attributes?
-
I have a blog on Trulia and Active Rain, which are real estate websites. I see that they both are Do Follow.
My question is: how much link juice do these sites pass along to my site when I write a blog post on them and create a link to my main site?
I've heard that even when a link has a Do Follow attribute, in certain cases it passes as little link juice as a No Follow link.
Also, if a link is a No Follow link, does it still pass along some link juice, or is it completely juiceless?
Is there a way to see which sites pass the most link juice to my site?
Thanks all.
-
Well put. Also keep in mind that too many No Follow links don't make for a good link profile variation either.
-
There's nothing wrong per se with a 'do follow' link - it's just sort of a redundancy, since links are followed by default. You could argue that 'Do Follow' links pass little or no link juice in certain cases, for example if a blog has enabled 'do follow' on its comments. In that case you're inviting spam, and as a result little or no link juice may be passed.
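For anyone new to the terminology: there is no actual 'dofollow' attribute in HTML. A followed link is just a plain anchor, and nofollow is an explicit rel value (example.com used as a placeholder):

```html
<!-- Followed by default - what people call a "Do Follow" link -->
<a href="http://example.com/">my site</a>

<!-- Explicitly marked nofollow -->
<a href="http://example.com/" rel="nofollow">my site</a>
```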
-
You can look at the domain and page authority of individual pages for some idea. Because of delays in how SEOmoz updates its index, you might not get as accurate a picture for newer posts.
Links near the top of the page tend to hold more weight than ones in sidebars, footers, and traditional author bios at the end of posts. Also, a search engine will only follow the first link to a given page that it finds. For instance, if you have three links to example.com/homes, each with different (or the same) anchor text, the search engine will only follow the first one that has a Do Follow attribute. You can, however, have multiple links to different pages on the site - if the first link goes to example.com/home and the second goes to example.com/home2, then the search engine will follow both.
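To sketch that scenario in markup (example URLs only):

```html
<!-- Only the first followed link to /homes counts -->
<a href="http://example.com/homes">Homes for sale</a>
<a href="http://example.com/homes">Browse homes</a>

<!-- Links to *different* pages are each followed -->
<a href="http://example.com/home">Home 1</a>
<a href="http://example.com/home2">Home 2</a>
```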
Further, the more posts you have on one site with links pointing to your website, the less power each individual link has. It's not going to hurt you, but the individual gains each link brings will be smaller and smaller.
So the short answer is: no, there's no real way to see which sites pass "the most" link juice to your site. You can get an idea of which ones might be helping more, but it's only going to be an estimate.
I'm of the opinion that No Follow links help show that you have a natural link portfolio since having none implies that you could be spamming, but by themselves they won't do anything to influence your rankings.
Related Questions
-
x-default hreflang attribute value
Hi. When is it best to use the x-default hreflang attribute value? Screaming Frog is flagging lots as missing, but is it necessary? Thanks
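For context, x-default is just another hreflang alternate in the page head, pointing at the version to serve when no other language/region annotation matches (URLs below are placeholders):

```html
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/">
<link rel="alternate" hreflang="fr" href="https://example.com/fr/">
<!-- Fallback for users who match none of the above -->
<link rel="alternate" hreflang="x-default" href="https://example.com/">
```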
Intermediate & Advanced SEO | BeckyKey
-
Internal search pages (and faceted navigation) solutions for 2018! Canonical or meta robots "noindex,follow"?
There seems to be conflicting information on how best to handle internal search results pages. To recap: they are problematic because these pages generally result in lots of query parameters being appended to the URL string for every kind of search, whilst the title, meta description and general framework of the page remain the same - which Moz Pro Site Crawl flags as duplicate meta descriptions/h1s etc. The general advice these days is NOT to disallow these pages in robots.txt anymore, because there is still value in their being crawled for all the links that appear on the page. But in order to handle the duplicate issues, the advice splits into two camps on what to do:

1. Add a meta robots tag with "noindex,follow" to the page. This means the page will not be indexed with all its myriad queries and parameters, and so takes care of any duplicate meta/markup issues - but any other links from the page can still be crawled and indexed = better crawling and indexing of the site. However, you lose any value the page itself might bring. This is the advice Yoast recommends in 2017: https://yoast.com/blocking-your-sites-search-results/ - they are adamant that Google just doesn't like or want to serve this kind of page anyway.

2. Just add a canonical link tag - this will ensure that the search results page is still indexed as well. All the different query string URLs, and the array of results they serve, are 'canonicalised' as the same. However, this seems a bit misleading, as the results in the page body could all be very different. Also, all the paginated results pages would be 'canonicalised' to the main search page, which we know Google states is not correct implementation of the canonical tag: https://webmasters.googleblog.com/2013/04/5-common-mistakes-with-relcanonical.html

This picks up on an older discussion here from 2012 - https://moz.com/community/q/internal-search-rel-canonical-vs-noindex-vs-robots-txt - where the advice was leaning towards using canonicals because the user was seeing a percentage of inbound traffic into these search result pages. But I wonder if that is still the case? As the older discussion is now 6 years old, I'm just wondering if there is any new approach, or how others have chosen to handle internal search. I think a lot of the same issues occur with faceted navigation, as discussed here in 2017: https://moz.com/blog/large-site-seo-basics-faceted-navigation

Intermediate & Advanced SEO | SWEMII
-
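As a footnote for anyone weighing the two options in the question above: they each boil down to a single line in the page head (the URL here is a placeholder):

```html
<!-- Option 1: keep the search results page out of the index,
     but let crawlers follow the links it contains -->
<meta name="robots" content="noindex,follow">

<!-- Option 2: collapse every parameterised variant onto one canonical URL -->
<link rel="canonical" href="https://www.example.com/search">
```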
Huge organic drop following new site go live
Hi Guys, I am currently working on a site whose organic traffic suffered (and is still suffering) a huge drop - from a consistent 300-400 organic visits a day to almost zero. This happened as soon as the new site went live, and I am now digging to find out why. 301s were put in place (over 2,500 of them) and there are still over 1,100 outstanding after reviewing Search Console this morning. Having looked at the redirect file that was put in place when the new site went live, it all looks OK, apart from the redirects looking like this... http://www.physiotherapystore.com/ to http://physiotherapystore.com/ - where the new URL is missing www. I am concerned this is causing a large duplicate issue, as both the www. and non-www. versions work fine. Am I right to be concerned, or is this something not to worry about?
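If the www version is the one that should be kept, one common fix is a single 301 rule that folds the bare domain into it. A minimal sketch, assuming an Apache server with mod_rewrite enabled (swap the hostnames if the non-www version is preferred):

```apache
RewriteEngine On
# 301 any request for the bare domain to the www hostname
RewriteCond %{HTTP_HOST} ^physiotherapystore\.com$ [NC]
RewriteRule ^(.*)$ http://www.physiotherapystore.com/$1 [R=301,L]
```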
Intermediate & Advanced SEO | HappyJackJr
-
To nofollow or follow internal links, that is the question...
"...Whether 'tis Nobler in the mind to suffer the slings and arrows of outrageous fortune or..." Okay, I'll drop the Hamlet riff. I'm working on a site with a forum. Top pages may have 20 to 30 answers. Each answer is by a member, with an image/link and a name link to their member profile. A member profile may contain a lot of info or none. We've noindexed member profile pages, yet we still have these links to member profile pages. Is it better to nofollow these internal links to profile pages, or what? Again, with 25 answers on a page and two links per answer to each member profile (image and name), that's a ton of internal links to noindexed pages. Thanks! Best... Darcy
Intermediate & Advanced SEO | 94501
-
Following Penguin 2.0 hit in May, my site experienced another big drop on August 13th
Hi everyone, my website experienced a 30% drop in organic traffic following the Penguin 2.0 update in May. This was the first significant drop that the site has experienced since 2007, and I was initially concerned that the new website design I released in March was partly to blame. On further investigation, many spammy sites were found to be linking to my website, and I immediately contacted the sites and asked for the removal of the links, before submitting a disavow file to Google.

At the same time, I've had some great content written for my website over the last few months, which has attracted over 100 backlinks from some great websites, as well as lots of social media interaction. So, while I realise my site still needs a lot of work, I do believe I'm trying my best to do things in the correct manner.

However, on August 11th, I received a message in Google WMTs: "Googlebot found an extremely high number of URLs on your site". I studied the table of internal links in WMTs and found that Google has been crawling many URLs throughout my site that I didn't necessarily intend it to find, i.e. lots of URLs with filtering and sorting parameters added. As a result, many of my pages are showing in WMTs as having over 300,000 internal links!!

I immediately tried to rectify this issue, updating the parameters section in WMTs to tell Google to ignore many of the URLs it comes across that have these filtering parameters attached. In addition, since my access logs were showing that Googlebot was frequently crawling all the URLs with parameters, I also added some Disallow entries to robots.txt to tell Google and the other spiders to ignore many of these URLs. So, I now feel that if Google crawls my site, it will not get bogged down in hundreds of thousands of identical pages and will just see those URLs that are important to my business.

However, two days later, on August 13th, my site experienced a further huge drop, so it's now down by about 60-70% from what I would expect at this time of the year! (There is no sign of any manual webspam actions.) My question is: do you think the solutions I've put in place over the last week could be to blame for the sudden drop, or do you think I'm taking the correct approach, and that the recent drop is probably due to Google getting bogged down in the crawling process? I'm not aware of any subsequent Penguin updates in recent days, so I'm guessing that this issue is somehow due to the internal structure of my new design. I don't know whether to roll back my recent changes or just sit tight and hope that it sorts itself out over the next few weeks, when Google has more time to do a full crawl and observe the changes I've made. Any suggestions would be greatly appreciated. My website is ConcertHotels.com. Many thanks, Mike
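For what it's worth, the kind of parameter-blocking robots.txt entries described above usually look something like this. This is only a sketch - the parameter names here are hypothetical, not taken from ConcertHotels.com:

```text
User-agent: *
# Block crawl of filtered/sorted variants of listing pages
Disallow: /*?sort=
Disallow: /*?filter=
Disallow: /*&sort=
```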
Intermediate & Advanced SEO | mjk26
-
Follow or nofollow to subdomain
Hi, I run a hotel booking site, and the booking engine is set up on a subdomain. The subdomain is disabled from being indexed in robots.txt. Should the links from the main domain have a nofollow to the subdomain? What are your thoughts? Thanks!

Intermediate & Advanced SEO | vmotuz
-
Sitespeed: Do images require width and height attributes?
Currently working on a site speed issue, and was wondering if not having width and height attributes on images actually causes a problem. We simply Photoshop the resolution we require for the image and add it to the page as is. I thought this would actually speed things up, but www.gtmetrix.com says we should have them. What's your experience? Thanks!
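A quick illustration of what GTmetrix is asking for: explicit dimensions on the img tag, so the browser can reserve the space before the file downloads instead of reflowing the page afterwards (the filename and sizes here are made up):

```html
<!-- Without width/height the browser must reflow the page once the image loads -->
<img src="/images/product-photo.jpg" alt="Product photo" width="640" height="480">
```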
Intermediate & Advanced SEO | cyberlicious
-
Language Attribute - does changing it make a difference to SEO and Search?
I am an SEO newbie, but learning fast. 🙂 I am based in London, UK and have a website: www.twofourseven.co.uk. I noticed that the language attribute was set to 'en-US'. I work in London as well as in international locations in the Middle East and Asia. With this in mind, I wanted to ask the experts: given that I am based in the UK, would changing the language attribute make a difference to search results? If so, would 'en' be better than 'en-GB', which might be too specific? Thanks in advance!
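Assuming the 'language attribute' here means the lang attribute on the html element (it could also refer to hreflang annotations), the choice being weighed looks like this:

```html
<!-- Current setting -->
<html lang="en-US">

<!-- Generic English -->
<html lang="en">

<!-- British English specifically -->
<html lang="en-GB">
```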
Intermediate & Advanced SEO | twofourseven