How can I tell if a website is a 'NoFollow'?
-
I've been link building for a long time but have recently discovered that most of my links are nofollow links, from sites such as Twitter and YouTube.
How can I tell if a website is a 'NoFollow'?
-
Thanks to both of you, I've now downloaded the toolbar which is making things so much easier.
Much appreciated guys
- Paul
-
There are also various plugins for Chrome / Firefox that will highlight nofollowed links on a page, including SEOmoz's MozBar (http://www.seomoz.org/seo-toolbar)
-
Either look at the source of the page (right-click and "View source" in most browsers), or right-click an example of the link you want to check and choose "Inspect element." In the <a> tag, you will see rel="nofollow" if the link is nofollowed.
Inspect element is great for looking at the source code of only the section of a page you want (instead of scrolling and scanning forever). It is also helpful for debugging code (for on-page SEO).
If you are prospecting links en masse (for example through OSE or another backlink research tool), there is usually a field that tells you which links are nofollowed.
Does that make sense?
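If you're checking more pages than is comfortable by hand, the same check is a few lines of script. A minimal sketch in Python using only the standard library; the HTML string here is just an illustration — in practice you'd feed in the source of the page you're auditing:

```python
from html.parser import HTMLParser

class NofollowScanner(HTMLParser):
    """Collect (href, is_nofollow) for every <a> tag seen in the HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return  # anchor without a destination, nothing to classify
        # rel can hold several space-separated tokens, e.g. "nofollow noopener"
        rel = (attrs.get("rel") or "").lower().split()
        self.links.append((href, "nofollow" in rel))

scanner = NofollowScanner()
scanner.feed('<a href="http://example.com/a" rel="nofollow">nofollowed</a>'
             '<a href="http://example.com/b">followed</a>')
print(scanner.links)
# → [('http://example.com/a', True), ('http://example.com/b', False)]
```

This only covers link-level nofollow; a page can also be nofollowed wholesale via a robots meta tag or an X-Robots-Tag header, which a toolbar will usually surface for you.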
Related Questions
-
SEO'ing a sports advice website
Hi Team Moz,

Despite being in tech/product development for 10+ years, I'm relatively new to SEO (and completely new to this forum), so I was hoping for community advice before I dive in to see how Google likes (or perhaps doesn't) my soon-to-be-built content. I'm building a site (BetSharper, an early-stage work in progress) that will deliver practical, data-oriented predictive advice prior to sporting events commencing. The initial user personas I am targeting would need advice on specific games, so, as an example, I would build a specific page for the upcoming Stanley Cup Game 1 between the Capitals and the Tampa Bay Lightning.

I'm in the midst of keyword research and believe I have found some easier-to-achieve initial keywords (I'm realistic, building my DA will take time!) that include the team names but don't reference dates or the state of the tournament. The question is: hypothetically, if I ranked this page for this sporting event this year, would it make sense to refresh the same page with 2019 matchup content when they meet again next year, or to create a new page? I am assuming I would be targeting the same intended keywords, but I'm wondering if I get Google credit for 2018 engagement post-2019 refresh. Or should I start fresh with a new page and specifically target keywords afresh each time? I read some background info on canonical tags but wasn't sure if it was relevant in my case.

I hope I've managed to articulate myself on what feels like an edge case within the wonderful world of SEO. Any advice the community delivers would be much appreciated...... Kind Regards, James.
Intermediate & Advanced SEO | | JB19770 -
How to Evaluate Original Domain Authority vs. Recent 'HTTPS' Duplicate for Potential Domain Migration?
Hello Everyone,

So our site has used 'http' for the domain since the start. Everything has been set up for this structure and Google is only indexing these pages. Just recently a second version was created on 'https'. We know having both up is the worst-case scenario, but now that both are up, is it worth just switching over, or would the original domain authority warrant keeping it on 'http' and redirecting the 'https' version? Assume speed and other elements wouldn't be an issue and it's done correctly. Our thought was that if we could do this quickly, it would be easier to just redirect the 'https' version, but we were not sure if the pros of 'https' would be worth the resources.

Any help or insight would be appreciated. Please let us know if there are any further details we could provide that might help. Looking forward to hearing from all of you! Thank you in advance for the help. Best,
Intermediate & Advanced SEO | | Ben-R1 -
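Whichever version is kept, the standard mechanism is a site-wide 301 from the retired scheme to the other. A minimal sketch for Apache, assuming mod_rewrite is enabled and an .htaccess setup (directives would need adapting to the actual server), redirecting every http request to its https equivalent:

```apache
# .htaccess — 301 every http request to the same path on https
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

With permanent redirects in place, the accumulated authority consolidates on the surviving version rather than being split across the two.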
Can you create town focused landing pages for a website without breaking Google guidelines?
I recently watched a webmaster video that said that town focused landing pages are seen as doorway pages if they only exist to capture search traffic. And then I read that just because you can sell your product/service in a certain area, doesn't mean you can have a page for it on your website. Is it possible to create town focused landing pages for a website without breaking Google guidelines?
Intermediate & Advanced SEO | | Silkstream1 -
Tool that can retrieve my site's URLs
Hi, I am looking for a tool that can retrieve my site's URLs. I am not talking about href, Open Site Explorer, Majestic, etc. I have a list of 1,000 site URLs where my site name is mentioned. I want to get the exact URL of my site next to each URL I query. Example: http://moz.com/community is a URL I have, and if this page mentions my site name, then I need the complete URL captured. Any software or tool that can do this? I used one for sure which got me this info, but now I don't remember it. Thanks
Intermediate & Advanced SEO | | mtthompsons0 -
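Short of remembering the original tool, this is also small enough to script. A rough Python sketch of the matching step, standard library only; `mysite.com` and the commented fetching loop are illustrative placeholders, not a specific product:

```python
import re

def extract_matching_urls(html, site_name):
    """Return every URL in the HTML whose address contains the given site name."""
    urls = re.findall(r'https?://[^\s"\'<>]+', html)
    return [u for u in urls if site_name.lower() in u.lower()]

# Illustrative loop: read your 1,000 page URLs, fetch each, record the matches.
# import urllib.request
# for page in open("pages.txt").read().split():
#     html = urllib.request.urlopen(page).read().decode("utf-8", errors="replace")
#     print(page, extract_matching_urls(html, "mysite.com"))

print(extract_matching_urls(
    '<a href="http://moz.com/community">Moz</a> '
    '<a href="http://mysite.com/page">my link</a>', "mysite.com"))
# → ['http://mysite.com/page']
```

Writing the (page URL, matched URL) pairs out to a CSV then gives exactly the "complete URL captured next to the URL queried" output described above.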
I'm having an extremely hard time with SERPs despite my best efforts. Can someone help?
My site is www.drupalgeeks.org. Our traffic is going up but our SERPs are not. We simply don't rank for any of our targeted keywords. I have covered nearly every white-hat SEO strategy possible. Our site has a great social presence (Facebook, Twitter, LinkedIn, Pinterest), we write blogs regularly, and we even guest blog. We have a YouTube channel and an RSS feed. We've cleaned up page speed times, set 301 redirects, and checked for duplicate content. We use Bing and Google webmaster tools and have submitted a sitemap. We are indexed, and webmaster tools see our keywords as relevant in our content. We have a robots.txt file configured properly.

The only thing I can think of is that our services pages also display (as a truncated summary) on our homepage. Could this be considered duplicate content, and is this causing a problem? Is there anything else we can do? Or are we missing something vital? We thank you in advance for your help! Candice
Intermediate & Advanced SEO | | candylotus0 -
What if you can't navigate naturally to your canonicalized URL?
Assume this situation for a second... Let's say you place a rel=canonical tag on a page and point to the original/authentic URL. Now, let's say that that original/authentic URL is also populated into your XML sitemap. So, here's my question: since you can't actually navigate to that original/authentic URL (it still loads with a 200, it's just not actually linked to from within the site itself), does that create an issue for search engines?

Last consideration: the bots can still access those pages via the canonical tag and the XML sitemap; it's just that the user wouldn't be able to access those original/authentic pages in their natural site navigation. Thanks, Rodrigo
Intermediate & Advanced SEO | | AlgoFreaks0 -
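A practical sanity check for a setup like this is to confirm that the canonical URL each page declares really resolves with a 200. A minimal Python sketch of the extraction half, standard library only; the HTML string is illustrative:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Grab the href of the first <link rel="canonical"> tag in the page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Keep only the first canonical declaration; duplicates are a bug anyway.
        if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
            self.canonical = attrs.get("href")

finder = CanonicalFinder()
finder.feed('<head><link rel="canonical" href="http://example.com/original"></head>')
print(finder.canonical)
# → http://example.com/original
```

From there, requesting each extracted URL and flagging anything that does not return a 200 would catch the orphaned-canonical problem the question describes.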
How to move website to new domain?
We have a website that has run under the same domain name for the past 10 years. We have built up a decent amount of SEO "mojo" (and traffic) over time; however, the original domain name no longer applies to the business model. A little over 1 year ago we started using a new brand name for the website and created a landing page for that domain name. Everything on that landing page links over to pages on the original domain name (to preserve the SEO value that we have built up over the years).

We would like to move all (or most) of the pages/content to the new domain name. Would using 301 redirects be the safest, most effective way of doing this? I have heard of other people doing it this way, and often they will see their traffic drop for a few weeks before it eventually comes back. Anyone else had experience with this? What worked? What didn't? Thanks!
Intermediate & Advanced SEO | | seo-mojo0 -
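For context on the 301 approach asked about above: the usual pattern is a path-preserving redirect from every old URL to its counterpart on the new domain, not a blanket redirect to the homepage, and the temporary dip of a few weeks mentioned above is commonly reported while the redirects are recrawled. A minimal Apache sketch with placeholder domain names, assuming mod_rewrite is available:

```apache
# .htaccess on the old domain — 301 every path to the same path on the new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]
```

Pairing this with a change-of-address notification in the search engines' webmaster tools is the conventional complement to the redirects themselves.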
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place in our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOMoz reading this morning, I came to wonder whether that approach may now be harming us... http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions

Specifically, I'm concerned that a) we're blocking the flow of link juice, and that b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | | kurus0
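For reference, the kind of robots.txt restrictions described above — blocking pagination links and parameter-based variants of result pages — typically look something like the following; the parameter names here are placeholders, not the poster's actual URLs:

```
User-agent: *
# Block parameter-based variants of result pages (sort order, pagination, etc.)
Disallow: /*?sort=
Disallow: /*?page=
Disallow: /*&page=
```

Note that Disallow prevents crawling, not indexing, and links on blocked pages cannot pass equity — which is precisely the link-juice trade-off the question raises.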