What's the best way of crawling my entire site to get a list of NoFollow links?
-
Hi all, hope somebody can help.
I want to crawl my site to export an audit showing:
- All nofollow links (what links, from which pages)
- All external links broken down by follow/nofollow.
I had thought Moz would do it, but that's not in Crawl info. So I thought Screaming Frog would do it, but unless I'm not looking in the right place, that only seems to provide this information if you manually click down each link and view "Inlinks" details.
Surely this must be easy?! Hope someone can nudge me in the right direction...
Thanks....
-
-
Ah fantastic, thank you Mazen! My SF skills are clearly rusty - thanks for your help.
-
In Screaming Frog, you can go to Bulk Export > All Outlinks and export all the outgoing links from every page on your site into a CSV file. Open the file and filter the "Follow" column for False entries.
The All Inlinks report would also show you the same information from the perspective of links coming into pages.
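If the export is large, a short script saves you from filtering manually in a spreadsheet. This is a minimal sketch: the column names ("Source", "Destination", "Follow") match the answer above, but check your own export's header row, since they can vary between Screaming Frog versions.

```python
import csv
import io

def nofollow_links(csv_text):
    """Return (source, destination) pairs where the Follow column is False."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [(row["Source"], row["Destination"])
            for row in reader
            if row["Follow"].strip().lower() == "false"]

# Tiny inline sample standing in for the real All Outlinks export file.
sample = """Source,Destination,Follow
https://example.com/,https://partner.com/,False
https://example.com/,https://example.com/about,True
"""

for src, dst in nofollow_links(sample):
    print(f"{src} -> {dst}")
```

In practice you would read the exported CSV from disk instead of the inline sample.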
Related Questions
-
Best way to link to 1000 city landing pages from an index page in a way that Google follows/crawls these links (without building country pages)?
Currently we have direct links to the top 100 country and city landing pages on our index page of the root domain.
Intermediate & Advanced SEO | lcourse
I would like to add, on the index page for each country, a "more cities" link which then dynamically loads (without reloading the page and without redirecting to another page) a list of links to all cities in that country.
I do not want to dilute "link juice" away from my top 100 country and city landing pages on the index page.
I would still like Google to be able to crawl and follow the links to the cities that are loaded dynamically. In this particular case, a typical site hierarchy of country pages linking to all their cities is not an option. Any recommendations on how best to implement this?
When a site's entire URL structure changes, should we update the inbound links pointing to the old URLs?
We're changing our website's URL structure, which means all our site URLs will change. After this is done, do we need to update the old inbound external links to point to the new URLs? The old URLs will be 301 redirected to the new URLs. Many thanks!
Intermediate & Advanced SEO | Jade
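For a restructure like the one described above, it helps to keep the old-to-new mapping in one place and generate the 301 rules from it, so nothing gets missed. A minimal sketch (the mapping and the Apache `Redirect 301` output format are assumptions; adapt the rule format to your own server):

```python
from urllib.parse import urlparse

def redirect_lines(mapping):
    """Turn an {old_url: new_url} mapping into Apache 'Redirect 301' lines."""
    lines = []
    for old, new in mapping.items():
        old_path = urlparse(old).path or "/"
        lines.append(f"Redirect 301 {old_path} {new}")
    return lines

# Hypothetical example mapping.
mapping = {
    "https://example.com/old-page": "https://example.com/new/old-page/",
}
print("\n".join(redirect_lines(mapping)))
```

With the redirects in place, updating the external links themselves is optional but still worthwhile for the most valuable ones, since redirect chains can break later.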
Something happened within the last 2 weeks on our WordPress-hosted site that created "duplicates" by counting www.company.com/example and company.com/example (without the 'www.') as separate pages. Any idea what could have happened, and how to fix it?
Our website runs on WordPress. We've been running Moz for over a month now. Only recently, within the past two weeks, have we been alerted to over 100 duplicate pages. It appears something created a duplicate of every single page on our site: "www.company.com/example" and "company.com/example." Again, according to Moz, this is a recent issue. I'm almost certain that prior to a couple of weeks ago, both forms of the URL existed and pointed to the same page without being counted as duplicates. Thanks for your help!
Intermediate & Advanced SEO | wzimmer
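On the www/non-www question above, the usual fix is a site-wide 301 to one canonical host (in WordPress, the Site Address setting plus the redirect generally handles it). Before fixing it, one way to confirm the scale of the split is to normalize crawled URLs to a single host and look for collisions. A sketch, assuming the www form is preferred:

```python
from urllib.parse import urlparse, urlunparse

def canonical(url, prefer_www=True):
    """Normalize a URL to one host form so www/non-www pairs collapse together."""
    parts = urlparse(url)
    host = parts.netloc
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    elif not prefer_www and host.startswith("www."):
        host = host[len("www."):]
    return urlunparse(parts._replace(netloc=host))

print(canonical("https://company.com/example"))
```

Grouping a crawl export by `canonical(url)` shows exactly which pages exist under both hosts.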
Stock lists - follow or nofollow?
A bit of a catch-22 here that I could use some advice on, please! We look after a few car dealership sites with daily stock feeds (some running three times a day) that add and remove cars from the site, which in turn creates and removes a page for each vehicle. We all know how much search engines like sites whose content is updated regularly, but at this frequency we are left with lots of indexed pages that no longer exist. My question: should I nofollow/disallow robots on all the vehicle detail pages, so that the list pages are still updated daily for "new content", or allow Google to index everything and manage the errors by redirecting to relevant pages? Is there a "best practice" way to do this, or is it really personal preference?
Intermediate & Advanced SEO | ben_dpp
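For the stock-feed question above, one common policy (an assumption on my part, not the only valid answer) is: 301 an expired vehicle URL to its closest still-live list page when one exists, and otherwise return 410 Gone, which tends to drop out of the index faster than a plain 404. Sketched as a tiny decision function with hypothetical names:

```python
def expired_stock_response(closest_list_page):
    """Pick an HTTP response for a vehicle page whose stock entry is gone.

    closest_list_page: URL path of a relevant live list page, or None.
    Returns (status_code, redirect_target).
    """
    if closest_list_page:
        # Send users and any link equity somewhere genuinely relevant.
        return 301, closest_list_page
    # No good match: tell crawlers the page is gone for good.
    return 410, None

print(expired_stock_response("/used-cars/audi/"))
print(expired_stock_response(None))
```

Blanket-redirecting every expired car to the homepage is best avoided; Google tends to treat irrelevant mass redirects as soft 404s anyway.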
Best way to start a fresh site from a penalized one
Dear all, I have been dealing with a penalized domain (Penguin, Panda): hundreds of spammy links (disavowed with no success), thin content partially resolved, and so on. I think the best way forward is to start a fresh domain, but we want to reuse some of the well-written content from the old (penalized) site. To do this, I will mark the source (penalized) pages as NOINDEX and move the content to the new domain. Question: do you think this is a safe approach, or do you know a better strategy? I'd appreciate your point of view. Thank you
Intermediate & Advanced SEO | SharewarePros
Add noindex,nofollow prior to removing pages resulting in 404's
We're working with another site that, due to how their website has been programmed, creates a bit of a mess. Whenever an employee removes a page through their homegrown 'content management system', rather than 301'ing it to another location on the site, the page is deleted and results in a 404. The interim question, until they implement a better way of managing the website, is: should they first add noindex,nofollow to pages scheduled for removal, and then, once removed, let them become 404s? Of note, it is possible that some of these pages will be used again in the future, and I would imagine they could submit them to Google through Webmaster Tools and add the pages back to their sitemap.
Intermediate & Advanced SEO | Prospector-Plastics
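For the interim noindex step described above, one low-touch option (a sketch with assumed names, not their CMS's actual API) is to send an `X-Robots-Tag` response header for pages queued for deletion, so crawlers see the directive without anyone editing each page template:

```python
# Hypothetical set of paths the CMS has queued for deletion.
PENDING_REMOVAL = {"/old-widget/", "/discontinued-line/"}

def response_headers(path):
    """Base headers, plus a noindex directive for pages queued for removal."""
    headers = {"Content-Type": "text/html; charset=utf-8"}
    if path in PENDING_REMOVAL:
        # Equivalent to a <meta name="robots"> tag, but set server-side.
        headers["X-Robots-Tag"] = "noindex, nofollow"
    return headers

print(response_headers("/old-widget/"))
```

The header and the meta tag are treated the same way by Google; the header is simply easier to bolt onto a homegrown CMS.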
Why are indented listings coming up from our old site?
We recently redesigned our e-commerce site, submitted the new sitemap, and fetched as Googlebot, but old site pages are still coming up as indented results under our homepage in Google. The new meta description is showing for the homepage, but the Quilt Guard page is appearing in the indented results and leads to a 404 error because that page no longer exists on the new site. Is there any way to control the indented results/pages that show up? http://www.google.com/webhp?source=search_app#hl=en&output=search&sclient=psy-ab&q=sleep+city&oq=sleep+city&gs_l=hp.3..0l4.3702.1178235.0.1178474.21.19.1.1.1.1.613.3532.0j11j3j0j1j1.16.0...0.0...1c.tBBERBg0aIo&pbx=1&bav=on.2,or.r_gc.r_pw.r_qf.,cf.osb&fp=dddf8a6dafb67f55&biw=1280&bih=923
Intermediate & Advanced SEO | mmgmontana
Best multi-language site strategy?
When reading about multi-language site structure, the general advice is that there are two accepted ways of doing it:
1. Assign one domain per region/language: www.domain.fr, www.domain.de, www.domain.co.uk, etc. If a country has more than one language, such as Switzerland, you can create folders for those languages: www.domain.ch/fr (French), www.domain.ch/de (German).
2. Use a single domain, www.domain.com, for the whole site and create folders per language/region: www.domain.com/fr, www.domain.com/uk, etc. If a language is spoken in more than one country, you can create subfolders: www.domain.com/fr-ch (French in Switzerland), www.domain.com/de-ch (German in Switzerland).
At first sight, it seems that option 1 is the right one. However, sites such as www.apple.com use option 2. I am unable to decide... what would you recommend? Any objective criteria?
Intermediate & Advanced SEO | hockerty
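Whichever structure is chosen for the multi-language question above, the language/region variants should point at each other with hreflang annotations; that, more than domain-vs-folder, is what tells Google which version to serve to whom. A minimal sketch generating the link tags (the URLs are placeholders):

```python
def hreflang_tags(alternates):
    """Build rel=alternate hreflang link tags from a {lang_code: url} mapping."""
    return [f'<link rel="alternate" hreflang="{code}" href="{url}" />'
            for code, url in alternates.items()]

# Placeholder URLs following the folder-per-region pattern from option 2.
alternates = {
    "fr-ch": "https://www.domain.com/fr-ch/",
    "de-ch": "https://www.domain.com/de-ch/",
    "x-default": "https://www.domain.com/",
}
for tag in hreflang_tags(alternates):
    print(tag)
```

Each variant page should emit the full set of tags, including one pointing at itself, plus an `x-default` for unmatched visitors.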