How to identify the number of internal links to a page?
-
Hi Guys,
Besides OSE and Screaming Frog, are there any tools that can check internal links to a page?
I know Ahrefs and Majestic cannot.
Cheers.
-
If you want to go with a free tool, you can also try Xenu's Link Sleuth (http://home.snafu.de/tilman/xenulink.html).
Just run a full crawl with the tool and export the page map to a tab-separated file. Then you can open that file in Excel (or any similar software). It should do the job.
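To illustrate the counting step: once you have the tab-separated link export, tallying inbound internal links to a page is also easy to script. A minimal sketch — the `Source`/`Destination` column names and the example paths are assumptions, so match them to whatever your crawler actually exports:

```python
import csv
import io

def count_internal_links(tsv_text, target_url,
                         source_col="Source", dest_col="Destination"):
    """Count rows in a crawler's link export that point at target_url.

    Column names are assumed; adjust them to your export's header row.
    Self-links (a page linking to itself) are excluded from the tally.
    """
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    return sum(
        1 for row in reader
        if row[dest_col] == target_url and row[source_col] != target_url
    )

# Tiny made-up export: two other pages link to /page, plus one self-link.
export = "Source\tDestination\n/a\t/page\n/b\t/page\n/page\t/page\n/b\t/other\n"
print(count_internal_links(export, "/page"))  # 2
```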
-
Hi Wozniak65,
OSE is not the right tool for checking internal links, and I'm not sure why you would want an alternative to Screaming Frog for this.
If you want to use the best tool for the job, use Screaming Frog.
Cheers,
David
-
Check this list and see which one is useful for you:
http://smallseotools.com/website-links-count-checker/
http://tools.seochat.com/tools/page-link-analyzer-seo/
https://www.webceo.com/internal-link-analysis-tool.htm
-
Related Questions
-
Would You Redirect a Page if the Parent Page was Redirected?
Hi everyone! Let's use this as an example URL: https://www.example.com/marvel/avengers/hulk/ We have done a 301 redirect for the "Avengers" page to another page on the site. Sibling pages of the "Hulk" page live off "marvel" now (e.g. /marvel/thor/ and /marvel/iron-man/). Is there any benefit in doing a 301 for the "Hulk" page so it lives at /marvel/hulk/ like its sibling pages? Is there any harm long-term in leaving the "Hulk" page under a permanently redirected page? Thank you! Matt
Intermediate & Advanced SEO | amag0
-
What Are Internal Linking Best Practices For Blogs?
We have a blog for our e-commerce site. We are posting about 4-5 blog posts a month, most of them 1500+ words. Within the content, we have around 10-20 links pointing out to other blog posts or products/categories on our site. Except for the products/categories, the links use non-optimized generic anchor text (e.g. guide, sizing tips, planning resource). Are there any issues or problems, as far as SEO is concerned, with this practice? Thank you
Intermediate & Advanced SEO | kekepeche0
-
Should we optimise our internal links?
Hi again, We recently had a technical search audit done by a specialist agency, and they discovered a number of internal links that caused redirects to happen. The agency has recommended we update all of these links to point directly to the destination so we don't lose out on link equity. We'd just like to know if you think this would be a worthwhile use of our time. Our web team seem to think that returning a 301 to a crawler means the crawler will stop indexing the original URL and instead index the redirect destination. Thanks all. Clair
Intermediate & Advanced SEO | iescape2
-
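For anyone wanting to reproduce that audit finding themselves, the check is simple: request each internal link target without following redirects and flag anything that answers 3xx. A minimal sketch — the HTTP fetcher is injected as a callable so the example runs without a network; in practice you would wrap something like `requests.head(url, allow_redirects=False)`:

```python
def find_redirecting_links(links, fetch_status):
    """Return {url: redirect_target} for internal links that redirect.

    links: iterable of internal link URLs found during a crawl.
    fetch_status: callable url -> (status_code, Location header or None).
    """
    flagged = {}
    for url in links:
        status, location = fetch_status(url)
        if status in (301, 302, 307, 308) and location:
            flagged[url] = location
    return flagged

# Fake responses standing in for real HTTP requests (URLs are made up):
responses = {
    "/old-page": (301, "/new-page"),
    "/fine-page": (200, None),
}
print(find_redirecting_links(list(responses), lambda u: responses[u]))
# {'/old-page': '/new-page'}
```

Each flagged link can then be updated in the CMS to point straight at its destination, which is what the agency recommended.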
Links: Links come from bizarre pages
Hi all, My question is related to links that I saw in Google Search Console. While looking at who is linking to my site, I saw that GSC lists some links coming from third-party websites, but these third-party webpages are not indexed and were not even put up by their owners. It looks like the owner never created these pages; they are not indexed (when you do a site: search in Google), but the URL loads content in the browser. Example - www.samplesite1.com/fakefolder/fakeurl What exactly is this thing? To add more detail, the third-party website in question is a WordPress website, and I guess it has probably been hijacked. But how does one even get these types of pages/URLs up and running on someone else's website and then link out to other websites? I am concerned, as the content that I am getting a link from is adult content, and I will have to do some link cleansing soon.
Intermediate & Advanced SEO | Malika10
-
I've got duplicate pages. For example, blog/page/2 is the same as author/admin/page/2. Is this something I should just ignore, or should I create the author/admin/page/2 and then 301 redirect?
I'm going through the crawl report and it says I've got duplicate pages. For example, blog/page/2 is the same as author/admin/page/2/. Now, the author/admin/page/2 I can't even find in WordPress, but it is the same thing as blog/page/2 nonetheless. Is this something I should just ignore, or should I create the author/admin/page/2 and then 301 redirect it to blog/page/2?
Intermediate & Advanced SEO | shift-inc0
-
How to handle broken links to phantom pages appearing in webmaster tools
Hi, Would love to hear different experiences and thoughts on this one. We have a site that is plagued with 404s in Webmaster Tools. A significant number of them have never existed; for instance, affiliates have linked to them with the wrong URL, or scraper sites have linked to them with a truncated version of the URL and an ellipsis, e.g. /my-nonexistent... What's the best way to handle these? If we do nothing and mark them as fixed, they reappear in the broken-links report. If we 301 redirect and mark as fixed, they reappear. We tried 410 (gone forever) and marking as fixed; they re-appeared. We have a lot of legacy broken links, and we would really like to clean up our WMT broken-link profile. Does anyone know of a way we can make these links to non-existent pages disappear once and for all? Many thanks in advance!
Intermediate & Advanced SEO | dancape0
-
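On the serving side, one common way to answer 410 consistently (as the poster above tried) is a batch of Apache mod_alias `Redirect gone` rules, which are easy to generate from the WMT export. A sketch, assuming an Apache server — nginx would use `return 410;` in a location block instead — and the example paths are made up. Note that, as the thread shows, even a correct 410 can keep reappearing in the report for a while:

```python
def gone_rules(paths):
    """Emit one Apache mod_alias `Redirect gone` rule per phantom path.

    Duplicates are collapsed and the output is sorted for stable diffs.
    """
    return "\n".join(f"Redirect gone {p}" for p in sorted(set(paths)))

# Hypothetical phantom URLs pulled from a Webmaster Tools 404 export:
print(gone_rules(["/my-nonexistent-page", "/truncated-url", "/my-nonexistent-page"]))
```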
Internal Linking from Menu or Body Text or Both with Exact-Match Keywords?
I used to have my menu link to every page with my exact-match keywords. I am a magician and have pages for each county/town, so I had a link to /magician-hampshire with the anchor text "Magician Hampshire" in the menu. I recently had my website updated, and the developer told me it was very spammy to have a menu that said Magician Hampshire, Magician Surrey, Magician Berkshire. He suggested that I should instead have a menu structure that says Areas Covered > Hampshire - Surrey - Berkshire, etc. Google will know my website is about a magician and relate the two together. Is this correct, or should I revert my menu back to anchor text of Magician (County)? I am running WordPress, and he said the title attribute can say Magician Hampshire, but the visible text is for the user, not Google. I also use the technique of doing site:rogerlapin.co.uk magician hampshire, seeing the top 10 pages Google has for me, and placing a text link from each of these pages in the body text. When doing link analysis I now see I have two links to each page, but I understand that Google will only account for the first one (from the menu). Questions:
Intermediate & Advanced SEO | rnperki
Should I link to every main page from the menu with the exact anchor text?
Does Google only take into account the first link to a page it discovers?
Will it associate a link whose text is just the county (Berkshire) with "Magician in Berkshire", as that is what the page is about? A few years ago I used to have at the bottom of each page Magician Hampshire | Magician Surrey | Magician Berkshire | Magician Sussex links, and to date a lot of other magicians employ this same technique. I was told Google would slap them for it, but so far it has not, and it seems to be working for them. Many thanks, Roger http://www.rogerlapin.co.uk
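On the "does Google only count the first link" question — that rule is SEO folklore rather than anything Google documents, but if you want to see which anchor would count under it, extracting the first anchor per URL in document order is straightforward. A sketch with Python's standard-library HTML parser (the example markup is made up):

```python
from html.parser import HTMLParser

class FirstLinkFinder(HTMLParser):
    """Record the first anchor (href and its text) for each distinct href."""

    def __init__(self):
        super().__init__()
        self.first = {}    # href -> anchor text of the FIRST link to it
        self._href = None  # href of the anchor currently being read
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            # Only start collecting if we haven't seen this href before.
            if href is not None and href not in self.first:
                self._href = href
                self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.first[self._href] = "".join(self._text).strip()
            self._href = None

html = ('<a href="/hampshire">Magician Hampshire</a> some body text '
        '<a href="/hampshire">Hampshire</a>')
p = FirstLinkFinder()
p.feed(html)
print(p.first)  # {'/hampshire': 'Magician Hampshire'}
```

Under the first-link theory, only the menu anchor ("Magician Hampshire" here) would pass anchor-text weight; the later body link would not.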
Google consolidating link juice on duplicate content pages
I've observed some strange findings on a website I am diagnosing, and it has led me to a theory that seems to fly in the face of a lot of thinking. My theory is:
Intermediate & Advanced SEO | James77
When Google sees several duplicate-content pages on a website and decides to show just one version of the page, it at the same time aggregates the link juice pointing to all the duplicate pages, and ranks the one version it decides to show as if all the link juice pointing to the duplicate versions were pointing to that one version. E.g.:
Link X -> Duplicate Page A
Link Y -> Duplicate Page B

Google decides Duplicate Page A is the most important and applies the following formula to decide its rank:

Link X + Link Y (minus some dampening factor) -> Page A

I came up with this idea after I seem to have reverse-engineered it: the website I was trying to sort out for a client had this duplicate-content issue, so we decided to put unique content on Page A and Page B (not just one pair like this, but many). Bizarrely, after about a week all the Page A's dropped in rankings, indicating that the consolidated link value may have been correctly re-associated with the two now-distinct pages, so Page A would only be getting Link Value X. Has anyone got any tests or analysis to support or refute this?
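The hypothesis boils down to simple arithmetic, which may help others frame a test. A sketch — the link values and the dampening factor are purely illustrative; no one outside Google knows the real function:

```python
def consolidated_rank_value(link_values, dampening=0.15):
    """Hypothetical value the shown duplicate receives: the sum of link
    value pointing at *all* duplicate versions, minus a dampening factor.

    The 0.15 dampening default is an arbitrary placeholder.
    """
    return sum(link_values) * (1 - dampening)

# While Pages A and B were duplicates, A hypothetically ranked on X + Y:
before = consolidated_rank_value([10.0, 5.0])  # Link X + Link Y, dampened
# After unique content was added, A keeps only its own Link X:
after = consolidated_rank_value([10.0])
print(before > after)  # True - consistent with Page A dropping in rankings
```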