Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies. More details here.
What’s the best tool to visualize internal link structure and relationships between pages on a single site?
-
I'd like to review the internal linking structure on my site. Is there a tool that can visualize the relationships between all of the pages within my site?
-
I know this question is quite old, but it is still valid since there has been no realistic solution to this problem.
I developed Link Map Viewer (a Windows app), which can visualize the internal links of your site. I took a totally different approach to visualizing internal link structure.
Please watch this demo video https://www.youtube.com/watch?v=epgXDgpm9Rg for a quick overview.
Or visit my site https://kappa-project.co.jp/seo/en/.
-
OMG, Sitebulb is epic. There actually is an app for everything in the entire universe. Great call, David, thanks.
-
Hi John,
For visualizing internal link data, I think Sitebulb is the only tool that does it automatically.
You can do it with Screaming Frog data, but it's a bit of a process to visualize it (check out this guide if you're interested) - the rough sketch below shows the general idea.
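A minimal sketch of one way to do it in Python (assuming pandas, networkx and matplotlib are installed, and that your "All Inlinks" export uses "Source", "Destination" and "Type" column headers - they may differ by Screaming Frog version):

# Minimal sketch: turn a Screaming Frog "All Inlinks" CSV into a link graph.
# Assumes columns named "Source", "Destination" and "Type" - check your export.
import pandas as pd
import networkx as nx
import matplotlib.pyplot as plt

df = pd.read_csv("all_inlinks.csv")

# Keep only real hyperlinks (drop images, CSS, canonicals, etc.)
# if the export includes a Type column.
if "Type" in df.columns:
    df = df[df["Type"] == "Hyperlink"]

# One directed edge per internal link.
G = nx.DiGraph()
G.add_edges_from(zip(df["Source"], df["Destination"]))

# Size each node by its in-degree, so heavily linked-to pages stand out.
sizes = [20 + 10 * G.in_degree(n) for n in G.nodes()]

nx.draw_spring(G, node_size=sizes, arrows=False, with_labels=False)
plt.savefig("internal-links.png", dpi=150)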
Cheers,
David
-
Yep, Screaming Frog SEO Spider is great for getting the data into Excel. However, it's not great for visualising it. I'd go low-tech and use Word or a pen and pad!
It's all about topics, so write all your main topics out in bubbles with a little cloud of subtopics around each one, and draw lines between the ones that are relevant to each other. Like when you're planning an essay at school. It really is that simple.
Keep it simple!
Hope this helps.
Ed.
-
Hello John,
What you are looking for is Screaming Frog. Remember to take a look at their User Guide.
The free version allows you to crawl up to 500 URLs, though it's a really affordable tool anyway (just £149/year).
Best of luck.
GR.
Related Questions
-
Can I safely assume that links between subsites on a subdirectory-based multisite will be treated as internal links within a single site by Google?
I am building a multisite network based on subdirectories (of the mainsite.com/site1 kind), where the main site is like a company site and the subsites are focused on brands or projects of that company. There will be links back and forth between the main site and the subsites, as if the subsites were just categories or pages within the main site (they are hosted in subfolders of the main domain, after all).

Now, Google's John Mueller has said: "As far as their URL structure is concerned, subdirectories are no different from pages and subpages on your main site. Google will do its best to identify where sites are separate [...] but [where] the [...] is the same [as] for a single site, you should assume that [for] SEO purposes, [the] network [will] be treated [as] one [site]."

This sounds fine to me, except for the part "Google will do its best to identify where sites are separate", because if Google establishes that my multisite structure is actually a collection of different sites, links between the subsites and the main site would be considered backlinks between my own sites, which could therefore be considered a link wheel - a kind of linking structure Google doesn't like. How can I make sure that Google understands my multisite as a single site?

P.S. - The reason I chose this multisite structure, instead of hosting brands in categories of the main site, is that the subdirectory-based multisite feature will let me map a TLD domain to any of my brands (subsites) whenever I choose to give that brand a more distinct profile, as if it really were a different website.
Web Design | PabloCulebras
-
Lots of Listing Pages with Thin Content on Real Estate Web Site - Best to Set them to No-Index?
Greetings Moz Community: As a commercial real estate broker in Manhattan I run a web site with over 600 pages. Basically the pages are organized in the following categories:

1. Neighborhoods (Example: http://www.nyc-officespace-leader.com/neighborhoods/midtown-manhattan) - 25 pages, low bounce rate
2. Types of Space (Example: http://www.nyc-officespace-leader.com/commercial-space/loft-space) - 15 pages, low bounce rate
3. Blog (Example: http://www.nyc-officespace-leader.com/blog/how-long-does-leasing-process-take) - 30 pages, medium/high bounce rate
4. Services (Example: http://www.nyc-officespace-leader.com/brokerage-services/relocate-to-new-office-space) - 3 pages, high bounce rate
5. About Us (Example: http://www.nyc-officespace-leader.com/about-us/what-we-do) - 4 pages, high bounce rate
6. Listings (Example: http://www.nyc-officespace-leader.com/listings/305-fifth-avenue-office-suite-1340sf) - 300 pages, high bounce rate (65%), thin content
7. Buildings (Example: http://www.nyc-officespace-leader.com/928-broadway) - 300 pages, very high bounce rate (exceeding 75%)

Most of the listing pages do not have more than 100 words. My SEO firm is advising me to set them to "No-Index, Follow". They believe the thin content could be hurting me. Is this an acceptable strategy? I am concerned that when Google detects 300 pages set to "No-Index, Follow" it could interpret this as the site seeking to hide something and penalize us.

Also, the building pages have a low click-through rate. Would it make sense to set them to "No-Index, Follow" as well? Basically, would it increase authority in Google's eyes if we set pages that have thin content and/or low click-through rates to "No-Index, Follow"? Any harm in doing this for about half the pages on the site?

I might add that while I don't suffer from any manual penalty, volume has gone down substantially in the last month. We upgraded the site in early June and somehow 175 pages that should not have been indexed were submitted to Google; a removal request has been made for those pages. Prior to that we were hit by Panda in April 2012, with search volume dropping from about 7,000 per month to 3,000 per month. Volume had climbed back to 4,500 by April this year, only to start tanking again - it was down to 3,600 in June. About 30 toxic links were removed in late April, and a disavow file was submitted to Google in late April for removal of links from 80 toxic domains.

Thanks in advance for your responses!! Alan
Web Design | Kingalan1
-
Can anyone recommend a tool that will identify unused and duplicate CSS across an entire site?
Hi all, So far I have found this one: http://unused-css.com/. It looks like it identifies unused CSS, but perhaps not duplicates? It also has a 5,000-page limit and our site is 8,000+ pages... so we really need something that can handle a site larger than their limit.

I do have Screaming Frog. Is there a way to use Screaming Frog to locate unused and duplicate CSS? Any recommendations and/or tips would be great. I am also aware of the Firefox extensions, but to my knowledge they will only do one page at a time? Thanks!
Web Design | danatanseo
-
How to make sure category pages rank higher than product pages?
Hi, This question is e-commerce related. We have product categories dividing products by color. Let's say we have the category 'blue toy cars' and a product called 'blue toy car racer'; both of these could rank for the keyword 'blue toy car'. How do we make sure the category 'blue toy cars' ranks above the product 'blue toy car racer'? Or is the category page automatically ranked higher because of the higher page authority of that page? Alex
Web Design | WebmasterAlex
-
URL structure for multiple cities?
Hi, I am in the process of setting up a business directory site that will be used in a number of cities, though I am initially launching with only one city. My question is: what is the best URL structure to use for the site, and should I start using that structure from day one?

At the moment I am using www.mysite.com.au as my primary website, and it contains all listings for the one initial launch city. To plan for the future I was considering this URL structure: www.mysite.com.au/cityname. So, for example, if I launch in Sydney initially, all traffic to www.mysite.com.au would simply be redirected (302 temp redirect?) to www.mysite.com.au/sydney. When I expand to other cities, www.mysite.com.au would become a "select your city" screen that redirects to the city of choice (similar to the www.groupon.com page).

How would doing a 302 redirect from www.mysite.com.au to www.mysite.com.au/city impact SEO for the initial launch? Or should I just keep everything on the root domain since no other cities exist at the moment?
Web Design | adamkirk
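For what it's worth, the "redirect the root to the only city" idea in the question above is easy to prototype. Here's a minimal, purely hypothetical sketch in Flask (any server or CMS can issue the same redirect); the point is the 302 status code, which signals the root's behaviour is temporary:

# Hypothetical sketch of the asker's plan: 302 the root to the only live
# city until a "select your city" screen exists. Flask is just for
# illustration - the same redirect can be configured in any web server.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/")
def root():
    # 302 = temporary redirect: the root may later serve a city picker,
    # so we deliberately avoid a permanent (301) redirect here.
    return redirect("/sydney", code=302)

@app.route("/sydney")
def sydney():
    return "Sydney listings"

if __name__ == "__main__":
    app.run()
-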
Where is the best place to put reciprocal links on our website?
Where should reciprocal links be placed on our website? Should we create a "Resources" page? Should the page be "hidden" from the public? I know there is a right answer out there! Thank you for your help! Jay
Web Design | theideapeople
-
Site-wide footer links or single "website credits" page?
I see that you have already answered this question back in 2007 (http://www.seomoz.org/qa/view/2163), but I wanted to ask your current opinion on the same question: should I add a site-wide footer link on my client websites pointing to my website, or should I create a "website credits" page on my clients' sites, add that to the footer, and then link from within that page out to my website?
Web Design | eseyo
-
How is link juice split between navigation?
Hey All, I am trying to understand link juice as it relates to duplicate navigation.

Take for example a site that has a main navigation contained in dropdowns with 50 links (fully crawlable and indexable), and that navigation is then repeated in the footer of the page, so you have a total of 100 links with the same anchor text and URL. For simplicity's sake, will the link juice be divided among those 100 links and passed to the corresponding pages, or does the "1st link rule" still apply, so that only half of the link juice is passed?

What I am getting at is: if there were only one navigation menu and the page were passing 50 link juice units, then each of the subpages would get passed 1 link juice unit, right? But if the menu is duplicated, the possible link juice is divided by 100, so only 0.5 units are being passed through each link. However, because there are two links pointing to the same page, is there a net of 1 unit?

We have several sites that do this for UX reasons, but I am trying to figure out how badly this could be hurting us in page sculpting and passing juice to our subpages. Thanks for your help! Cheers.
Web Design | prima-253509
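For what it's worth, under the simple equal-split model the question describes (every link passes an equal share of the page's juice - a modeling assumption, not confirmed Google behaviour), the arithmetic nets out the same either way:

One menu (50 links):        50 units / 50 links  = 1.0 unit per link, 1 link per subpage -> 1.0 unit
Duplicated menu (100 links): 50 units / 100 links = 0.5 units per link, 2 links per subpage -> 2 x 0.5 = 1.0 unit

So under that model the duplication halves what each individual link carries but not what each subpage receives; whether the "1st link rule" changes that split is exactly the open question the poster raises.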