"link_count" column in Crawl Diagnostics report
-
On the Crawl Diagnostics report, does "link_count" represent external (links to this URL), internal, both, or what ?
-
Rock and roll!
Glad you got it all figured out, Glenn.
Mike
-
OK, I think I get it.
For the URL in question, the "link_count", Title, and Meta Description exactly match the custom 404 page, so it looks like there is no page at this URL. The reason it was picked up in the crawl is that a link to it exists on the "referrer" page.
If I get them to correct the referrer page, this should be good.
(First day using SEOMoz Pro)
Thanks much for your help!
-
The site is returning a custom 404 page with a 200 status code. That is why SEOmoz and Screaming Frog are reporting a 200.
You need to configure that page to return a 404 status code, or fix the page.
This article will hopefully shed some light on your situation.
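If you want to confirm a "soft 404" like this yourself, here is a minimal sketch in Python (the marker text and URL are assumptions; substitute the title your own custom 404 page uses):

```python
from urllib.request import urlopen  # only needed for the live check below

def looks_like_soft_404(status, body, marker="Page Not Found"):
    """Flag a "soft 404": the server answers 200 OK but serves the
    custom error template. A real 404 status is not flagged."""
    return status == 200 and marker in body

# Live check (hypothetical URL -- substitute a page you suspect):
# resp = urlopen("http://www.example.com/missing-page")
# print(resp.status, looks_like_soft_404(resp.status, resp.read().decode()))

print(looks_like_soft_404(200, "<title>Page Not Found</title>"))  # True
print(looks_like_soft_404(404, "<title>Page Not Found</title>"))  # False
```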
Mike
-
Could you give an example of a URL that goes to a 404?
Edit: Never mind, I see it above.
-
Mike - Yep, Screaming Frog also shows a 200 status code. So I have to assume the page exists, although I'm not sure why I'm directed to a 404 page...
So basically, I think you and George answered my original question: "link_count" represents links on a page pointing to other internal and external pages.
I would appreciate any thoughts on why I'm ending up on a 404 page, though...
-
No Mike. This is a client's site. An example of these URLs is: http://www.teamflexo.com/home/contact_us.asp, which shows a link count of 43.
Good thought though; I'll take a look at this in Screaming Frog.
-
Are we talking about your gfwebsoft website that you have listed in your profile?
Using Screaming Frog, the only 404 status codes I am seeing come from the homepage, contact, costs, about, testimonials, and services pages, which point to your Facebook page.
Do you have specific URLs you can share that are 404ing?
Mike
-
If these are on-page links, then I have another question...
I had originally assumed that if a page showed up in Crawl Diagnostics, it must actually exist (as opposed to being a URL in a backlink somewhere). However, there are several URLs showing a "link_count" of 40+ that go directly to a 404 page when visited, even though the "http_status_code" in the diagnostics report shows 200.
Any theories that could help me understand this?
Tx, Glenn
-
It refers to the number of followed links on the page pointing to other pages on your site or other sites.
Source: Using MozBar to compare numbers.
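To see what such a count might include, here is a rough sketch of counting followed links with Python's standard html.parser. This replicates the apparent definition (followed hyperlinks, internal and external), not SEOmoz's actual code:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count followed <a href> links on a page: every hyperlink,
    internal and external alike, skipping rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs and attrs.get("rel") != "nofollow":
            self.count += 1

html = ('<a href="/home">Home</a>'
        '<a href="http://external.example">Out</a>'
        '<a rel="nofollow" href="/skip">Skip</a>')
counter = LinkCounter()
counter.feed(html)
print(counter.count)  # 2 followed links: one internal, one external
```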
-
Hi Glenn,
It looks like those numbers represent the number of hyperlinks (internal and external) on that specific page.
I was able to validate this by taking pages with a link_count of 100+ and verifying the same numbers in the Too Many On-Page Links report in the SEOmoz Crawl Diagnostics.
Hope this helps.
Mike
Related Questions
-
Moz Crawl Report more urls?
Hi. I have used Moz Crawl Test and got my 3,000 URLs crawled, no issue. However, my site has more than that; is it possible to crawl the entire website? A lot of the crawl URLs in the Moz test are search-string URLs and filters, so I've probably wasted about 2,500 URLs on filter URLs. Any advice, or alternative software that won't cost a fortune?
Thanks
Moz Pro | YNWA
-
Can you highlight a list of keywords in reports?
Hello all, My client recently asked me if there was a way to highlight a specific list of keywords to put on the first page of every report. These would be the highest priority keywords that we would always want to know the status of for his site. In the reports section, I am only seeing options for organizing the keywords by rankings improved/declined and comparing to competitors. Would anyone know how to label/categorize this list of keywords and then produce them as part of the monthly reports? Thank you,
Daniel
Moz Pro | Level2Designs
-
Order of urls in SEOMoz crawl report
Is there any rhyme or reason to the order of urls in the SEOMoz crawl report, or are the urls just listed in random order?
Moz Pro | LynnMarie
-
Crawl Diagnosis only crawling 250 pages, not 10,000
My crawl diagnosis has suddenly dropped from 10,000 pages to just 250. I've been tracking and working on an ecommerce website with 102,000 pages (www.heatingreplacementparts.co.uk) and the history for this was showing some great improvements. Suddenly the CD report today is showing only 250 pages! What has happened? Not only is this frustrating to work with as I was chipping away at the errors and warnings, but also my graphs for reporting to my client are now all screwed up. I have a pro plan and nothing has (or should have!) changed.
Moz Pro | eseyo
-
How to remove URLS from from crawl diagnostics blocked by robots.txt
I suddenly have a huge jump in the number of errors in crawl diagnostics and it all seems to be down to a load of URLs that should be blocked by robots.txt. These have never appeared before, how do I remove them or stop them appearing again?
Moz Pro | SimonBond
-
How long is a full crawl?
It's now been over 3 days since the dashboard for one of our campaigns started showing "Next Crawl in Progress!". I am not complaining about the length... but I have to agree that SEOmoz is quite addictive, and it's quite frustrating to see that every day 🙂 Thanks
Moz Pro | jgenesto
-
What causes Crawl Diagnostics Processing Errors in an SEOmoz campaign?
I'm getting the following error when SEOmoz tries to spider my site:
First Crawl in Progress! Processing Issues for 671 pages. Started: Apr. 23rd, 2011.
Here is the robots.txt data from the site (it disallows all bots for the image directories and JPEG files):
User-agent: *
Disallow: /stats/
Disallow: /images/
Disallow: /newspictures/
Disallow: /pdfs/
Disallow: /propbig/
Disallow: /propsmall/
Disallow: /*.jpg$
Any ideas on how to get around this would be appreciated 🙂
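For reference, you can check which URLs a robots.txt like the one above actually blocks with Python's built-in robotparser. The paths tested here are illustrative; note that the stdlib parser treats Disallow values as plain path prefixes, so the /*.jpg$ wildcard line is not interpreted the way Googlebot would interpret it:

```python
from urllib.robotparser import RobotFileParser

# Rules copied from the robots.txt above
rules = """\
User-agent: *
Disallow: /stats/
Disallow: /images/
Disallow: /newspictures/
Disallow: /pdfs/
Disallow: /propbig/
Disallow: /propsmall/
Disallow: /*.jpg$
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Directory rules match by prefix:
print(rp.can_fetch("*", "/images/logo.gif"))  # False: /images/ is disallowed
print(rp.can_fetch("*", "/contact"))          # True: no rule matches
```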
Moz Pro | cmaddison
-
Crawl Diagnostics bringing 20k+ errors as duplicate content due to session ids
Signed up for the trial version of SEOmoz today just to check it out, as I have decided I'm going to do my own SEO rather than outsource it (been let down a few times!). So far I like the look of things and have a feeling I am going to learn a lot and get results. However, I have just stumbled on something. After SEOmoz does its crawl diagnostics run on the site (www.deviltronics.com), it is showing 20,000+ errors. From what I can see, almost 99% of these are being flagged as duplicate content errors due to session IDs, so I am not sure what to do! I have done a "site:www.deviltronics.com" on Google and this certainly doesn't pick up the session IDs/duplicate content. So could this just be an issue with the SEOmoz bot? If so, how can I get SEOmoz to ignore these on the crawl? Can I get my developer to add some code somewhere? Help will be much appreciated. Asif
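To illustrate the problem, URLs like these differ only by a session-ID query parameter, and collapsing them to one canonical URL removes the duplication. A rough sketch (the parameter names below are common guesses, not confirmed for this site):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical session-parameter names -- check your own crawl URLs
SESSION_PARAMS = {"sessionid", "sid", "phpsessid"}

def canonicalize(url):
    """Strip session-ID query parameters so crawl URLs that differ only
    by session token collapse to a single canonical URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in SESSION_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonicalize("http://www.example.com/product?id=7&PHPSESSID=abc123"))
# -> http://www.example.com/product?id=7
```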
Moz Pro | blagger