SEOmoz Campaign shows Warnings for pages with >200 and <300 links
-
We currently use SEOmoz's campaign tool to review our site's SEO progress. One thing we are unsure of: SEOmoz gives us a warning for over 1,000 of our pages because those pages have around 200 links (all in the menu drop-downs). A while ago I read the post and watched the Whiteboard Friday video on flat site architecture, and Rand mentioned there is no issue with having a web page with 200 to 300 links; he even encouraged it. So why do these show up as warnings in our campaign?
-
But SEOmoz also recommends going up to 300 links per page, per the Whiteboard Friday video I linked to.
I wish I could get the page below 100 links. Unfortunately, the drop-down menus are two layers deep and I am not sure how to reduce their size as seen by web crawlers. It also does not seem to be affecting our site negatively.
-
SEOmoz recommends keeping them below 100 because Google starts out by recommending that links on a page be kept to a reasonable number, which it classifies as below 100. This rule of thumb is somewhat outdated and traces back to when spiders read only a smaller portion of a web page: the more links a page contained, the longer it was, so the benchmark of 100 was set to ensure spiders performed effectively. While search engines now read much more of a page before indexing it, having more than 100 links can still set off spam filters and result in a lower perceived page quality.
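If you want to check how many links a crawler actually sees on a template (navigation drop-downs included), a minimal sketch using Python's standard-library HTML parser; the sample HTML here is hypothetical:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a> tags that carry an href, i.e. crawlable links."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

# Hypothetical page fragment standing in for a real template.
html = '<ul><li><a href="/a">A</a></li><li><a href="/b">B</a></li></ul>'
parser = LinkCounter()
parser.feed(html)
print(parser.count)  # 2
```

Running this against your rendered page source gives the raw link count that guidelines like "keep it under 100" refer to.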
Related Questions
-
What to do with a site of >50,000 pages vs. crawl limit?
What happens if you have a site in your Moz Pro campaign that has more than 50,000 pages? Would it be better to choose a sub-folder of the site to get a thorough look at that sub-folder?

I have a few different large government websites that I'm tracking to see how they are faring in rankings and SEO. They are not my own websites. I want to see how these agencies are doing compared to what the public searches for on technical topics and social issues that the agencies manage. I'm an academic looking at science communication. I am in the process of re-setting up my campaigns to get better data than I have been getting. I am a newbie to SEO, and the campaigns I slapped together a few months ago need to be set up better: all on the same day, making sure I've set whether www or non-www ranks, refining my keywords, etc. I am stumped on what to do about the agency websites being really huge, and what the options are for getting good data in light of the 50,000-page crawl limit.

Here is an example of what I mean. To see how the EPA is doing in searches related to air quality, ideally I'd track all of the EPA's web presence. www.epa.gov has 560,000 pages. If I put www.epa.gov into a campaign, what happens, given the site has so many more pages than the 50,000-page crawl limit? What do I miss out on? Can I "trust" what I get?

www.epa.gov/air has only 1,450 pages, so if I choose this for what I track in a campaign, the crawl will cover that sub-folder completely and I'll get a complete picture of this air-focused sub-folder. But (1) I'll miss out on air-related pages in other sub-folders of www.epa.gov, and (2) it seems like I have so much of the 50,000-page crawl limit that I'm not using and could be using. (However, maybe that's not quite true: I'd also be tracking other sites as competitors, e.g. non-profits that advocate on air quality and industry air-quality sites. Do those competitors count toward the 50,000-page crawl limit and get me up to it? How do the competitors you choose figure into the crawl limit?)

Any opinions on which I should do in general in this kind of situation? The small sub-folder vs. the full humongous site, or is there some other way to go here that I'm not thinking of?
Moz Pro | scienceisrad0
-
Forums and social network pages - link analysis
Hi, I am working on a reputation management project, and I've reached the point where I find that forum pages and social pages are hard to analyze for links and PageRank. For example, if you search for these kinds of pages on http://www.opensiteexplorer.org/, it won't show any results, and if you check their PageRank in the Moz toolbar it shows a PageRank of 1. What do you do when you're in this situation? Do you treat these kinds of pages differently than normal pages? Thanks
Moz Pro | MohammadSabbagh0
-
Moz crawl only shows 2 pages, but we have more than 1000 pages.
Hi guys, is there any way we can test the Moz crawler? It is showing only 2 pages crawled. We are running the website on HTTPS. Is HTTPS an issue for Moz?
Moz Pro | dotlineseo0
-
Is www.domain.com/page the same url as www.domain.com/page/ for Google? (extra slash at end of url)
Dear all, in Open Site Explorer there is a difference between the URLs 'www.domain.com/page' and 'www.domain.com/page/' (extra slash at the end). They can show different values for Page Authority etc. in the Open Site Explorer tool, but is this also the case for Google? Thanks for replying. Regards, Ben
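Tools generally treat the two forms as distinct URLs unless the site redirects or canonicalizes one to the other. A minimal sketch of site-level normalization using Python's standard library (the example domain is hypothetical):

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_trailing_slash(url):
    """Map /page/ and /page to a single form by stripping the trailing slash.

    This is a site-level convention, not something search engines do for you:
    without a redirect or canonical tag, the two forms are separate URLs.
    """
    parts = urlsplit(url)
    path = parts.path.rstrip("/") or "/"  # keep the bare root path as "/"
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))

print(normalize_trailing_slash("http://www.domain.com/page/"))
# http://www.domain.com/page
```

In practice a server-side 301 redirect to your preferred form consolidates metrics for both variants.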
Moz Pro | HMK-NL0
-
Internal Links
I've searched high and low, but I could just be missing it... Is there a report on internal links? Maybe a count? Or better yet, a way to see internal anchor text?
Moz Pro | MCIMaui0
-
How to Stop SEOmoz from Crawling a Sub-domain without Redoing the Whole Campaign?
I am using SEOmoz for a client to track their website's performance and fix errors and issues. A few weeks ago, they created a sub-domain (sub.example.com) as a niche website for some of their specialized content. However, when SEOmoz re-crawled the main domain (example.com), it also reported errors for the subdomain. Is there any way to stop SEOmoz from crawling the subdomain and have it crawl only the main domain? I know that can be done by starting a new campaign, but is there any way to work around it for an existing campaign? I'm asking because we would like to avoid setting up the campaign again and losing the historical data. Any input would be greatly appreciated. Thanks!
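One approach that does not touch the campaign at all is to block Moz's crawler (its user-agent is rogerbot) with a robots.txt file served from the subdomain itself. A sketch, assuming sub.example.com serves its own robots.txt at its root:

```
# robots.txt at https://sub.example.com/robots.txt
# Block only Moz's crawler; other bots (e.g. Googlebot) are unaffected.
User-agent: rogerbot
Disallow: /
```

Note that this stops rogerbot from crawling the subdomain for every campaign, not just this one, so it is only appropriate if no campaign should cover sub.example.com.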
Moz Pro | TheNorthernOffice790
-
SEOmoz keyword rankings in campaign report
Hi. Does anybody know where the rankings summary has moved to in the campaign reports? I want to know how many keywords have moved up or down in the last week and can't find it anywhere!
Moz Pro | neooptic1
Company Name in Page Title creating thousands of "Duplicate Page Title" errors
I am new, and I just got back my crawl results (after a week or more). The first thing I noticed is that the "duplicate page title" count is in the thousands, yet my URLs and page titles are different. The only thing I can see is that our company name is appended to every title. I searched and found one other person with this problem, but no answer was given. Can anyone offer some advice? This doesn't seem right... Thanks,
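To see which titles actually collide once the company name is appended, a quick stdlib sketch; the URLs, titles, and "Acme Co" suffix below are hypothetical stand-ins for a real crawl export:

```python
from collections import Counter

# Hypothetical (url, title) pairs as a crawl report might list them.
pages = [
    ("/widgets", "Widgets | Acme Co"),
    ("/gadgets", "Gadgets | Acme Co"),
    ("/about", "About Us | Acme Co"),
    ("/widgets?page=2", "Widgets | Acme Co"),
]

# Count exact title strings; anything seen more than once is a true duplicate.
title_counts = Counter(title for _, title in pages)
duplicates = {title: n for title, n in title_counts.items() if n > 1}
print(duplicates)  # {'Widgets | Acme Co': 2}
```

If the duplicates turn out to be parameterized variants of the same page (as in the `?page=2` example), canonical tags rather than title changes are usually the fix.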
Moz Pro | AoyamaJPN0