Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies. More details here.
Sitemap Page - HTML and XML
-
Hi there
I have a domain which has a sitemap in HTML for regular users and a sitemap in XML for the spiders.
I have a warning via SEOmoz saying that I have too many links on the HTML version.
What do I do here?
regards
Stef
-
Sorry for the late reply, guys. Great advice from both of you.
@Alan, great display of how PageRank flows. A great illustration which I could never have explained to clients myself.
-
220 links on a page is absolutely not too many on any level. Many of the highest-ranked sites on the internet present more than 220 links.
The particular page in question is simply a sitemap, and the page is being offered to help users navigate the site. The VerizonWireless.com sitemap I shared has 370+ links on it.
The SEOmoz "warning" is a simple feature which will be triggered on any web page with 100+ links. The SEOmoz tool does not care how well those links are presented, whether they are footer links, whether they are on a content page, what the PA of the page is, or any other SEO factor. It is simply a ">100 or not" warning. As such, it offers very little value.
I am in the process of compiling a list of suggested features which will help improve the tool's usefulness. One of the feature recommendations I am proposing is to allow users to adjust the 100 count to any number they want. Each SEO can then choose to use the default of 100, or a number more suited to the particular site.
The link Alan shared is a nice explanation of PR flow. It is a nice page for learning PR, but with respect to this topic it over-complicates an otherwise very simple and straightforward question. The simple point is: the more links on a page, the less link juice flows to each link.
The goals for any web page links should be as follows:
1. Ensure all links are useful for your site. For example, you probably want PR flowing to your most profitable product/service, and to your latest additions.
2. Ensure your links are actually used. Check analytics.
3. If a link is not used or not useful, remove it.
4. Along the lines above, your links should be presented in a very user-friendly manner. You don't want a page to look like a list of nothing but links as users will have a difficult time choosing what they want. An exception would be a sitemap.
With the above in mind, keep as many links as you see fit on the page. If it is 40, that is fine. If there are 250 links on the page, that is fine as well. When you start down a path of chasing numbers such as forcing your content into "500 words" or forcing your links into "100 maximum" you fall into a pit of SEO fallacies. You are not providing the best experience for your users nor SEO.
TL;DR - Provide your links in a manner which is visually appealing, non-spammy and helpful to users. Keep in mind your need to flow PR to important pages such as your money pages. Otherwise remove unnecessary links. Whatever that number of links is, so be it. Don't try to fit your links into a "I must be under 100" or any other number mindset.
-
Too many according to Google. Make of it what you will; it does not look like it is for any technical reason anymore, but obviously there is a limit to how much of a page they will crawl.
http://www.mattcutts.com/blog/how-many-links-per-page/
You see how PageRank flows; having a lot of links on your home page works to your advantage. Using numbers from Google's original algorithm:
Assuming every page starts with a PR of 1, a page passes 85% of its link juice. So if you have 100 links, that's 0.0085 each, passed to 100 internal pages, making them 1.0085 each. Now they all pass back 85%, that's 0.857225 each, x 100 = 85.7225 back to your home page. Now we do the sums all over again and again until the numbers stay static. This calculation relies on the internal pages having no other links, so you are unlikely to get figures as good as this, but you get the idea.
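Alan's arithmetic above can be checked with a few lines of Python. This is a hypothetical sketch, not from the thread: it iterates the standard simplified PageRank formula for the same idealized site (a home page linking to 100 internal pages, each linking only back home). Note it uses the full formula with the (1 - d) term, which the rough numbers in the post omit, so the figures differ slightly:

```python
# Back-of-the-envelope PageRank simulation for a hub-and-spoke layout:
# one home page linking to n internal pages, each internal page linking
# only back to the home page. Iterates the simplified formula
#   PR(p) = (1 - d) + d * sum(PR(q) / links(q))
# with the classic damping factor d = 0.85.

def simulate(n=100, d=0.85, iterations=200):
    home = 1.0
    internal = 1.0  # all internal pages are symmetric, so track one value
    for _ in range(iterations):
        # each of the n internal pages passes d * its PR via its single link
        new_home = (1 - d) + d * n * internal
        # the home page's juice is split evenly across its n links
        new_internal = (1 - d) + d * home / n
        home, internal = new_home, new_internal
    return home, internal

home, internal = simulate()
print(f"home: {home:.2f}, each internal: {internal:.2f}")
```

At the fixed point the total PR across all 101 pages stays at 101, but the home page accumulates the lion's share of it, which is the point Alan is making: linking out to many internal pages that link back concentrates PR on the hub.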
See this link for a better explanation:
http://www.webworkshop.net/pagerank.html (check out the calculator)
Remember, don't stuff up your linking structure for the users just for the sake of PageRank. I see it as like a golf swing after a lesson: if you try to do what you just learnt too much, you will get all stiff and unnatural. It's better to swing naturally with what you have learnt in the back of your head.
-
Yes, ignore the warning.
It is possible to present 220 links in a neat, categorized manner. It is also possible to present 100 links as a jumble which is not user friendly.
You shared that your presentation is similar to the example I gave, which means it is user friendly, so ignoring the warning is fine.
-
Nice, I really like that example you gave. Mine is similar and categorized too. The question still remains: do I ignore this warning for this page?
-
I have about 220 links
-
Well, how many do you have?
A quick way of checking is with IE: press F12, go to the View menu, then Link Report.
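If you don't have IE handy, a quick link count can also be scripted. A minimal sketch using Python's standard-library HTML parser (fetch the page's HTML however you like and feed it in); it counts `<a>` tags that carry an `href` attribute:

```python
# Count the links on a page by parsing its HTML and tallying
# <a> tags that have an href attribute (bare anchors don't count).
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html: str) -> int:
    counter = LinkCounter()
    counter.feed(html)
    return counter.count

sample = ('<ul><li><a href="/a">A</a></li>'
          '<li><a href="/b">B</a></li>'
          '<li><a name="x">anchor only</a></li></ul>')
print(count_links(sample))  # -> 2
```

Run it against your sitemap page's HTML and you'll get the same figure the SEOmoz warning is reacting to.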
-
Your HTML sitemap is for users. It should present your links in such a manner as to be useful for users who are looking for a page on your site.
An example sitemap for a large site: http://www.verizonwireless.com/b2c/sitemap.jsp
It does not contain a link to every last page. It is more of a helpful directory. I would suggest you adjust your HTML sitemap in a similar manner. Treat it as a page of links for users.
-
So do you think I should ignore this warning for the HTML sitemap page?
-
Well, have a look at whether you can move a few out. It is good to link to as many pages as you can from the home page for the sake of PR flow, but not to go over the limit. Some say the limit is 100, some say 150.