Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Sitemap Page - HTML and XML
-
Hi there
I have a domain which has a sitemap in HTML for regular users and a sitemap in XML for the spiders.
I have a warning via SEOmoz saying that I have too many links on the HTML version.
What do I do here?
regards
Stef
-
Sorry for the late reply, guys. Great advice from both of you.
@Alan, great demonstration of how PageRank flows. A great illustration, which I could never have explained to clients myself.
-
220 links on a page is absolutely not too many on any level. Many of the highest-ranked sites on the internet present more than 220 links.
The particular page in question is simply a sitemap, and the page is being offered to help users navigate the site. The VerizonWireless.com sitemap I shared has 370+ links on it.
The SEOmoz "warning" is a simple feature which will be set off on any page with 100+ links. The SEOmoz tool does not care how well those links are presented, whether they are footer links, whether they are on a content page, what the PA of the page is, or any other SEO factor. It is simply a >100-or-not warning. As such, it offers very little value.
I am in the process of compiling a list of suggested features which will help improve the tool's usefulness. One of the features I am proposing is to allow users to adjust the 100 count to any number they want. Each SEO can then choose to use the default of 100, or use a number more suited to the particular site.
The link Alan shared is a nice explanation of PR flow. It is a nice page for learning PR, but with respect to this topic it over-complicates an otherwise very simple and straightforward question. The simple point is: the more links on a page, the less link juice flows to each link.
The goals for any web page links should be as follows:
1. Ensure all links are useful for your site. For example, you probably want PR flowing to your most profitable product/service, and to your latest additions.
2. Ensure your links are actually used. Check analytics.
3. If a link is not used or not useful, remove it.
4. Along the lines above, your links should be presented in a very user-friendly manner. You don't want a page to look like a list of nothing but links as users will have a difficult time choosing what they want. An exception would be a sitemap.
With the above in mind, keep as many links as you see fit on the page. If it is 40, that is fine. If there are 250 links on the page, that is fine as well. When you start down a path of chasing numbers such as forcing your content into "500 words" or forcing your links into "100 maximum" you fall into a pit of SEO fallacies. You are not providing the best experience for your users nor SEO.
TL;DR - Provide your links in a manner which is visually appealing, non-spammy and helpful to users. Keep in mind your need to flow PR to important pages such as your money pages. Otherwise remove unnecessary links. Whatever that number of links is, so be it. Don't try to fit your links into a "I must be under 100" or any other number mindset.
-
Too many according to Google. Make of it what you will; it does not look like it is for any technical reason anymore, but obviously there is a limit to how much of a page they will crawl.
http://www.mattcutts.com/blog/how-many-links-per-page/
You see how PageRank flows; having a lot of links on your home page works to your advantage. Using numbers from Google's original algorithm:
Assuming every page starts with 1 PR, a page passes 85% of its link juice. So if you have 100 links, that's 0.0085 each to 100 internal pages, making them 1.0085 each. Now they all pass back 85%, that's 0.857225 each, × 100 = 85.7225 back to your home page. Then we do the sums all over again, and again, until the numbers stay static. This calculation relies on the internal pages having no other links, so you are unlikely to get figures as good as this, but you get the idea.
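The iteration described above can be sketched in a few lines of Python. This is a toy model, assuming the hub-and-spoke topology in the example (one home page linking out to 100 internal pages, each of which links only back home) and the classic simplified PageRank formula with the original damping factor of 0.85:

```python
# Toy model of the PR flow described above: one home page linking out to
# n internal pages, each of which links only back to the home page.
# Classic simplified PageRank: PR(p) = (1 - d) + d * sum(PR(q) / outlinks(q)),
# with the original damping factor d = 0.85.

def hub_and_spoke_pagerank(n_internal=100, d=0.85, iterations=100):
    home = 1.0      # every page starts with 1 PR
    internal = 1.0  # internal pages are symmetric, so track a single value
    for _ in range(iterations):
        # Home receives 85% of each internal page's PR (home is their only link).
        new_home = (1 - d) + d * n_internal * internal
        # Each internal page receives 1/n of 85% of the home page's PR.
        new_internal = (1 - d) + d * home / n_internal
        home, internal = new_home, new_internal
    return home, internal

home, internal = hub_and_spoke_pagerank()
print(f"home PR ~ {home:.2f}, each internal page ~ {internal:.4f}")
print(f"total PR ~ {home + 100 * internal:.1f}")  # stays at 101, one unit per page
```

Under this toy model the home page settles around 46.5 PR and each internal page around 0.55, so total PR is conserved at 101 (one unit per page). As noted above, real sites with extra links per page will do worse than this idealized case.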
See the link below for a better explanation, and check out the calculator:
http://www.webworkshop.net/pagerank.html
Remember, don't mess up your linking structure for users just for the sake of PageRank. I see it as being like a golf swing after a lesson: if you try to apply what you just learnt too much, you will get all stiff and unnatural. It's better to swing naturally with what you have learnt in the back of your head.
-
Yes, ignore the warning.
It is possible to present 220 links in a neat, categorized manner. It is also possible to present 100 links as a jumble which is not user friendly.
You mentioned your presentation is similar to the example I shared, which means it is user-friendly, so ignoring the warning is fine.
-
Nice, I really like that example you gave. Mine is similar and categorized too. The question still remains: do I ignore this warning for this page?
-
I have about 220 links
-
Well, how many do you have?
A quick way of checking is with IE: press F12, go to the View menu, then Link Report.
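If you'd rather script the check than use a browser, here is a minimal sketch using only Python's standard-library HTML parser. Feed it the fetched HTML of your sitemap page (fetching is left out for brevity; the sample fragment below is just for illustration):

```python
# Quick scripted alternative to the IE link report: count the <a href> links
# in a page's HTML using only Python's standard library.
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts anchor tags that actually carry an href attribute."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html):
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# A tiny sitemap-style fragment for illustration:
sample = '<ul><li><a href="/about">About</a></li><li><a href="/contact">Contact</a></li></ul>'
print(count_links(sample), "links found")  # prints: 2 links found
```

Anchors without an `href` (e.g. old-style named anchors) are skipped, so the count matches what a crawler would actually follow.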
-
Your HTML sitemap is for users. It should present your links in such a manner as to be useful for users who are looking for a page on your site.
An example sitemap for a large site: http://www.verizonwireless.com/b2c/sitemap.jsp
It does not contain a link to every last page. It is more of a helpful directory. I would suggest you adjust your HTML sitemap in a similar manner. Treat it as a page of links for users.
-
So do you think I should ignore this warning for the HTML sitemap page?
-
Well, have a look at whether you can move a few out. It is good to link to as many pages as you can from the home page for the sake of PR flow, but not to go over the limit. Some say the limit is 100, some say 150.