Too many on-page links in sitemap.html
-
My crawl report is flagging one of my pages for having too many links: my sitemap.html. I have coded the page so that if the XML version is requested it generates an .xml version of the page, and otherwise the HTML version is displayed. What is the best way to stop the crawl from finding the HTML version whilst keeping it on the site for clients' navigation?
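For context, the dual-output behaviour described above can be sketched roughly like this. This is only an illustration: the `format=xml` query flag and the function name are assumptions, since the question doesn't say how the XML version is actually triggered.

```python
def render_sitemap(query_params, pages):
    """Return an XML sitemap when the (hypothetical) format=xml flag
    is present, otherwise a plain HTML link list for human visitors."""
    if query_params.get("format") == "xml":
        urls = "".join(f"<url><loc>{p}</loc></url>" for p in pages)
        return ('<?xml version="1.0" encoding="UTF-8"?>'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
                + urls + "</urlset>")
    items = "".join(f'<li><a href="{p}">{p}</a></li>' for p in pages)
    return "<ul>" + items + "</ul>"
```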
-
The thing to remember is that the HTML version should only ever be served to users, not used to redirect robots that hit a 404 on your .xml file. The reason is that search engines may still record the file as a 404 after the redirect, or, if you use a 301 redirect, they may conclude that the XML sitemap was there but has now been permanently replaced by the HTML page. Neither of which is a good thing.
I would advise ensuring the fallback never fires for robots/spiders. If the file simply returns a 404, search engines will come back and retry it; they may not if it is a 301 redirect.
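A minimal robots.txt sketch of that advice, assuming the HTML version lives at /sitemap.html and the site is example.com (adjust both to match your setup):

```text
User-agent: *
Disallow: /sitemap.html

Sitemap: https://example.com/sitemap.xml
```

The `Disallow` rule keeps crawlers away from the HTML page, while the `Sitemap:` line keeps the XML version explicitly discoverable.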
-
Thanks for the response,
That was my first thought too, but I wasn't 100% sure that hiding it in the robots.txt file alone would resolve the issue, and it is still early days.
Thanks again.
-
Hide it using a robots.txt file, though you could also use the noindex meta tag. That being said, search engines generally recognize sitemap pages and aren't too fussed by them; a sitemap is a good jumping-off point for them to find information.
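If you go the meta-tag route the answer mentions, this is what would sit in the `<head>` of the HTML sitemap page (a sketch, not your actual markup):

```html
<!-- noindex keeps the page out of the index; follow still lets
     crawlers use the page's links to discover the rest of the site -->
<meta name="robots" content="noindex, follow">
```

Note that for the meta tag to be seen, the page must not also be blocked in robots.txt, since a blocked page is never fetched.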
Related Questions
-
How do you increase the page footprint so that more links appear in the search results?
We want more of our website's links to appear in the SERP. How do we do that?
On-Page Optimization | WalterHalicki0
-
Which is better? One dynamically optimised page, or lots of optimised pages?
For the purpose of simplicity, we have 5 main categories on the site - let's call them A, B, C, D, E. Each of these categories has sub-category pages, e.g. A1, A2, A3. The main area of the site consists of these category and sub-category pages.

But as each product comes in different woods, it's useful for customers to see all the products that come in a particular wood, e.g. walnut. So many years ago we created 'woods' pages. These pages replicate the categories & sub-categories but only show what is available in that particular wood. And of course, they're optimised much better for that wood.

All well and good, until recently: these specialist pages seem to have dropped through the floor in Google. It could be temporary, I don't know, and it's only been a fortnight - but I'm worried.

Now, because the site is dynamic, we could do things differently. We could still have landing pages for each wood, but instead of spinning off to their own optimised wood-specific sub-category pages, they could link to the primary sub-category page with a ?search filter in the URL. This way, the customer still gets to see what they want.

Which is better? One page per sub-category, dynamically filtered by search? Or lots of specific sub-category pages?

I guess at the heart of this question is: does having lots of specific sub-category pages lead to a large overlap of duplicate content, and is it better to keep that authority juice on a single page, even if the URL changes (with a query in the URL) to enable whatever filtering we need to do?
On-Page Optimization | pulcinella2uk0
-
Can I use nofollow to limit the number of links on a page?
My website is an e-commerce site and we have about 470 links on the homepage!

1. We have a top bar with My Account, Login, FAQ, Home, Contact Us and a link to a content page.
2. Then we have the multistore selection.
3. Then we have the department menu, with several parent + child category links.
4. Then we have a banner.
5. Then we have a list of the recently sold and new products.
6. Then we have an image grid with the most important CMS/content pages (like FAQ, About Us, etc.).
7. Then we have the footer, with all the info pages, Contact Us, About Us, My Account, etc.

Some links are repeated 2 or 3 times. For a user it is easier to find the information, but I'm not sure how search bots (Google) deal with that. So I was thinking about how I can get down to around 150 followed links. Removing the links from the page is not possible. What about adding nofollow to the repeated links and to some child categories, since the spider will crawl the parent and reach the children on the next page? Is this a good strategy?
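For reference, the markup the question is proposing would look like this for one of the repeated footer links (URL and anchor text are placeholders):

```html
<!-- Hypothetical repeated footer link; rel="nofollow" asks search
     engines not to pass link signals through this instance -->
<a href="/faq" rel="nofollow">FAQ</a>
```

Worth noting: Google has long advised against using nofollow for internal "PageRank sculpting", so this is a sketch of the asker's idea rather than a recommendation.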
On-Page Optimization | qgairsoft0
-
Listing all services on one page vs separate pages per service
My company offers several generalized categories with more specific services underneath each category. Currently, the way it's structured, if you click "Voice" you get a full description of each voice service we offer. I have a feeling this is shooting us in the foot. Would it be better to have a general overview of the services on the "Voice" page that then links to each specific service? The blurb about each service on the overview page would be unique, not taken from that service's own page.
On-Page Optimization | AMATechTel0
-
"Issue: Duplicate Page Content " in Crawl Diagnostics - but these pages are noindex
Saw an issue back in 2011 about this and I'm experiencing the same issue. http://moz.com/community/q/issue-duplicate-page-content-in-crawl-diagnostics-but-these-pages-are-noindex We have pages that are meta-tagged as no-everything for bots but are being reported as duplicate. Any suggestions on how to exclude them from the Moz bot?
On-Page Optimization | Deb_VHB0
-
Too many links??
Hi mozzers, I have a question I need some feedback on, please. I run two websites, both e-commerce retail sites and both doing well with SEO. However, our strongest site, the parent site so to speak, links to our sister site. Here is the outline. Site 1, the parent site, is hosted on a unique URL and on a VPS; it has 384 links coming in from the sister site, from various pages. Site 2, the sister site, has 68,864 links coming in from Site 1, as we have a link in the footer of Site 1 to the home page of Site 2. So far we have had no adverse effects from the Google zoo releases, but I am concerned that this many links will soon be penalized. Thoughts from anyone, please? I am considering removing the footer link, thus removing 68,000+ incoming links. Looking for any advice here, please. Thanks, Ryan
On-Page Optimization | RyanC10
-
Duplicate page
Just getting started and I had a question regarding one of the reports. It is telling me that I have duplicate pages, but I'm not sure how to resolve that.
On-Page Optimization | KeylimeSocial0
-
Too Many On-Page Links
Hi All, I'm new to SEOmoz, so thanks in advance for any answers! Looking at our Crawl Diagnostics, "Too Many On-Page Links" is first on the list. The site was built with the intention of users being able to quickly get where they want to go via drop-down menus (sub-nav), so we built the navigation using bullet points/CSS. Yes, agreed, there are too many links on each page from our navigation: the main nav has 4 categories, and the sub-nav about 40. But what is the best way to resolve the problem other than removing most of the links (from the sub-nav drop-down)? Could we just use the attribute rel=nofollow for the sub-nav links? TIA
On-Page Optimization | bmmedia0