Should I add my HTML sitemap to robots.txt?
-
I have already added the XML sitemap to robots.txt. But should I also add the HTML version?
-
As the answer says: no. Great question, though.
-
No, it won't help you at all: the Sitemap directive in robots.txt expects a sitemap file (XML, RSS/Atom, or a plain-text URL list), not an HTML page, so search engines won't use an HTML sitemap listed there. What you can do instead is link to the HTML sitemap from multiple pages on your site, which gives Google an efficient way to reach it and use it to crawl the other pages on your site.
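For reference, the only sitemap reference robots.txt supports is the Sitemap directive pointing at a sitemap file. A minimal sketch, with example.com standing in for the real domain:

```
User-agent: *
Disallow:

# XML sitemaps belong here; an HTML sitemap page does not.
Sitemap: https://www.example.com/sitemap.xml
```

The directive can appear anywhere in the file and can be repeated once per sitemap or sitemap-index file.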
Related Questions
-
Html extensions
I have remodeled an old HTML site using WordPress. I see some instructions in WordPress that say I can add an .html extension to some of the pages, but it looks pretty complicated. Is there any benefit in going through that hassle, or should I just ask my web guy to rewrite via .htaccess so that https://sacramentotop10.com/Weddings/Dresses.html becomes https://sacramentotop10.com/weddings/dresses?
Technical SEO | julie-getonthemap
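For the second option, the redirect itself is a few lines of mod_rewrite; the catch is the lowercasing, which plain .htaccess cannot do on its own — the `tolower` RewriteMap used below must be declared in the main server or virtual-host configuration. A sketch under those assumptions (the URL is the one from the question):

```apache
# In the server or vhost config (RewriteMap is not valid in .htaccess):
#   RewriteMap lc int:tolower

# In .htaccess:
RewriteEngine On

# 301 /Weddings/Dresses.html -> /weddings/dresses
RewriteCond %{THE_REQUEST} \s/([^.?\s]+)\.html[\s?] [NC]
RewriteRule ^ /${lc:%1} [R=301,L]
```

Matching on %{THE_REQUEST} (the original request line) rather than the rewritten URL avoids redirect loops once the extensionless URL is served internally from the .html file.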
Do I have to create a separate sitemap for my multilingual site?
Hi, I was wondering how I should implement a sitemap for my multilingual site. Currently we have two languages separated by subdirectories, /en (English) and /fr (French), but the articles I have read give no clear explanation of how to implement a sitemap for different languages. Here are the cases I think are possible:
Case 1: One sitemap with all the en and fr pages together, with hreflang attribution for each page.
Case 2: One sitemap with only the en pages, with hreflang attribution for both languages (en and fr).
Case 3: Separate sitemaps for the en and fr pages, with hreflang attribution for both languages, connected through a sitemap index.
If none of my proposed cases is possible, please let me know the best approach to creating a multilingual sitemap for my site. I'd appreciate your thoughts on this. Thank you!
Technical SEO | ReneAnton
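For what it's worth, Google's documented sitemap-hreflang markup corresponds to Case 1: each `<url>` entry lists every language alternate of that page, including itself, via `xhtml:link`. A minimal sketch with a hypothetical example.com domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://example.com/en/page/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/page/"/>
    <xhtml:link rel="alternate" hreflang="fr" href="https://example.com/fr/page/"/>
  </url>
  <url>
    <loc>https://example.com/fr/page/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://example.com/en/page/"/>
    <xhtml:link rel="alternate" hreflang="fr" href="https://example.com/fr/page/"/>
  </url>
</urlset>
```

Case 3 (one sitemap per language joined by a sitemap index) also works, as long as every entry still carries the full set of alternates.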
Add trailing slash after removing .html extension
My website is non-www; it has WordPress in a subdirectory and some static web pages in the root and another subdirectory. I want to:
1. Remove the .html extension from the web pages in the root and the other static web pages in the subdirectory.
2. Add a slash at the end.
3. 301 redirect from the non-slash URL to the URL with a slash.
So http://ghadaalsaman.com/articles.html should become http://ghadaalsaman.com/articles/ and http://ghadaalsaman.com/en/poem-list.html should become http://ghadaalsaman.com/en/poem-list/.
The code below 1. works with no slash at the end, and 2. 301 redirects the URL with a slash to the non-slash version. Here's my .htaccess:

```apache
<IfModule mod_rewrite.c>
Options +FollowSymLinks -MultiViews
RewriteEngine On
RewriteBase /

# removing trailing slash
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ $1 [R=301,L]

# www to non
RewriteCond %{HTTP_HOST} ^www.(([a-z0-9_]+.)?domain.com)$ [NC]
RewriteRule .? http://%1%{REQUEST_URI} [R=301,L]

# html
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([^.]+)$ $1.html [NC,L]

# index redirect
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index.html\ HTTP/
RewriteRule ^index.html$ http://ghadaalsaman.com/ [R=301,L]
RewriteCond %{THE_REQUEST} .html
RewriteRule ^(.*).html$ /$1 [R=301,L]
</IfModule>
```

PS: everything is OK with the WordPress part; the problems are with the static pages only. Thanks in advance.
Technical SEO | Uber_
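A rule set that does what the question asks (redirect /page.html to /page/ with a single 301, and serve the .html file internally from the slashed URL) can be sketched as below. This is an untested outline rather than a drop-in fix: it assumes the static pages live where the question says, and WordPress's own .htaccess in its subdirectory continues to take precedence there:

```apache
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /

# 301 /articles.html -> /articles/ (also works in subdirectories)
RewriteCond %{THE_REQUEST} \s/([^.?\s]+)\.html[\s?] [NC]
RewriteRule ^ /%1/ [R=301,L]

# 301 /articles -> /articles/ for extensionless requests
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} !/$
RewriteRule ^(.*)$ /$1/ [R=301,L]

# internally serve /articles/ from articles.html (no redirect)
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{DOCUMENT_ROOT}/$1.html -f
RewriteRule ^(.+)/$ $1.html [L]
</IfModule>
```

Because the redirect conditions test %{THE_REQUEST} (the browser's original request line) instead of the current rewritten URL, the internal rewrite back to the .html file does not re-trigger the 301.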
Add selective URLs to an XML Sitemap
Hi! Our website has a very large number of pages. I am looking to create an XML sitemap that contains only the most important pages (category pages etc.). However, when crawling the website in a tool like Xenu (the others have a 500-page limit), I am unable to control which pages get added to the XML sitemap and which ones get excluded. Essentially, I only want pages that are up to 4 clicks away from my homepage to show up in the XML sitemap. How should I create an XML sitemap while controlling which pages of my site I add to it (category pages) and which ones I leave out (product pages etc.)? Thanks in advance! Apurv
Technical SEO | AB_Newbie
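One way around the tool limitation is to generate the sitemap yourself from crawl data, keeping only pages within four clicks of the homepage. A rough sketch in Python — `link_graph` is a stand-in for whatever your crawler produces (a mapping from each URL to the URLs it links to), not a real crawl:

```python
from collections import deque
from xml.sax.saxutils import escape

def sitemap_for_depth(link_graph, home, max_depth=4):
    """Collect pages within max_depth clicks of `home` (breadth-first
    search over an in-memory link graph) and render a minimal sitemap."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        if depth[page] == max_depth:
            continue  # don't follow links out of pages at the cutoff
        for linked in link_graph.get(page, []):
            if linked not in depth:
                depth[linked] = depth[page] + 1
                queue.append(linked)
    urls = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in sorted(depth)
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{urls}\n</urlset>"
    )
```

A category/product filter could be added the same way, by testing each URL against a path pattern before it is written out.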
XML Sitemap Issue or not?
Hi everyone, I submitted a sitemap in Google Webmaster Tools and got a warning listing 38 issues. Issue: URL blocked by robots.txt. Description: Sitemap contains URLs which are blocked by robots.txt. The examples given were URLs that we don't want indexed: Sitemap: www.example.org/author.xml — Value: http://www.example.org/author/admin/ My concern is that the number of URLs indexed is pretty low, and I know that robots.txt rules cause real problems when they block URLs that need to be indexed. The blocked URLs appear to be ones we don't want indexed, but the report doesn't display all the URLs that are blocked. Do you think I have a major problem, or is everything fine? What should I do, and how can I fix it? FYI: we use WordPress for our website. Thanks
Technical SEO | Tay1986
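One quick way to audit this yourself is to test every sitemap URL against the robots.txt rules before submitting. A sketch using Python's standard-library parser — the rules and URLs below are invented to mirror the example in the question:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring the situation described: author pages
# are disallowed but still listed in the sitemap.
robots_lines = [
    "User-agent: *",
    "Disallow: /author/",
]
sitemap_urls = [
    "http://www.example.org/author/admin/",
    "http://www.example.org/blog/post-1/",
]

parser = RobotFileParser()
parser.parse(robots_lines)

# Any URL reported as not fetchable should be dropped from the sitemap.
blocked = [u for u in sitemap_urls if not parser.can_fetch("*", u)]
print(blocked)  # only the /author/admin/ URL is flagged
```

If every flagged URL is one you genuinely don't want indexed, the warning is cosmetic; the clean fix is simply to stop listing those URLs in the sitemap.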
Half Implemented HTML 5 Structure
Hi there, I have just noticed a website with a half-implemented HTML5 structure. Well, when I say half implemented: it has the doctype and then one <header> section, but after that all of the divs are custom ones added for the CSS. Could this lack of structure have a negative effect on the site? Cheers, Edward
Technical SEO | edwardlewis
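For comparison, a fully sectioned HTML5 page uses the semantic elements throughout rather than a single <header> followed by generic divs. A minimal sketch of what "fully implemented" looks like (names and content are placeholders):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Example page</title>
</head>
<body>
  <header>
    <nav><!-- site navigation --></nav>
  </header>
  <main>
    <article>
      <h1>Page heading</h1>
      <section><!-- article content --></section>
    </article>
    <aside><!-- related links --></aside>
  </main>
  <footer><!-- site footer --></footer>
</body>
</html>
```

Styling hooks can stay on these elements via classes, so the semantic structure and the CSS divs are not mutually exclusive.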
Restaurant menu SEO: PDF or HTML?
Is it better to use a PDF or to hard-code restaurant menus (or any document, for that matter) in HTML? I want the content to be indexed, and I thought PDF was the way to go for several reasons, but I wanted to get confirmation on this before I move forward.
Technical SEO | BostonWright
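If the HTML route is chosen, the menu content is ordinary crawlable markup on the page itself rather than a separate binary file. A minimal sketch (dish names and prices are invented):

```html
<section id="menu">
  <h2>Dinner Menu</h2>
  <ul>
    <li>Grilled salmon &ndash; $24</li>
    <li>Margherita pizza &ndash; $16</li>
  </ul>
</section>
```

An HTML menu also inherits the page's internal links, headings, and mobile rendering, none of which a linked PDF gets for free.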
Robots.txt
Hi there, my question relates to the robots.txt file. This statement: /*/trackback — would this block domain.com/trackback and domain.com/fred/trackback? Peter
Technical SEO | PeterM22
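The major engines document '*' in robots.txt rules as matching any sequence of characters, so /*/trackback requires a leading '/', then anything (possibly empty), then '/trackback' — which matches /fred/trackback but not /trackback itself. A small sketch of that documented matching behavior (simplified: it ignores '$' end-anchors and other details of the full spec):

```python
import re

def google_style_match(pattern, path):
    """Match a robots.txt rule the way the engines document it:
    '*' matches any sequence of characters (including none), and the
    rule matches any path that begins with the expanded pattern."""
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.match(regex, path) is not None

rule = "/*/trackback"
print(google_style_match(rule, "/fred/trackback"))  # True: blocked
print(google_style_match(rule, "/trackback"))       # False: not blocked
```

So as written, the rule would block the trackback URLs under any subdirectory but leave the root-level /trackback crawlable; adding a separate Disallow: /trackback line would cover both.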