Is a sitemap required in my robots.txt?
-
Hi,
I know that linking your sitemap from your robots.txt file is good practice. OK, but... may I just submit my sitemap to Search Console and forget about adding it to my robots.txt?
That's my situation:
- 1 multilingual platform, which means...
- ... 2 sets of pages, one for each language, of course
- But my CMS (Magento) only allows me to have 1 robots.txt file
So, again: may I have a robots.txt file with no sitemap AND not suffer any potential SEO loss?
Thanks in advance,
Juan Vicente Mañanas Abad
-
Hi Juan,
You should also know that you can have multiple Sitemap directives in one robots.txt file. This is common among international sites and large e-commerce sites.
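A minimal sketch of what that could look like in your case, assuming a hypothetical example.com domain and one sitemap file per language (the file names are purely illustrative):

User-agent: *
# ... your existing crawl directives stay here ...

# Sitemap URLs must be absolute, and one robots.txt can list several of them
Sitemap: https://www.example.com/sitemap_en.xml
Sitemap: https://www.example.com/sitemap_es.xml

Submitting the sitemaps in Search Console and referencing them in robots.txt aren't mutually exclusive: the robots.txt lines simply let any crawler, not just Google, discover them.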
-
Including a sitemap reference in robots.txt is good practice, but it isn't required. You can submit the sitemap in Search Console (or Bing Webmaster Tools) and bots will still crawl and index the site. Bots can even index a site without any sitemap at all; the sitemap just helps them discover your URLs.
So don't worry. You can edit your robots.txt at any time and add the sitemap later if it becomes possible.
Related Questions
-
Disallow wildcard match in Robots.txt
This is in my robots.txt file. Does anyone know what it is supposed to accomplish? It doesn't appear to be blocking URLs with question marks: Disallow: /?crawler=1 Disallow: /?mobile=1 Thank you
Technical SEO | AmandaBridge
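A quick note on why those rules don't behave as expected: robots.txt Disallow values are matched as prefixes of the URL path, so Disallow: /?crawler=1 only blocks URLs that literally begin with /?crawler=1. A minimal sketch, assuming Google-style wildcard support (an extension, not part of the original robots.txt standard), of rules that would match those parameters anywhere in a URL:

User-agent: *
# Hypothetical examples: block any URL containing crawler=1 or mobile=1
Disallow: /*crawler=1
Disallow: /*mobile=1
# Or block every URL that contains a question mark at all
Disallow: /*?
-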
Should I Focus on Video Schema or a Video Sitemap First
Hey all, I'm working on a website that is soon going to launch a video hub that contains over 500 videos. I'm interested in ensuring that the videos show up on the SERPs page in the highest position possible. I know Google recommends that you have on-page schema for your videos as well as an XML sitemap so they can be indexed for SERP. When I look at schema and the XML video sitemap they seem to communicate very similar kinds of information (Title, Description, Thumbnail, Duration). I'm not sure which one to start with; is it more important to have video schema or a video sitemap? Additionally, if anyone knows of any good video sitemap generators (free is best, but cheap is okay too) then please let me know. Cursory google searching has just churned up a number of tools that look sketchy.
Technical SEO | perfectsearch71
-
Adding multi-language sitemaps to robots.txt
I am working on a revamped multi-language site that has moved to Magento. Each language runs off the core coding so there are no sub-directories per language. The developer has created sitemaps which have been uploaded to their respective GWT accounts. They have placed the sitemaps in new directories such as: /sitemap/uk/sitemap.xml /sitemap/de/sitemap.xml I want to add the sitemaps to the robots.txt but can't figure out how to do it. Also should they have placed the sitemaps in a single location with the file identifying each language: /sitemap/uk-sitemap.xml /sitemap/de-sitemap.xml What is the cleanest way of handling these sitemaps and can/should I get them on robots.txt?
Technical SEO | MickEdwards
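One possible clean layout for the situation above, sketched with a hypothetical example.com domain: leave the per-language sitemaps where the developer generated them, reference both from a sitemap index file, and point a single Sitemap line in robots.txt at that index. The index format below is the standard sitemaps.org one, nothing Magento-specific:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one child sitemap per language, using the paths mentioned in the question -->
  <sitemap>
    <loc>https://www.example.com/sitemap/uk/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap/de/sitemap.xml</loc>
  </sitemap>
</sitemapindex>

robots.txt would then need only one line, for example Sitemap: https://www.example.com/sitemap_index.xml (the index file name is just an illustration). Listing the two language sitemaps as two separate Sitemap lines works equally well.
-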
Should I include tags in sitemap?
Hello All, I was wondering if you should include tags and categories in your sitemap. In the past, on previous blogs, I have always left tags and categories out. The reason for this is that a good friend of mine, who has been doing in-house SEO for a long time, always told me that this would result in duplicate content. I thought it would be a great idea to get some input from the SEOmoz community, as this obviously has a big effect on your blog and the number of pages indexed. Any help would be great. Thanks, Luke Hutchinson.
Technical SEO | LukeHutchinson
-
I accidentally blocked Google with Robots.txt. What next?
Last week I uploaded my site and forgot to remove the robots.txt file with this text: User-agent: * Disallow: / I dropped from page 11 on my main keywords to past page 50. I caught it 2-3 days later and have now fixed it. I re-imported my site map with Webmaster Tools and I also did a Fetch as Google through Webmaster Tools. I tweeted out my URL to hopefully get Google to crawl it faster too. Webmaster Tools no longer says that the site is experiencing outages, but when I look at my blocked URLs it still says 249 are blocked. That's actually gone up since I made the fix. In the Google search results, it still no longer has my page title and the description still says "A description for this result is not available because of this site's robots.txt – learn more." How will this affect me long-term? When will I recover my rankings? Is there anything else I can do? Thanks for your input! www.decalsforthewall.com
Technical SEO | Webmaster123
-
OK to block /js/ folder using robots.txt?
I know Matt Cutts suggests we allow bots to crawl CSS and JavaScript folders (http://www.youtube.com/watch?v=PNEipHjsEPU). But what if you have lots and lots of JS and you don't want to waste precious crawl resources? Also, as we update and improve the JavaScript on our site, we iterate the version number ?v=1.1... 1.2... 1.3... etc. And the legacy versions show up in Google Webmaster Tools as 404s. For example: http://www.discoverafrica.com/js/global_functions.js?v=1.1
http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
http://www.discoverafrica.com/js/global.js?v=1.2
http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
http://www.discoverafrica.com/js/json2.js?v=1.1
Wouldn't it just be easier to prevent Googlebot from crawling the js folder altogether? Isn't that what robots.txt was made for? Just to be clear - we are NOT doing any sneaky redirects or other dodgy javascript hacks. We're just trying to power our content and UX elegantly with javascript. What do you guys say: Obey Matt? Or run the javascript gauntlet?
Technical SEO | AndreVanKets
-
How do I create a Video Sitemap for Youtube Embedded Videos?
I've been seeing a lot of people recommend creating a video sitemap or Media RSS feed (mRSS) and submit to Google. We have videos hosted on Brightcove and most on YouTube. Brightcove can generate the sitemap for us. But does anyone know how to generate a YouTube Video Sitemap for those videos embedded on our pages? Note: I realize I could manually assemble the video sitemap, however manually assembling the sitemap is probably not an option for us due to the volume of videos we've published.
Technical SEO | LDS-SEO
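For reference, a video sitemap entry for a YouTube-embedded video can point video:player_loc at the embed URL; thumbnail_loc, title and description are the other required fields. A minimal sketch with placeholder example.com and VIDEO_ID values:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <!-- the page on your own site where the video is embedded -->
    <loc>https://www.example.com/videos/sample-video-page</loc>
    <video:video>
      <video:thumbnail_loc>https://i.ytimg.com/vi/VIDEO_ID/hqdefault.jpg</video:thumbnail_loc>
      <video:title>Sample video title</video:title>
      <video:description>Short description of what the video covers.</video:description>
      <!-- for YouTube-hosted videos, player_loc can reference the embed URL -->
      <video:player_loc>https://www.youtube.com/embed/VIDEO_ID</video:player_loc>
      <video:duration>120</video:duration>
    </video:video>
  </url>
</urlset>

Each embedded video gets its own url entry, so the file can be generated from whatever list of video IDs and page URLs you already maintain.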