How to check a robots.txt file
-
I want to know how we can check that a website's robots.txt file is working properly. Could you please explain how to test it?
-
My site also has this problem. Please help.
-
Hi,
After you've read the article Nozzle recommended, use the https://technicalseo.com/tools/robots-txt/ tool to test whatever you put together.
-
Here is a great resource to answer your question: https://moz.com/learn/seo/robotstxt.
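Beyond those resources, you can also test rules programmatically. Here is a minimal sketch using Python's built-in urllib.robotparser; the domain and paths are placeholders, so swap in your own site and the URLs you want to verify:

```python
from urllib.robotparser import RobotFileParser

# Load the live robots.txt (example.com is a placeholder domain).
rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

# Ask whether a given crawler may fetch a given URL.
checks = [
    ("Googlebot", "https://www.example.com/private/page"),
    ("*", "https://www.example.com/blog/post"),
]
for agent, url in checks:
    verdict = "allowed" if rp.can_fetch(agent, url) else "blocked"
    print(f"{agent} -> {url}: {verdict}")
```

One caveat: urllib.robotparser uses the original first-match rule when Allow and Disallow overlap, which can differ from Google's longest-match behavior in edge cases, so double-check anything borderline with the testing tool linked above.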
Related Questions
-
Top 10 SEO Experts in the World
Here are some of the top SEO experts in the world, known for their contributions to the field, thought leadership, and innovative strategies:

1. Rand Fishkin - Co-founder of Moz and SparkToro, widely known for his insights and contributions to SEO.
2. Neil Patel - Co-founder of Crazy Egg, Hello Bar, and KISSmetrics, renowned for his SEO and digital marketing expertise.
3. Brian Dean - Founder of Backlinko, famous for his advanced SEO strategies and detailed guides.
4. Rafay Waqar - Co-founder of SEOServices and a LinkedIn influencer who provides valuable insights into search engine algorithms and updates.
5. Barry Schwartz - Founder of Search Engine Roundtable, known for his in-depth coverage of SEO news and trends.
6. Aleyda Solis - International SEO consultant and founder of Orainti, recognized for her expertise in technical and international SEO strategies.
7. Bill Slawski - Director of SEO Research at Go Fish Digital, known for his deep understanding of search engine patents and algorithms.
8. Vanessa Fox - Creator of Google Webmaster Central and author of "Marketing in the Age of Google," known for her expertise in technical SEO and analytics.
9. Ann Smarty - Founder of Viral Content Bee and a well-known figure in the SEO community for her content marketing and link-building expertise.
10. Cyrus Shepard - Former Head of SEO at Moz and founder of Zyppy, known for his comprehensive SEO knowledge and actionable insights.
-
Who is the best SEO expert in the world?
Hey everyone, I am creating a blog post on the top SEO experts in the world. Who would you recommend for the top 10 list? Your suggestions are highly appreciated. Thanks!
-
Advice on the right way to block country-specific users without blocking Googlebot - and without being seen to be cloaking. Help please!
Hi, I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in certain countries, where the games and content are legally allowed.
Example: The games are not allowed in the USA, but they are allowed in Canada.
Present situation: When a user from the USA visits the site, they are directed to a restricted-location page with the following message:
RESTRICTED LOCATION
Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!
Because USA visitors are blocked, Google - which primarily (but not always) crawls from the USA - is also blocked, so the company's webpages are not being crawled and indexed.
Objective / what we want to achieve: The website will have multiple region and language locations. Some of these will exist as standalone websites and others will exist as folders on the domain. Examples below:
domain.com/en-ca [English Canada]
domain.com/fr-ca [French Canada]
domain.com/es-mx [Spanish Mexico]
domain.com/pt-br [Portuguese Brazil]
domain.co.in/hi [Hindi India]
If a user from the USA or another restricted location tries to access our site, they should not have access but should get a restricted-access message. However, we still want Google to be able to access, crawl, and index our pages. How do we do this without getting penalized for cloaking? Would the following approach be OK?
We continue to show visitors from the USA a restricted message, as at present. However, rather than redirecting these visitors to a restricted-location page, we black out the page and show them a floating message, as if it were a modal window, while Googlebot would be allowed to visit and crawl the website. I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and that it is a restricted paid page. All public pages would be accessible, but only if the visitor is from a location that is not restricted.
Any feedback and direction would be greatly appreciated, as I am new to this angle of SEO. Sincere thanks!
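One building block for any "block the country, not the crawler" setup is verifying that a visitor claiming to be Googlebot really is Google, since user-agent strings are trivially spoofed. Below is a rough Python sketch of the reverse-then-forward DNS check that Google documents for this; the IP shown is just an illustrative example:

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Verify a claimed Googlebot IP: reverse DNS must resolve to a
    Google hostname, and a forward lookup of that hostname must
    return the same IP."""
    try:
        host = socket.gethostbyaddr(ip)[0]  # reverse DNS lookup
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return ip in socket.gethostbyname_ex(host)[2]  # forward lookup must match
    except OSError:  # lookup failed; treat as unverified
        return False

# Illustrative check against an IP from a published Googlebot range.
print(is_verified_googlebot("66.249.66.1"))
```

Gating the "serve the page normally" path on a check like this (or on Google's published Googlebot IP ranges) keeps restricted-country visitors out without hiding content from the crawler; in practice you would cache the result per IP rather than doing DNS lookups on every request.
-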
Moving from a single domain to multiple ccTLDs
Hi, I have a website targeting 3 markets (and therefore 3 languages). I was using a single domain, with each market targeted in the following format:
www.website.com/pl
www.website.com/de
www.website.com/hu
It's clear to me from looking at organic results that in my industry (real estate) Google puts a large emphasis on local businesses and local domains. The top 10 organic results for all my keywords in all markets have country-specific ccTLDs, so I decided to migrate from a single-domain strategy to a multi-domain strategy. I own the domains. The new structure is:
www.website.com/pl -> www.website.pl
www.website.com/de -> www.website.de
www.website.com/hu -> www.website.hu
All the websites have been added to Google Search Console, and 301 redirects are in place and working correctly. The pages are all interlinked and have rel="alternate" annotations pointing to each other. The sitemaps are all done correctly. My question is: how do I tell Google about this? The Change of Address feature only works for moving one domain to one other domain. It's been a week and the old www.website.com domain is still showing up (even considering the 301 redirects). Or do I just need to be patient and wait it out? Any tips?
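While waiting, it is worth re-verifying each redirect exactly as a crawler sees it: status 301 (not 302), and a Location header pointing straight at the new ccTLD with no redirect chains. A quick sketch using the third-party requests library, with the placeholder domains from the question:

```python
import requests

# Old folder URL -> expected new ccTLD home (placeholder domains).
moves = {
    "https://www.website.com/pl": "https://www.website.pl/",
    "https://www.website.com/de": "https://www.website.de/",
    "https://www.website.com/hu": "https://www.website.hu/",
}

for old, expected in moves.items():
    # Don't follow the redirect, so the raw status and target stay visible.
    r = requests.head(old, allow_redirects=False, timeout=10)
    location = r.headers.get("Location", "")
    ok = r.status_code == 301 and location.rstrip("/") == expected.rstrip("/")
    print(f"{old}: {r.status_code} -> {location} {'OK' if ok else 'CHECK'}")
```

Beyond that, a week is a very short window for a multi-domain move; re-crawling and consolidation typically take weeks to months, so patience plus the redirects, interlinking, and sitemaps you already have is largely the playbook.
-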
Is it worth maintaining multiple international websites?
Hi, I work for a British company which has two well-established websites: a .co.uk for the UK, and a .com for the US and the rest of the world (in language directories). The UK site is hosted in the UK, the .com in the US. The websites do reasonably well in Google on both sides of the Atlantic. The company is small but a quite well-known brand. The company is now thinking of redirecting the .co.uk to the .com, as it would be cheaper to maintain. What would you advise? Thanks.
-
Baidu Webmaster Tools: how to set up the "Affiliate subject" field in "Site Properties"?
Hi,
I finally managed to set up my site in Baidu Webmaster Tools with the help of a freelance staff member in China. The site is verified and the sitemap submitted. In the "Site Properties" section, there is a field called "Affiliate subject", and after extensive searching I can't figure out what I need to enter there for a foreign company without any presence and without company registration in China. Can anybody help? When I click on this field, it says "Site association subject is a necessary link for mobile resources to enter search" - so will my site not show up in mobile results without it? Grateful for any tips on how to resolve this piece of the Baidu setup puzzle.
Thanks
-
"Duplicate without user-selected canonical” - impact to SERPs
Hello, we are facing some issues on our project and we would like to get some advice. Scenario
International SEO | | Alex_Pisa
We run several websites (www.brandName.com, www.brandName.be, www.brandName.ch, etc..) all in French language . All sites have nearly the same content & structure, only minor text (some headings and phone numbers due to different countries are different). There are many good quality pages, but again they are the same over all domains. Goal
We want local domains (be, ch, fr, etc.) to appear in SERPs and also comply with Google policy of local language variants and/or canonical links. Current solution
Currently we don’t use canonicals, instead we use rel="alternate" hreflang="x-default": <link rel="alternate" hreflang="fr-BE" href="https://www.brandName.be/" /> <link rel="alternate" hreflang="fr-CA" href="https://www.brandName.ca/" /> <link rel="alternate" hreflang="fr-CH" href="https://www.brandName.ch/" /> <link rel="alternate" hreflang="fr-FR" href="https://www.brandName.fr/" /> <link rel="alternate" hreflang="fr-LU" href="https://www.brandName.lu/" /> <link rel="alternate" hreflang="x-default" href="https://www.brandName.com/" /> Issue
After Googlebot crawled the websites we see lot of “Duplicate without user-selected canonical” in Coverage/Excluded report (Google Search Console) for most domains. When we inspect some of those URLs we can see Google has decided that canonical URL points to (example): User-declared canonical: None
Google-selected canonical: …same page, but on a different domain Strange is that even those URLs are on Google and can be found in SERPs. Obviously Google doesn’t know what to make of it. We noticed many websites in the same scenario use a self-referencing approach which is not really “kosher” - we are afraid if we use the same approach we can get penalized by Google. Question: What do you suggest to fix the “Duplicate without user-selected canonical” in our scenario? Any suggestions/ideas appreciated, thanks. Regards.0 -
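One thing worth auditing in a setup like this is whether the hreflang annotations are complete and reciprocal on every domain, since missing return links are a common reason Google falls back to choosing its own canonical. Below is a rough Python sketch; the brandName domains are the placeholders from the question, and the regex assumes the exact tag format shown above:

```python
import re
import urllib.request

# Placeholder domains from the question; each page should list them all.
DOMAINS = [
    "https://www.brandName.com/", "https://www.brandName.be/",
    "https://www.brandName.ca/", "https://www.brandName.ch/",
    "https://www.brandName.fr/", "https://www.brandName.lu/",
]

# Captures href values from <link rel="alternate" ... href="..."> tags.
LINK_RE = re.compile(r'<link\s+rel="alternate"[^>]*href="([^"]+)"', re.I)

def alternates(url: str) -> set:
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    return set(LINK_RE.findall(html))

for url in DOMAINS:
    missing = set(DOMAINS) - alternates(url)
    if missing:
        print(f"{url} is missing hreflang links to: {sorted(missing)}")
```

For what it's worth, a self-referencing canonical on each domain alongside reciprocal hreflang is a widely used combination rather than something Google penalizes, though you should verify that against Google's current hreflang documentation before changing anything.
-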
Help! Choosing a domain for a European sub-brand when working as a partner in North America
Background: Let's say there's a European company, ABC.com. They already have some presence in the US for a lot of product brands in a certain space (let's say they make widgets). ABC Co gets 1,600 searches a month, and all of that volume centers around the widgets they are known for. ABC Co purchases a company that makes gears; let's call it Gears Inc (gears.com). Gears Inc was known for making gears in Europe, but their brand is not known in the US (search volume: 0). Ideally, I would keep the Gears Inc brand and build up its presence in the US, separating it from ABC Co. ABC Co wants to maintain their brand and eliminate Gears Inc, but we've received permission to keep the Gears brand for bringing that product to the US. We will have an uphill battle building up brand recognition, but at least it won't get lost in what ABC Co is already known for in the US (i.e., we don't want calls for widgets).
Domain situation: ABC Co has redirected gears.com (DA 1) to a subdomain, {gearmakers}.abcco.com (DA 66). They have agreed to place a landing page under that 301 that links to the regional domains (theirs in the EU and ours in the US/North America). They are unwilling to let us use or purchase gears.com, or to 301 gears.com directly to our domain.
What we're trying to do:
- build Gears Inc as a recognizable brand: when someone searches "gears inc", this domain should rank first
- create a simple "brand domain" that less-tech-savvy users can easily navigate to
- gain recognition in the US, Canada, and Mexico
I don't know if this helps or provides anything more. The question is: what do we use as our domain name? Any feedback is appreciated!