Will password protecting my test sub-domain help keep the SEs from indexing it?
-
Hi, all. I'm working in an unfamiliar area here, so I hope someone can tell me if I'm out in left field.
I am building a sub-domain called http://test.mysite.com, so that I can upload a client's still-under-construction site while working on it. When completed, it'll go up on his server, replacing his old site. Obviously, I want to ensure that it doesn't get indexed while it's on my test platform.
A friend suggested that I password it with htaccess and htpasswd, since we can never be certain the SEs will obey site directives.
My question is, what do you think would be the best (and hopefully, simplest) way to accomplish this?
I'm no code-monkey, so "simple" is a big plus!
Doc
By the way, the platform will be the WordPress CMS.
-
A different Matt here, but I have to agree that you need to password protect the site. This isn't just protection against crawlers, but also against anyone else who might be snooping around. Unless your client is okay with their work being released into the wild early, you should password protect it.
The good news is that many hosting companies have control-panel tools that will automagically generate the .htaccess files for you.
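If your host doesn't have such a tool, it can be done by hand. A minimal sketch (the `doc` username, the `changeme` password, and the paths are placeholders — adjust them to your own server layout):

```shell
# Generate an Apache-compatible password entry (APR1/MD5 format) with
# openssl, so the htpasswd utility isn't required. 'doc'/'changeme'
# are placeholder credentials -- pick your own.
printf 'doc:%s\n' "$(openssl passwd -apr1 'changeme')" > .htpasswd

# A matching .htaccess for the subdomain's document root. AuthUserFile
# must be the ABSOLUTE path to the .htpasswd file, and the file should
# ideally live outside the web root.
cat > .htaccess <<'EOF'
AuthType Basic
AuthName "Client preview - login required"
AuthUserFile /full/path/to/.htpasswd
Require valid-user
EOF
```

Once that's in place, every request to the test subdomain gets a 401 until a browser supplies the username and password — which also keeps crawlers out, since they never authenticate.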
-
Thanks, Darryl-
Passwording the site seemed like a good option, although I wasn't aware that Matt had ever stated that. That being the case, it would certainly seem like the way to go. Thanks for the input!
-
Also, a good way to go is the following:
- tell search engines to go away in robots.txt
- insert a meta noindex tag on every page
- block access in .htaccess as well
That said, Matt Cutts stated that the only 100% sure way is to password protect the folder.
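For concreteness, the first two items might look like this (with test.mysite.com standing in for the actual test subdomain):

```
# robots.txt, served at http://test.mysite.com/robots.txt
User-agent: *
Disallow: /
```

```html
<!-- in the <head> of every page -->
<meta name="robots" content="noindex, nofollow">
```

One caveat worth knowing: robots.txt only blocks crawling, so a blocked URL can still end up in the index if someone links to it — and a crawler that's barred by robots.txt never gets to see the noindex tag. That's exactly why the password is the only 100% method.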
-
Thanks for the response, Matt. So you feel like that's a sure way? There seem to be some differing opinions on whether or not all the SEs will respect that. I had always thought it was a solid way to do it, too, but some of the arguments I'm hearing have me in doubt now.
-
.htaccess password protection is a very simple way to keep crawlers off the site. If they can't access the pages, they certainly can't index them.