Using 2 wildcards in the robots.txt file
-
I have a URL string which I don't want to be indexed. It includes the characters _Q1 in the middle of the string.
So in the robots.txt, can I use 2 wildcards in the string to take out all of the URLs with that in it? So something like /*_Q1*. Will that pick up and block every URL with those characters in the string?
Also, this is not directly off the root, but in a secondary directory, so .com/.../_Q1. So do I have to format the robots.txt as /*/_Q1* since it will be in the second folder, or will just using /*_Q1* pick up everything no matter what folder it is in?
Thanks.
-
I'm not 100% positive, but something along these lines should work:
User-agent: *
Disallow: /*_Q1
One caution: a trailing $ (i.e. Disallow: /*_Q1$) anchors the pattern to the end of the URL, so it would only block URLs that end in _Q1. Leaving the $ off blocks any URL containing _Q1, no matter which folder it sits in.
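As a sanity check, the wildcard matching that Google applies to robots.txt paths can be sketched in Python. This is only a sketch of the pattern-matching rules (* matches any character sequence, a trailing $ anchors to the end of the path), not a full robots.txt evaluator, and the URL paths below are made-up examples:

```python
import re

def robots_pattern_to_regex(pattern: str) -> "re.Pattern":
    """Translate a Google-style robots.txt path pattern into a regex.

    '*' matches any sequence of characters; a trailing '$' anchors the
    pattern to the end of the URL path.
    """
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    return re.compile(regex + ("$" if anchored else ""))

# Without '$': blocks any path containing _Q1, in any folder.
rule = robots_pattern_to_regex("/*_Q1")
print(bool(rule.match("/folder/page_Q1-item")))  # True  -> blocked
print(bool(rule.match("/folder/page")))          # False -> allowed

# With '$': only blocks paths that *end* with _Q1.
anchored_rule = robots_pattern_to_regex("/*_Q1$")
print(bool(anchored_rule.match("/folder/page_Q1-item")))  # False -> allowed
print(bool(anchored_rule.match("/folder/page_Q1")))       # True  -> blocked
```

Note that Python's standard urllib.robotparser does simple prefix matching and does not implement these wildcard extensions, which is why the sketch builds its own regex.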
Related Questions
-
How to compete against search terms that use geo-modifiers?
I should start by saying we are new to SEO. We are introducing new “cycling tours” in new destinations and we are looking for a strategy to combat geo-modified keyword searches. When people search for “cycling tours” they will anchor their search with a geo-modifier such as “cycling tours France” or “cycling tours Italy”. Based in Australia, we are keen to communicate to Australians searching for international cycling tours that there are new Australian options they may wish to consider. The geo-modifiers required to find our tours (“eyre peninsula” and “carnarvon gorge”) are currently not on the cycling community's radar. For example, to find one of our new tours you need to use “cycling tours eyre peninsula” or “cycling tours carnarvon gorge”. Currently the only solution we have found to let people know about our new tours is word of mouth. Is there an SEO solution?
Intermediate & Advanced SEO | | Chook10 -
Will using CDN Affect SEO?
I've got a website with a slider, and each of the 6 slides has a 5-second video background. The website is B2B and the user profile for the website is employees at Fortune 1000 companies in the United States using desktop computers to browse. The videos are highly optimized and we did testing using various browsers and bandwidth connections to determine that the videos loaded fast enough down to a 15mbit/s connection (which is pretty low by today's average U.S. business bandwidths). We tried hosting the videos on Vimeo and YouTube but it caused issues in the timing of the slide show display. (I've not seen any other website do what we do the way we do it. Most sites have a single video background with a single text overlay on top.) The downside to this is that loading all those videos produces a lot of bandwidth usage for our server. The website serves a niche service industry though, so we're not exceeding our current limits. I'm wondering, though, might there be some benefit to hosting just the video files on a CDN? Obviously that would mean less bandwidth usage for our server, and possibly quicker load times where the CDN server is closer to the user than our server. But are there benefits or downsides from an SEO perspective, noting that I'm proposing only putting the videos on the CDN, not the entire web page?
Intermediate & Advanced SEO | | Consult19010 -
Pages blocked by robots in Webmaster Tools
A mistake was made in our software. How can I solve the problem quickly? Please help.
Intermediate & Advanced SEO | | mihoreis0 -
Question about robots file on mobile devices
Hi, we have a robots.txt file, but do I need to create a separate file for the m.site or can I just add the line into my normal robots file? I've just read the Google guidelines (what a great read it was) and couldn't find my answer. Thanks in advance, Andy
Intermediate & Advanced SEO | | Andy-Halliday0 -
2 pages ranking for the same term
Hi everyone, I have had two pages ranking on page two of Google for the same term for a while now. I have tried dedicating one page to it, but as the other has a URL containing the search term, Google seems to be ranking both. How can I tell Google which one to rank without deindexing either page? I imagine if it only ranked one page I would get a higher result rather than two weaker ones? On-site work has been done, and so have links to the homepage, but the inner page still ranks as well since it has the search term in its URL. Would a canonical tag be worth it here? The page is getting some traffic of its own for other terms, though, so I am reluctant to do that. Any help much appreciated.
Intermediate & Advanced SEO | | tdigital0 -
Blocking poor quality content areas with robots.txt
I found an interesting discussion on Search Engine Roundtable where Barry Schwartz and others were discussing using robots.txt to block low-quality content areas affected by Panda. http://www.seroundtable.com/google-farmer-advice-13090.html The article is a bit dated. I was wondering what current opinions are on this. We have some dynamically generated content pages which we tried to improve after Panda. Resources have been limited and, alas, they are still there. Until we can officially remove them I thought it may be a good idea to just block the entire directory. I would also remove them from my sitemaps and resubmit. There are links coming in, but I could redirect the important ones (was going to do that anyway). Thoughts?
Intermediate & Advanced SEO | | Eric_edvisors0 -
Could targeting 2 geographic locations decrease rankings?
Hello, I think that us targeting 2 different geographic locations (San Francisco, CA and Salt Lake City, UT) is negatively affecting the rank of some of our main keywords. My evidence for this: Since December our main keyword (NLP) dropped in ranking for nlpca(dot)com from about 19th to about 40th. This is about when we started to really target 2 different locations. Other main keywords dropped a lot as well, like the important term "NLP Training". Also, our name, nlpca(dot)com, indicates NLP California (CA stands for California in Google). The other day we bolded a sentence with the words "Salt Lake City, Utah" at the top of our content, and in one of Google's databases (the one I was looking at) we dropped in rankings for "NLP California", where we used to be completely sitelinked (where we took up a lot of space at the top of the search). Also, we shot up to 1st on my datacenter for both "NLP Utah" and "NLP Salt Lake City". At the same time, our rankings for the term "NLP" dropped off the map. It has come back up, but we've also targeted California again. A lot of our anchor text has the word "California" in it. We're thinking about building a separate site for Utah and just linking to it from the California website when we need to. Does it sound to you, in your experience, that targeting both locations in our case is what's causing a decrease in rankings? Thank you!
Intermediate & Advanced SEO | | BobGW0 -
Are there any benefits to having dashes in file names?
Through searching, I can find lots of discussion regarding "dash vs underscore", but am having trouble with an even simpler question: Is there any SEO difference between using http://www.broadway.com/shows/milliondollarquartet.php vs. http://www.broadway.com/shows/million-dollar-quartet.php
Intermediate & Advanced SEO | | RyanWhitney150