Disallow: /sr/ and Disallow: /si/ - robots.txt
-
Hello Mozzers - I've come across the two directives above in the robots.txt file of a website. The web dev who implemented the robots.txt isn't sure what they mean - I think it's just legacy stuff that nobody has analysed for years. I vaguely recall "sr" meaning "search request", but I can't remember for certain.
If any of you know what these directives do, then please let me know.
-
Thanks Tomas and Mike - good advice. I've done that and found legacy functionality they've since moved away from - there is indeed no current use for the directives.
I wonder whether there's any resource on the web that lists all robots.txt directives and interprets them. If not, then perhaps it would be an idea for Moz?
-
Have a look at your site through http://web.archive.org/. You'll be able to see what the directories were used for.
However, if there's no use for them on the current site then what's the purpose of keeping these disallows in the robots.txt?
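For reference, a Disallow rule simply blocks compliant crawlers from any URL whose path starts with the given prefix - it says nothing about what the directory was for. You can check the effect of the two rules from the question with Python's standard-library parser (the User-agent line and the example.com paths are placeholders, not from the original file):

```python
from urllib import robotparser

# Reconstruct the directives from the question; the User-agent scope is assumed.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /sr/",
    "Disallow: /si/",
])

# Any URL whose path starts with /sr/ or /si/ is blocked for compliant bots;
# everything else remains crawlable.
print(rp.can_fetch("*", "https://example.com/sr/anything"))  # False
print(rp.can_fetch("*", "https://example.com/about/"))       # True
```

If the directories no longer exist on the current site, removing the rules is harmless; keeping them only matters if old /sr/ or /si/ URLs still resolve and you want them kept out of crawlers' queues.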
Related Questions
-
Ability to Transition Completed Wordpress Website to New Coder/Developer
We have worked with the same WordPress developer since 2012. They recently redesigned our WordPress site. We purchased a real estate theme and they performed major modifications to it; the project took 8 months, and there are many customized widgets and multiple plugins. We hired a new SEO who is very comfortable coding. The SEO performed certain modifications and the code broke. The original developer stepped in and helped restore the code. The SEO stated that the site should not be so delicate, and that too many plugins and widgets are used, making it inherently unstable. The original developer is claiming that the SEO did not follow best practices (they did not use a dev server to test). For a non-technical business owner this is very disturbing. We finally agreed that the new SEO would make changes on a dev server and the original developer would check these changes to ensure they do not break the code. My question is: shouldn't a WordPress site be simple enough to hand over to a decent coder with little risk of breaking the code? Are there any standards regarding the handover of a site? I am comfortable with my developers, but what if they change professions or close their company? How would I transition the site? There must be standards and protocols that allow a third party, such as an SEO, to change code without causing havoc. Anyone have some insight?
Web Design | Kingalan11
-
Pageless/Single Page Design and Migration Questions
Hello, We are starting a content audit and migration to a new CMS. We would like to present content in more of a pageless/single-page type design instead of having visitors drill down so many levels to find it. What should we be aware of from an SEO perspective? Here is an example of our current pages and structure: http://www.saintpetershcs.com/GraduateMedicalEducation/PediatricResidency/
Subpages include: Overview, Curriculum, Faculty, Residents, Benefits, How to Apply. Here is an example of what we would like to do:
http://themeforest.net/item/medicalpress-health-and-medical-wordpress-theme/full_screen_preview/7789703 Information is populated as you scroll. Duke Medicine also has something similar: https://www.dukemedicine.org/treatments/cancer What are your thoughts?
Web Design | sphcs
-
Bing Indexation and handling of X-ROBOTS tag or AngularJS
Hi MozCommunity, I have been tearing my hair out trying to figure out why Bing won't index a test site we're running. We're in the midst of upgrading one of our sites from archaic technology and infrastructure to a fully responsive version.
This new site is fully AngularJS-driven. There are currently over 2 million pages, and as we develop the new site in the backend we would like to test the tech with Google and Bing. We're looking at a pre-render option to create static HTML snapshots of the pages we care about most, which will be available in the sitemap.xml.gz. We set up three completely static HTML control pages: one with no robots meta tag on the page, one with a robots NOINDEX meta tag in the head section, and one with a dynamic header (X-Robots-Tag) carrying the same NOINDEX directive. We expected the one without the meta tag to at least get indexed, along with the homepage of the test site. In addition to those three control pages, we had an internal search results page with the dynamic NOINDEX header, a listing page with no such header, and the homepage with no such header. With Google, the correct indexation occurred: only three pages were indexed (the homepage, the listing page, and the control page without the meta tag). With Bing, however, there's nothing. No page indexed at all, not even the flat static HTML page without any robots directive. I have a valid sitemap.xml file and a robots.txt open to all engines across all pages, yet nothing. I used the Fetch as Bingbot tool, the SEO Analyzer tool, and the Page Preview tool within Bing Webmaster Tools, and they all show a preview of the requested pages, including the ones with the dynamic header asking Bing not to index them. I'm stumped. I don't know what to do next to understand whether Bing can accurately process dynamic headers or AngularJS content. Checking BWT, there has definitely been crawl activity, since it marked the XML sitemap as successful and shows 4 crawled pages. Still no results when running a site: command, though.
Google responded perfectly and understood exactly which pages to index and crawl. Has anyone else used dynamic headers or AngularJS and run similar tests? Thanks in advance for your assistance.
Web Design | AU-SEO
-
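One thing worth nailing down when debugging a setup like this: a page should be treated as non-indexable if a noindex directive appears in either the X-Robots-Tag response header or the robots meta tag, and the two sources combine. A minimal sketch of that check (the function name and inputs are illustrative, not part of any crawler or library API):

```python
def is_indexable(x_robots_header, meta_robots):
    """Return True only if neither the X-Robots-Tag HTTP header value nor
    the robots meta tag content contains a noindex directive."""
    directives = []
    for value in (x_robots_header, meta_robots):
        if value:
            directives += [d.strip().lower() for d in value.split(",")]
    return "noindex" not in directives

print(is_indexable(None, None))                 # True  (no directives at all)
print(is_indexable("noindex", None))            # False (blocked via header)
print(is_indexable(None, "noindex, nofollow"))  # False (blocked via meta tag)
```

Running a check like this against each of the six test pages (using the actual headers and meta tags served to Bingbot, which may differ from those served to browsers) can rule out the directives themselves and isolate the problem to Bing's rendering of the AngularJS content.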
Using Multiple links/names for the same product?
I am being asked to change these product links on the home page from Home/Condo and Watercraft/Boat to Home, Condo, Watercraft, and Boat (along with several other product links). How does this affect the customer experience/usability, and SEO? Is it a good idea or is it confusing? Thank you.
Web Design | RoxBrock
-
Tips on finding the right Senior Designer / Design Director
Hello Everyone, I manage a fairly large educational website that we are looking to completely redesign to improve the overall user experience and usability. In the past, I, a non-designer business person, would just roughly draw how I thought the site should look, and our developer (who is good, but not a web designer) would try his best to make everything look professional. Over the years, it has become painfully obvious that we need to invest in more design expertise and move towards a modern, smartly designed website. So my question: where are the best places to find good freelance designers? I have of course conducted web searches, browsed Elance, and asked my network for referrals. However, I am finding that most of the really good ones (those with the ability to take charge and lead us through this entire process, and with at least a basic understanding of SEO principles) work for larger integrated development shops that also expect their people to develop the new site. We already have a developer and are primarily looking for design expertise. Does anyone in the Moz community have any suggestions or even referrals? Thanks! Eric
Web Design | Eric_R
-
Best way to indicate multiple Lang/Locales for a site in the sitemap
So here is a question that may be obvious, but I'm wondering if there is some nuance I may be missing. Question: consider an ecommerce site that has multiple sites around the world which are all variations of the same thing, just in different languages. Some of these exist on a normal .com while others exist on different ccTLDs. When you build out the XML sitemap for these sites, especially the ones on the other ccTLDs, we want to ensure that using:

  <loc>http://www.example.co.uk/en_GB/</loc>
  <xhtml:link rel="alternate" hreflang="en-AU" href="http://www.example.com.AU/en_AU/" />
  <xhtml:link rel="alternate" hreflang="en-NZ" href="http://www.example.co.NZ/en_NZ/" />

would be the correct way of doing this. I know I have to change this for each different ccTLD, but it just looks weird when you start putting 10-15 different language/locale variations as alternate links. I guess I am just looking for a bit of reaffirmation that I am doing this right. Thanks!
Web Design | DRSearchEngOpt
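Since hand-writing 10-15 alternate links per sitemap entry gets unwieldy, it is common to generate them from a single locale map. A rough sketch of that approach (the locale-to-URL map below is illustrative; swap in your real ccTLD homepages):

```python
# Hypothetical locale -> URL map based on the examples in the question.
ALTERNATES = {
    "en-GB": "http://www.example.co.uk/en_GB/",
    "en-AU": "http://www.example.com.AU/en_AU/",
    "en-NZ": "http://www.example.co.NZ/en_NZ/",
}

def alternate_links():
    """Render one xhtml:link alternate element per locale, ready to drop
    inside each <url> entry of the sitemap."""
    return "\n".join(
        f'  <xhtml:link rel="alternate" hreflang="{lang}" href="{href}" />'
        for lang, href in ALTERNATES.items()
    )

print(alternate_links())
```

Each page's entry carries the full set of alternates (hreflang annotations must be reciprocal), so the same generated block is reused across every locale's sitemap, which is exactly why the list looks repetitive.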
How to verify http://bizdetox.com for google webmaster tools
Hey guys, I tried to make a Preferred Domain choice in Webmaster Tools, but it is not allowing me to save my choice because it's asking me to verify that I own http://bizdetox.com. How do I go about doing that, and what are the steps? I have already verified www.bizdetox.com.
Web Design | BizDetox
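For context on the steps involved: the Preferred Domain setting requires the www and non-www variants to be verified as two separate sites, so http://bizdetox.com needs to be added and verified on its own even though www.bizdetox.com already is. One common method is the HTML meta tag: place the tag Google generates for the new site inside the <head> of the homepage (the content token below is a placeholder, not a real value):

```html
<head>
  <!-- Placeholder token: Webmaster Tools generates a unique value per verified site -->
  <meta name="google-site-verification" content="YOUR_VERIFICATION_TOKEN" />
</head>
```

If you can't edit the page head, uploading the verification HTML file to the site root or adding a DNS TXT record are alternative verification methods offered by the tool.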