Cgi-bin folder
-
Is the cgi-bin folder of a site related to SEO in any way, and why do we always block the cgi-bin folder in robots.txt?
-
As eyepaq says - it's for holding executable "program" files. Search engines don't need access, so it gets blocked. If it's empty you can even remove it.
-
No. No correlation with SEO.
However, the same way you block your temp or admin folder, there is no need to let any bot crawl your cgi-bin folder.
The cgi-bin folder holds executable files - you don't want external sources to see it anyway; it's "very" private for security reasons. You need more than knowing what's inside the folder to do harm, but keeping it hidden is the wise call. Better safe than sorry.
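For reference, blocking the folder takes a single rule in robots.txt (a minimal sketch; adjust the path if your cgi-bin lives somewhere else on your server):

```
User-agent: *
Disallow: /cgi-bin/
```

Keep in mind robots.txt only asks well-behaved crawlers to stay out; it does not actually secure the folder, so correct server permissions still matter.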
Related Questions
-
Htaccess code to 301 redirect a folder change
Hi, I need some help redirecting my whole site after a folder change. E.g. the old structure was www.mysite.com/stuff-1/bags.html and I need it to go to the same structure without the "-1", e.g. /stuff/bags.html
The "bags.html" part will be lots of different products, so this would be a wildcard? What would the .htaccess code need to be? Thanks
Technical SEO | Paul_MC
-
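The wildcard redirect asked about above could be sketched in .htaccess like this (an illustrative sketch, assuming Apache with mod_rewrite enabled and that only the "-1" suffix on the folder changes):

```apache
RewriteEngine On
# 301-redirect /stuff-1/anything.html to /stuff/anything.html
RewriteRule ^stuff-1/(.*)$ /stuff/$1 [R=301,L]
```

If several folders lost a "-1" suffix, a broader pattern such as `^([^/]+)-1/(.*)$` with target `/$1/$2` would cover them all in one rule.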
Use of Location Folders
I'd like to understand the pros and cons of using a location subfolder as an SEO strategy (example: http://sqmedia.us/Dallas/content-marketing.html), where the /Dallas folder is holding all of my keyword-rich page titles. The strategy is to get local-SEO benefits from the use of the folder titled /Dallas (a folder which is unnecessary in the overall structure of this site), but how much is this strategy taking away from the page-title keyword effectiveness?
Technical SEO | sqmedia
-
Our UE team has presented me with a site structure where the content (folders) does not match the hierarchical directory structure (in the CME)
Our UE team has presented me with a new site structure where the content (folders) does not match the hierarchical directory structure (in the CME), i.e. sub-sectors, sectors and product pages are ALL just one directory off the root:
example.com/sector
example.com/sub-sector
example.com/productpage
FYI, a 'normal' folder hierarchy would be:
example.com/sector/
example.com/sector/sub-sector
example.com/sector/sub-sector/productpage
I cannot find any SEO disadvantages re: crawling; if anything the search engines will crawl more efficiently with clearly less depth, higher 'deep content', and better navigation, which is technically a sound solution with link consistency throughout: 1 to 2 clicks to all pages. The only disadvantage might be user confusion, which can be offset with contextual breadcrumbs. Are there any PURE SEO disadvantages to a structure this illogical? Note: this does not abuse any search engine guidelines. Thanks for reading, Rich
Technical SEO | richcowley
-
URL Folders and Naming Convention Changes?
1. We're looking for some clarification in regards to our URL structure. Currently, at the product level we have http://www.ties.com/v/a/elite-solid-black-black-tie, however the parent URL is http://www.ties.com/black-ties.
a. So here is the question: semantically, the naming convention of this URL is weird and doesn't follow logical patterns. In other words, should the product page be http://ties.com/black-ties/elite-solid-black-tie? How badly is this hurting us?
b. If we were to change the URL structure, should we do it in phases or all at once? We don't want to get penalized. We have well over 3,000 product pages.
Technical SEO | Ties.com
-
New URL or Folder Off Existing Site
I am working on a project promoting dining in a particular region of the southwest for a destination marketing company. The parent Web site is an authority in the region and ranks well for almost all terms related to the leisure experience there. A completely separate Web site was built to promote this culinary program, as it involves a committee of different stakeholders, but it's solely focused on the region. My question is this. The site is on a different CMS, etc., but the overall experience on the site is similar to the parent DMO site in terms of creative. The client has a brand-new domain that they purchased for this initiative, but we are also considering mapping the parent site URL to the new culinary site.
Parent: www.regionalsite.com
New themed site: www.regionalsite.com/theme/ or www.themeurl.com
My fear is that if I take the approach of the new URL, it will take forever for the site to build any link clout at all, as the client doesn't really get that working a link strategy is so critical. However, I know that having links from the regional site over to the theme URL will have an impact. Also, if I map the URL to a new folder off the parent domain, do I risk that 2nd-tier links on the micro-site will have a challenge getting indexed, as they will essentially be on tier 3? Any advice would be appreciated.
Technical SEO | VERBInteractive
-
OK to block /js/ folder using robots.txt?
I know Matt Cutts suggests we allow bots to crawl CSS and JavaScript folders (http://www.youtube.com/watch?v=PNEipHjsEPU). But what if you have lots and lots of JS and you don't want to waste precious crawl resources? Also, as we update and improve the JavaScript on our site, we iterate the version number: ?v=1.1... 1.2... 1.3... etc. The legacy versions show up in Google Webmaster Tools as 404s. For example:
http://www.discoverafrica.com/js/global_functions.js?v=1.1
http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
http://www.discoverafrica.com/js/global.js?v=1.2
http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
http://www.discoverafrica.com/js/json2.js?v=1.1
Wouldn't it just be easier to prevent Googlebot from crawling the js folder altogether? Isn't that what robots.txt was made for? Just to be clear: we are NOT doing any sneaky redirects or other dodgy JavaScript hacks. We're just trying to power our content and UX elegantly with JavaScript. What do you guys say: obey Matt? Or run the JavaScript gauntlet?
Technical SEO | AndreVanKets
-
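For what it's worth, the block being debated above is a one-line robots.txt entry (a sketch; note that disallowing the folder also prevents Google from fetching those files when it renders pages, which is the trade-off Matt Cutts warns about):

```
User-agent: *
Disallow: /js/
```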
SEO problem if homepage is 2 folders deep?
We are currently looking at a site for a client where, instead of featuring a standard file structure, every folder is being buried two folders deep by the CMS. So the homepage is: www.domain.com.au/folder/folder And a subpage is: www.domain.com.au/folder/folder/subpage Is this necessarily an SEO problem? Will it be positive for rankings to pull out the two redundant folders? Any insights are appreciated! Cheers
Technical SEO | MarketingResults
-
CGI Parameters: should we worry about duplicate content?
Hi, my question is about CGI parameters. I was able to dig up a bit of content on this, but I want to make sure I understand the concept of CGI parameters and how they can affect the indexing of pages. Here are two pages:
No CGI parameter appended to the end of the URL: http://www.nytimes.com/2011/04/13/world/asia/13japan.html
CGI parameter appended to the end of the URL: http://www.nytimes.com/2011/04/13/world/asia/13japan.html?pagewanted=2&ref=homepage&src=mv
Questions: Can we safely say that CGI parameters = URL parameters appended to the end of a URL, or are they different? And given that you have rel canonical implemented correctly on your pages, will search engines go ahead and index only the URL that is specified in that tag? Thanks in advance for your insights. I look forward to your response. Best regards, Jackson
Technical SEO | jackson_lo
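To illustrate the rel canonical point from the question above, the parameterized page would carry a canonical link pointing back to the clean URL (a sketch using the NYT URLs from the question):

```html
<!-- In the <head> of 13japan.html?pagewanted=2&ref=homepage&src=mv -->
<link rel="canonical" href="http://www.nytimes.com/2011/04/13/world/asia/13japan.html" />
```

The canonical tag is a hint rather than a directive, but when implemented consistently, search engines generally consolidate indexing and link signals onto the URL it names.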