Directories...
-
So, I have a website with a few pages in directories, and the rest are normal pages with extensions (i.e. example.com/blah.html instead of example.com/blah/).
Now, the directory page isn't ranking yet for its target keyword (although I am still in the process of building links to the page with anchor text). Could that be because it's the odd man out, being one of the only pages within a directory?
Also, I would really like to move all my pages into directories, but some of the internal pages are ranking really well and I don't want to lose that after switching. Has anyone had experience using 301s to redirect to subdirectories without losing rankings?
-
The reason I would like to make the change is to make the URLs easier for visitors to remember, and to make them look more professional. I personally feel that example.com/blah.html just looks kind of sloppy.
Thanks for your answers.
-
The first question I would ask is, why do you wish to change your current structure? What benefit does it offer you or your visitors? If there isn't a clear, direct benefit I would advise against making a change.
If there is a benefit and you wish to make the change, it shouldn't be a problem at all. You can re-arrange your site as long as you 301 every page properly. There will be a minor loss of link juice, but if this is indeed an improvement for your site, that loss should be offset by your improved site navigation.
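To make "301 every page properly" concrete: one low-risk approach is to keep an explicit old-to-new map in one place and generate the server rules from it, so no page gets missed. A minimal Python sketch (the page names and the Apache-style `Redirect 301` output are illustrative assumptions, not your actual site):

```python
def apache_redirects(redirect_map):
    """Emit one mod_alias-style 'Redirect 301' line per page, sorted for easy review."""
    return "\n".join(
        f"Redirect 301 {old} {new}"
        for old, new in sorted(redirect_map.items())
    )

# Hypothetical old ".html" pages mapped to their new directory-style URLs.
REDIRECT_MAP = {
    "/blah.html": "/blah/",
    "/about.html": "/about/",
}

print(apache_redirects(REDIRECT_MAP))
```

Keeping the map in one place also makes it easy to spot-check that every old URL redirects exactly once, with no redirect chains.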
I don't believe the directory page you mention is failing to rank because it is the only page in a directory. The question I would ask is: how deep is that page within your site? I don't mean how many directories deep; I mean how many clicks a user would need to make to arrive at that page from your home page. If a user can get there in one click, it should be indexed relatively quickly. If it takes several clicks, then the page is buried within your site and it can take longer to be indexed.
-
I think time is the biggest element here, as there is a chance that the content has yet to be crawled or indexed. While I have read that there is benefit to having links as close to the root page/domain as possible, I don't believe pages being a directory or two away matters all that much.
You would definitely want to set up proper redirects if you are going to change your URL structure site-wide. If you are using a platform like WordPress, this is rather easy.
You might also want to consider keeping the current structure intact and creating links on the previously well-ranked page. I would most likely go with the 301 redirect, but I have seen popular blogs/sites simply put a link on the old page pointing to the new page. That way the popular page is still seen, and the new link is easily crawled and indexed.
Related Questions
-
Does it hurt your SEO to have an inaccessible directory in your site structure?
Due to CMS constraints, there may be some nodes in our site tree that are inaccessible and will automatically redirect to their parent folder. Here's an example: www.site.com/folder1/folder2/content, /folder2 redirects to /folder1. This would only be for the single URL itself, not the subpages (i.e. /folder1/folder2/content and anything below that would be accessible). Is there any real risk in this approach from a technical SEO perspective? I'm thinking this is likely a non-issue but I'm hoping someone with more experience can confirm. Another potential option is to have /folder2 accessible (it would be 100% identical to /folder1, long story) and use a canonical tag to point back to /folder1. I'm still waiting to hear if this is possible. Thanks in advance!
Intermediate & Advanced SEO | digitalcrc
Robots.txt: how to exclude sub-directories correctly?
Hello here, I am trying to figure out the correct way to tell search engines to crawl this: http://www.mysite.com/directory/ But not this: http://www.mysite.com/directory/sub-directory/ or this: http://www.mysite.com/directory/sub-directory2/sub-directory/... Since I have thousands of sub-directories with almost infinite combinations, I can't list the definitions in a manageable way: Disallow: /directory/sub-directory/ Disallow: /directory/sub-directory2/ Disallow: /directory/sub-directory/sub-directory/ Disallow: /directory/sub-directory2/subdirectory/ etc... I would end up with thousands of definitions to disallow all the possible sub-directory combinations. So, is the following a correct, better and shorter way to define what I want above: Allow: /directory/$ Disallow: /directory/* Would the above work? Any thoughts are very welcome! Thank you in advance. Best, Fab.
Intermediate & Advanced SEO | fablau
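For what it's worth, the `Allow: /directory/$` plus `Disallow: /directory/*` pattern in the question above matches how Google documents its wildcard handling: the longest matching rule wins, and ties go to Allow. A small Python sketch of that matching logic (an illustration of the documented behavior, not a substitute for testing in Google's robots.txt tester):

```python
import re

def _rule_matches(rule, path):
    """Translate a robots.txt path rule ('*' wildcard, optional '$' end anchor)
    into a regex anchored at the start of the URL path, and test it."""
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, path) is not None

def is_allowed(path, allows, disallows):
    """Longest matching rule wins; ties go to Allow (Google's documented rule).
    If nothing matches, crawling is allowed by default."""
    best_allow = max((len(r) for r in allows if _rule_matches(r, path)), default=-1)
    best_disallow = max((len(r) for r in disallows if _rule_matches(r, path)), default=-1)
    return best_allow >= best_disallow
```

Under these rules, `/directory/` itself stays crawlable (the `$`-anchored Allow ties the Disallow and wins), while anything deeper is blocked.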
How is Google crawling and indexing this directory listing?
We have three Directory Listing pages that are being indexed by Google: http://www.ccisolutions.com/StoreFront/jsp/ http://www.ccisolutions.com/StoreFront/jsp/html/ http://www.ccisolutions.com/StoreFront/jsp/pdf/ How and why is Googlebot crawling and indexing these pages? Nothing else links to them (although /jsp/html/ and /jsp/pdf/ both link back to /jsp/). They aren't disallowed in our robots.txt file, and I understand that this could be why. If we add them to our robots.txt file and disallow them, will this prevent Googlebot from crawling and indexing those Directory Listing pages without prohibiting it from crawling and indexing the content that resides there, which is used to populate pages on our site? Having these pages indexed in Google is causing a myriad of issues, not the least of which is duplicate content. For example, the file CCI-SALES-STAFF.HTML (which appears on the Directory Listing referenced above, http://www.ccisolutions.com/StoreFront/jsp/html/) clicks through to this Web page: http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML This page is indexed in Google and we don't want it to be. But so is the actual page where we intended that content to display: http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff As you can see, this results in duplicate content problems. Is there a way to disallow Googlebot from crawling that Directory Listing page and, provided that we have http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff in our sitemap, solve the duplicate content issue as a result? For example: Disallow: /StoreFront/jsp/ Disallow: /StoreFront/jsp/html/ Disallow: /StoreFront/jsp/pdf/ Can we do this without risking blocking Googlebot from content we do want crawled and indexed? Many thanks in advance for any and all help on this one!
Intermediate & Advanced SEO | danatanseo
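Since the proposed `Disallow` lines in the question above are plain path prefixes (no wildcards), Python's standard-library robots.txt parser can sanity-check them. A quick sketch, assuming the stdlib's prefix matching mirrors Googlebot's handling of non-wildcard rules:

```python
import urllib.robotparser

# The robots.txt proposed in the question; plain prefixes need no wildcard support.
rules = """\
User-agent: *
Disallow: /StoreFront/jsp/
Disallow: /StoreFront/jsp/html/
Disallow: /StoreFront/jsp/pdf/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A file under the directory listing should be blocked from crawling...
listing_crawlable = rp.can_fetch(
    "*", "http://www.ccisolutions.com/StoreFront/jsp/html/CCI-SALES-STAFF.HTML")
# ...while the intended category page does not share the prefix and stays crawlable.
category_crawlable = rp.can_fetch(
    "*", "http://www.ccisolutions.com/StoreFront/category/meet-our-sales-staff")
```

Because `/StoreFront/category/...` doesn't share the disallowed prefix, blocking the `/jsp/` listings shouldn't keep the intended category pages from being crawled.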
De-indexed Link Directory
Howdy guys, I'm currently working through our 4th reconsideration request and have a couple of questions. Using Link Detox's (www.linkresearchtools.com) new tool, they have flagged 64 links as toxic that should be removed. After analysing them further, a lot, if not most, of them are link directories that have now been de-indexed by Google. Do you think we should still ask for them to be removed, or is this a pointless exercise since the link has effectively already been removed by the de-indexing? Would like your views on this, guys.
Intermediate & Advanced SEO | ScottBaxterWW
Any reason not to redirect entire directory from old site structure to new?
I'm helping on a site that has tons of content and recently moved from a 10-year-old .ASP structure to WordPress. There are ~800 404s, with 99% of them in the same directory, which is no longer used at all. The old URLs offer no indication of what the old page content was, so there is basically no way to manually redirect page by page to the new site at this point. Is there any reason not to redirect that entire old directory to the new homepage? Matt Cutts seems to think it's OK to point an entire old directory to a new homepage, but it's not as good as 1:1 redirects: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93633 Any thoughts?
Intermediate & Advanced SEO | wattssw
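A catch-all like the one described in the question above can be expressed as a single prefix rule rather than ~800 individual redirects. A hedged Python sketch of the logic (the directory name is hypothetical; in practice the rule would usually live in .htaccess or a WordPress redirect plugin):

```python
from urllib.parse import urlparse

OLD_DIRECTORY = "/old-asp-section/"     # hypothetical name of the retired directory
NEW_HOME = "https://www.example.com/"   # placeholder homepage

def redirect_for(url):
    """Return a 301 target for any URL under the retired directory, else None."""
    path = urlparse(url).path
    if path.startswith(OLD_DIRECTORY):
        return NEW_HOME
    return None
```

The prefix check means every old .ASP URL under the directory collapses to one destination, while URLs elsewhere on the site are left alone.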
Directory VS Article Directory
Which got hit harder in the Penguin update? I was looking at SEER Interactive's backlink profile (the SEO company that didn't rank for its main keyword phrases) and noticed a pretty big trend that might explain why it doesn't rank for its domain name. SEER appeared in a majority of the anchor text, much of it coming from directories. I'm guessing they were affected because they matched the exact-match-domain link profile pattern. I'm not an expert programmer, but if I were playing "Google programmer" I would think the algo update went something like: If ((exact match domain) && (certain % of anchor text == domain) && (certain % of anchor text == partial domain + services/company)) { tank the rankings }. So back to the question: do you think this update had a lot to do with directories, article directories, or neither? Are article directories still a legit way to get links? (not Ezine)
Intermediate & Advanced SEO | imageworks-261290
URL Structure for Directory Site
We have a directory that we're building and we're not sure if we should try to make each page an extension of the root domain or utilize sub-directories as users narrow down their selection. What is the best practice here for maximizing your SERP authority? Choice #1 - Hyphenated architecture (no sub-folders): 1) State page /state/ 2) City page /city-state/ 3) Business page /business-city-state/ 4) Location page /locationname-city-state/ Or... Choice #2 - Using sub-folders on drill down: 1) State page /state/ 2) City page /state/city 3) Business page /state/city/business/ 4) Location page /locationname-city-state/ Again, just to clarify, I need help in determining what the best methodology is for achieving the greatest SEO benefits. Just by looking it would seem that choice #1 would work better because the URLs are very clear and SEF. But, at the same time it may be less intuitive for search. I'm not sure. What do you think?
Intermediate & Advanced SEO | knowyourbank
Redirecting www.example.com to www.example.com/directory/
Hi all, there's been some internal debate going back and forth about redirecting the homepage of a site to a directory. There are a few different POVs circulating, one of which is that it's no different than redirecting to an /index page. Basically, the homepage is ranking for the keyword that we want the directory to rank for, but I can't seem to justify placing this type of redirect. The content on both pages is different, but the homepage and the directory both make sense to rank for the term. Has anyone ever done anything like this before? Can anyone see any reason to do something like this? I believe this move would dilute the link value we currently have going to the homepage and potentially cause us to lose our #2 slot with the homepage in favor of a lower spot with the directory. I'd love to hear any thoughts on this and learn whether anyone has experimented with this tactic. Thanks in advance!
Intermediate & Advanced SEO | JamieCottle28