Real Vs. Virtual Directory Question
-
Hi everyone. Thanks in advance for the assistance. We are reformatting the URL structure of our very content-rich website (thousands of pages) into a cleaner stovepipe model, so our pages will have a URL structure something like http://oursite.com/topic-name/category-name/subcategory-name/title.html, etc.
My question is… is there any additional benefit to having the path /topic-name/category-name/subcategory-name/title.html literally exist on our server as a real directory? Our plan was to just use HTACCESS to point that URL pattern to a single script that parses the URL structure and builds the appropriate page.
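To give a rough idea, the rewrite rule we have in mind would be something like the sketch below (this assumes Apache with mod_rewrite enabled, and the build-page.php script name is just a placeholder for our page-building script):

    # Rough sketch only - assumes Apache with mod_rewrite enabled.
    # Any request that doesn't map to a real file or directory is handed to
    # one script, which parses the path and builds the appropriate page.
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^(.+)\.html$ build-page.php?path=$1 [L,QSA]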
Do search engine spiders know the difference between these two models and prefer one over the other? From our standpoint, managing a single HTACCESS file and a handful of page building scripts would be infinitely easier than a huge, complicated directory structure of real files. And while this makes sense to us, the HTACCESS model wouldn't be considered some kind of black hat scheme, would it?
Thank you again for the help and looking forward to your thoughts!
-
At a fundamental level, your data is stored somewhere and rendered correctly. In a CMS that data sits in a database, completely outside of search engine view, so it makes no difference whether it lives in a database or in a physical directory. There is no benefit to keeping the same structure physically on disk.
Having said that, in my own experience (we manage a website with millions of pages), managing this with a hand-written HTACCESS script is NOT a good idea. You will be limited in what you can do, and maintaining it will be quite challenging.
I strongly suggest considering a move to a CMS (like Drupal): store all your content in a database and let the CMS handle the HTACCESS rules, plus you get other goodies. There are several tools available to get your content from disk into a database.
-
Search engines can't tell the difference, so you're all good either way.
-
I believe the preferred method is the HTAccess file. When we reformatted the URLs on our site, this was the most efficient, cleanest way to do it. This kind of dynamic redirect protects you from 404 pages and from losing your page value. I didn't see any negative effects from restructuring this way. I had about 6,000 pages that each had to change URL (we migrated to a completely new platform and file server, so we had to change them), and it was a nightmare.
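Just to illustrate what I mean by a dynamic redirect (this is only a sketch - the old and new URL patterns below are invented, since the right rule depends entirely on how your old URLs map onto the new structure), a single pattern-based 301 in the HTAccess file can remap a whole family of old URLs at once, so visitors and link equity follow the move instead of hitting 404s:

    # Sketch only: 301-redirect an old flat URL pattern to the new nested structure,
    # e.g. /article-123-some-title.html -> /topic-name/category-name/subcategory-name/some-title.html
    RewriteEngine On
    RewriteRule ^article-\d+-(.+)\.html$ /topic-name/category-name/subcategory-name/$1.html [R=301,L]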
I hope that is helpful. I don't see one method benefiting you in the search engines more than the other. I would suggest doing whatever involves the least work, is the cleanest to implement, and will keep your URLs clean and free of erroneous information in the long run.
Related Questions
-
Question regarding subdomains and duplicate content
Hey everyone, I have another question regarding duplicate content. We are planning on launching a new sector in our industry to satisfy a niche. Our main site works as a directory with listings that include NAP details. The new sector we are launching will take all of the content on the main site and duplicate it on a subdomain. We still want the subdomain to rank organically, but I'm torn between putting a rel=canonical back to the main site and using a self-referencing canonical, which leaves me with duplicates. The other idea is to rewrite the content on each listing so that the menu items stay the same but the listing description is different. Do you think that would be enough differentiating content that it won't be seen as duplicate? Obviously making this part of the main site would be the best option, but unfortunately we can't do that. Last question: what are the advantages and disadvantages of using a subdomain?
White Hat / Black Hat SEO | | imjonny0 -
Good vs Bad Web directories
Hi, in this blog post Rand mentions a list of bad web directories. I asked a couple of years ago whether there is an updated list, since some of these (Alive Directory, for example) no longer seem to be blacklisted and are coming up in Google searches. Because of the age of the post (seven years ago), the comments are no longer responded to. Would anyone be able to advise which of these directories are still good to use? https://moz.com/blog/what-makes-a-good-web-directory-and-why-google-penalized-dozens-of-bad-ones
White Hat / Black Hat SEO | | IsaCleanse0 -
Are link directories still effective? is there a risk?
We've contracted a traditional SEO firm, mostly for link building. As part of their plan they want to submit our site to a large list of link directories, and we're not sure if that's a good option. As far as we know, those directories have been ineffective for a long time now, and we're wondering if there is a chance of getting penalized by Google. When I asked the agency their opinion about that, they gave me the following answer:
- Updated and optimized by us - we are partnered with these sites and control their quality.
- Unique Class C IP address - links from unique referring Class C IPs play a very important role in SEO.
- Powered by high PR backlinks.
- Domain Authority (DA) score of over 20.
- These directories are well categorized.
So they actually control those directories themselves, which we think is even worse. I'm wondering what the Moz community thinks about link directory submission - is there still something to be gained there, is there any risk involved, etc. Thanks!
White Hat / Black Hat SEO | | binpress0 -
Macrae's Blue Book Directory LIsting
Does anyone have more information about this directory? Is it a good-quality directory that I should pay to get listed on?
White Hat / Black Hat SEO | | EcomLkwd0 -
Google places VS position one ranking above the places.
Hi guys, will creating a new Google Places listing for a business have any effect on their current position-one spot for their major geo-location keyword (i.e. "restaurants perth")? Say they are ranking no. 1 above all the Places listings - if they set up a Places listing, would they lose that position and merge with all the other Places accounts, or would they keep that listing as well as the Places listing? I have been advised it could be detrimental to set up the Places account. If this is the case, does anyone know any ways around this issue, as the business really needs a Places page for Google Maps etc.? Appreciate some guidance. Thanks. BC
White Hat / Black Hat SEO | | Bodie0 -
Rel Noindex Nofollow tag vs meta noindex nofollow
Hi Mozzers, here is something I was pondering this morning and would love to hear your opinion on.
We had a bit of an issue on our client's website at the beginning of the year. I tried to find a way around it by using wildcards in my robots.txt, but because different search engines treat wildcards differently it didn't work out so well, and only some search engines understood what I was trying to do.
So here goes: a large number of URLs on the website carry a ?filter parameter pushed from the database. We use filters on the site so users can find what they are looking for more easily, which results in database-driven ?filter URLs (those ugly &^% URLs we all hate so much).
What we are looking to do is implement nofollow noindex on all the internal links pointing to the ?filter parameter URLs. However, my SEO sense is telling me that the noindex nofollow should rather go in the meta robots of the individual ?filter parameter URLs instead of on all the internal links pointing to them. Am I right in thinking this way? (The reason we want to put it on the internal links at the moment is that the development company says it doesn't have control over the metadata of these database-driven parameter URLs.)
If I am not mistaken, noindex nofollow on the internal links could be seen as PageRank sculpting, whereas on-page meta robots noindex nofollow is more of a command, like your robots.txt. Has anyone tested this before, or have more knowledge on the finer details of noindex nofollow?
PS: canonical tags are also not doable at this point because we are still in the process of cleaning out all the parameter URLs, so about 70% of the URLs don't yet have an SEO-friendly URL to be canonicalized to. Would love to hear your thoughts on this. Thanks, Chris Captivate.
White Hat / Black Hat SEO | | DROIDSTERS0 -
A Straight Answer to Outsourcing Backlinking, Directory Submission and Social Bookmarking
Hey SEOmoz Community! I've spent a bit of time now reading about SEO in books as well as online here within the SEOmoz community. However, I've still struggled to find a straight answer to whether or not directory submissions to non-penalized websites are acceptable. I suspect the reason I haven't found a straight YES or NO answer is because it isn't so straightforward, and I respect that.
My dilemma is as follows: I want to raise the domain authority of a few websites that I optimize for. I've submitted and gotten listed a bunch of excellent backlinks, but it is still a painfully slow process. My clients understandably want to see results faster, and because they have virtually no past outsourced link-building campaigns, I am beginning to think that I can invest some money in outsourcing directory submissions.
I see more and more people talking about the latest Penguin updates and how many of these sites are now penalized. BUT, is there any harm in submitting to directories, such as the ones on SEOmoz's spreadsheet, that aren't penalized? My concern is that in the future these will be penalized anyway - is there a chance then that my site will also be de-listed from Google? At what point does Google completely 'blacklist' your site from its engine? Furthermore, I don't understand how Google can penalize a website to the point of de-listing it, because what would prevent competitors from sending masses of spammy backlinks to another site?
What it all comes down to: at this point, are verified mass directory submissions through outsourcing still much more beneficial than detrimental to the ranking of a website? Thanks SEOmoz community, Sheldon
White Hat / Black Hat SEO | | swzhai0 -
Subdomains vs. Subfolders Wordpress Multisite
I am in the process of redesigning my organization's website using WordPress multisite. I am currently planning on creating subdomains for each of the locations, as I thought that having a keyword-saturated domain name would provide the best rankings. So the Omaha office would look like this: omaha.example.com. Would it be better to go with example.com/omaha? Things to consider:
- Google AdWords is currently a huge source of our traffic. Despite having very good organic rankings, we receive most of our traffic from pay-per-click sources.
- The "display URL" has a dramatic effect on our CTR, so I want to avoid subfolders if possible (for example, OmahaEmergencyDental.com receives far more click-throughs than EmergencyDental.com).
- Each location currently has its own domain and website (omahaemergencydental.com); these sites/pages have been in place for several years.
Thanks in advance!
White Hat / Black Hat SEO | | LoganYard0