Question about web site structure
-
Is there an SEO advantage for individual pages to be in subfolders vs. not being in a folder? Of course, site management is easier with folders if you have 100s of pages... and clearly a shorter URL is easier for humans to navigate.
vs.
-
I'm not sure I understand your question John, and the second URL is a 404. Could you expand your question a bit? Thanks!
-
Hi Keri and community,
So is it a link juice mistake on my site?
www.shearerpainting.com/recyclepaint.index
I created the subfolder and landing page specifically for a new campaign, "recycle paint," with video, content, and links.
-
I've gone ahead and marked this thread as answered, even though we haven't heard from John. Thanks for your great recap!
-
Hi John. I'm surprised this question isn't considered answered yet as the group seems to touch on all the bases. Here's a recap:
- Richard Getz highlights the ability to add keywords via folders but cautions against adding too many folders (historically due to crawling issues).
- David Lenehan cautions against too many folders causing duplicate content issues and ungainly website architecture.
- Keri Morgret highlights the usefulness of folders in Analytics to help track specific portions of content. Moz also discusses this in their excellent post: http://www.seomoz.org/blog/a-powerful-analytics-tip-every-website-should-employ
"By segmenting out traffic to URLs that include /blog/ and those that include /ugc/ (YOUmoz), we can see when/where/how each section is rising or falling in traffic and contributing to the overall site's performance."
- Fatwallet cautions against spam negating benefits in either and emphasizes linking as driving more value regardless.
- Aaron Dicks recommends a CMS to give you strength and flexibility in organizing your content.
- Pashmina reminds us of the usefulness of redirects when curtailing duplicate content or sending lost link strength to a page in greater need.
And finally, since you were asking about domain/folder/page vs. domain/page, you're not going to run into subdomain issues. If a short folder category makes sense in analytics, it's definitely worth it. Just look at the URL above... we're in the 'q' folder. Hope that helps.
-
John,
Did any of these responses answer your question, or do you still have more questions? If you could add a comment or mark a helpful response, that'd be great!
-
There are a lot of answers on here with regards to .html files and folders. The most efficient and easily-manageable solution here is to migrate to a good Content Management System that can handle categories and page parents (I prefer WordPress).
Products can be categorised one or two deep (the suggested max for search engines), and URLs will reflect the product description. E.g. if the item is a widget of type foo, a URL like www.example.com/foo/widget/product-name would be a great structure, as both foo and widget might be part of the search term for the product, and they will also appear on the product page naturally as you describe the product.
This also helps with the keyword cannibalisation problem, as you will be able to see through administration that there are multiple pages doing the same thing.
Essentially, in answer to your question, go one or two deep if it will help your users. Don't go more than 2, as search engines may not crawl that far if you have a young/non-authoritative domain.
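To make that concrete, here's a rough sketch of building a product URL capped at two category folders. The helper name, domain, and category values are made up for illustration, not from any real CMS:

```python
def build_product_url(domain, categories, slug, max_depth=2):
    # Slugify each category name and keep at most max_depth folders,
    # per the "one or two deep" suggestion above.
    folders = [c.lower().replace(" ", "-") for c in categories[:max_depth]]
    return "/".join(["http://" + domain] + folders + [slug])

print(build_product_url("www.example.com", ["Foo", "Widget"], "product-name"))
# http://www.example.com/foo/widget/product-name
```

A third category would simply be dropped from the path, keeping the URL within the two-folder cap.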
Hope this helps,
Aaron
-
I'm going to vote for 1 folder level. There's no hard evidence of it, but it's possible that the juice would not pass through as well if there are many directories/subdirectories.
-
I agree with Richard and Joel. No more than 3 levels deep for categories. And I'd like to add that it's good to create redirects for alternative categories or links. If a product can belong to 2 subcategories, have both links work, e.g.:
domain.com/category/subcategory-primary/product.html -> would be the main link
domain.com/category/subcategory-secondary/product.html -> would redirect to the above
And while this is not necessarily an SEO advantage, having clean, short, and organized categories helps create a good user experience and easy wayfinding for your users, which leads to higher conversion rates.
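As a rough sketch of that redirect logic (the paths are the hypothetical ones from the example above, not a real site), the server-side rule is just a lookup from alternate URLs to the one canonical link:

```python
# Hypothetical map of alternate category paths to the canonical ("main") link.
CANONICAL = {
    "/category/subcategory-secondary/product.html":
        "/category/subcategory-primary/product.html",
}

def resolve(path):
    # A known alternate URL gets a 301 to the canonical path.
    if path in CANONICAL:
        return 301, CANONICAL[path]
    # Anything else is served as-is.
    return 200, path

print(resolve("/category/subcategory-secondary/product.html"))
# (301, '/category/subcategory-primary/product.html')
```

In practice you'd express the same mapping as 301 rules in your server config or CMS rather than application code, but the lookup is the same idea.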
-
That's a really good point.
I'm glad GA form fields accept regular expressions.
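For what it's worth, the folder-matching regexes GA accepts can be sketched outside GA too. The section names and paths here are just illustrative:

```python
import re

# Bucket pageviews by top-level folder, the way a GA filter or segment would.
SECTIONS = {
    "blog": re.compile(r"^/blog/"),
    "ugc": re.compile(r"^/ugc/"),
}

def section_of(path):
    for name, pattern in SECTIONS.items():
        if pattern.search(path):
            return name
    return "other"

hits = ["/blog/post-1", "/ugc/youmoz-entry", "/blog/post-2", "/about"]
counts = {}
for p in hits:
    counts[section_of(p)] = counts.get(section_of(p), 0) + 1
print(counts)  # {'blog': 2, 'ugc': 1, 'other': 1}
```

This is exactly why a consistent folder structure pays off: the segmentation is one anchored regex per section.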
-
Don't overlook the usefulness of folders when it comes to Google Analytics. Lunametrics has a post on designing a site that is friendly with GA at http://www.lunametrics.com/blog/2010/09/22/designing-google-analytics-friendly-site/.
-
Google is going to trust your link structure more than the subdirectories in the URL.
Make your URLs clean and try to get a good keyword in there, but DON'T stuff them and make them obnoxious and spammy.
There IS typically an advantage to a flatter architecture, but if the content is rich and the long-tail potential is high, a deeper architecture will serve just fine.
-
Too many subfolders isn't good for SEO, and you can have problems with duplicate content. Personally I would go with the first option. I try to ensure products have the following URL structure:
store.com/product-laser-gadget.html
You need to avoid the following situation:
store.com/gadget/product-laser-gadget.html
-
Thanks Joel can you give me an example?
www.donuts/glazed/chocolate glazed
-
If Susan does not get you some backlinks, nothing will
-
Richard, you are going to laugh... so I just made a new video and added it to the sitemap, and I am jacked up about getting another keyword in, so I made
http://www.shearerpainting.com/PaintColors/susanmarinello.php
but my HTML is so poor I can't figure out menus, page architecture, blah, blah, so I temporarily put this up:
-
None that I can think of.
If you have categories, it not only allows the use of another keyword, but you also get to make a landing page for that keyword. domain.com/category/index.php would be used for keywords and also for redirecting link juice once a product was deleted, as explained here (scroll to the bottom):
http://www.seomoz.org/q/what-do-you-do-about-links-to-constantly-moving-pages
When you link build, you can use these landing pages to point links to. Also good for link baiting.
There are several reasons to have these types of pages. It depends on what your site is composed of, but you can add videos, how-tos, related blog posts, etc.
All of which point a user in a direction, attract links, and help get link juice to deeper pages.
-
Thanks Richard. Is there any advantage to having all my pages go only 1 deep?
-
Sure, the most obvious is the use of a keyword, but don't go more than 3 deep: domain.com/category1/category2/product.html