Correct linking to the /index of a site and subfolders: what's the best practice? Link to domain.com/ or domain.com/index.html?
-
Dear all,
starting with my .htaccess file:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.inlinear\.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]

RewriteCond %{THE_REQUEST} ^.*/index\.html
RewriteRule ^(.*)index\.html$ http://inlinear.com/ [R=301,L]

1. I redirect all URL requests with www. to the non-www version.
2. All requests for "index.html" are redirected to "domain.com/".

My questions are:
A) When linking from a page to my front page (home), is the best practice to link to "http://domain.com/" and NOT to "http://domain.com/index.php"?
B) When linking to the index of a subfolder, e.g. "http://domain.com/products/index.php", should I likewise link to "http://domain.com/products/" and not include the index.php?
C) When I define the canonical URL, should I also define it simply as "http://domain.com/products/", or should I in this case point to the actual file, "http://domain.com/products/index.php"?
Are A) and B) best practice? And C)?
Thanks for all replies!
Holger
-
I think you have it correct there. I always like to end in a slash for index pages
http://inlinear.com/ - this is your home index page
http://inlinear.com/products/ - this is your index page for the /products/ folder/group
http://inlinear.com/products/page.php - this is a page within the /products/ folder/group.
Hardly anyone exposes index pages like index.php or index.htm in URLs anymore; they are really not needed, as they just make the URL longer. End in the slash, and make sure you are consistent about ending with that slash (vs. dropping it) whenever you link to your index pages.
You would need to test the script you mention that rewrites the URL. It looks like it is making sure that the index page ends in a slash, but I could be wrong.
Side story: I have had a CMS that used http://inlinear.com/products as the index page for http://inlinear.com/products/, and this creates all kinds of issues (see the sketch just below for how Apache normally handles this).
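Related to both points above, a minimal sketch (assuming a stock Apache setup; index.php is just a stand-in for whatever your index file is called) of the mod_dir directives that normally take care of this, so the index file name never needs to appear in a link:

# Serve this file when a directory URL such as /products/ is requested
DirectoryIndex index.php

# Redirect /products to /products/ for real directories (this is Apache's default behaviour)
DirectorySlash On

With these in place, linking to http://inlinear.com/products/ is enough: Apache picks the index file itself, and a request for a real directory without the trailing slash gets redirected to the slash version.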
-
Most people are used to the URL simply ending in a slash rather than showing an index page. So even if you had a non-slashed version as your index page, people would link to the slash version and then you have to set up 301s to fix that (see the sketch below). Otherwise you end up with all kinds of duplicate page issues.
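A rough sketch of such a 301, assuming Apache with mod_rewrite and index.php as the index file name (mirroring the index.html rule in your question; drop it in under the RewriteEngine On you already have and test before relying on it):

# Only act on what the browser actually asked for (THE_REQUEST), so the index.php
# that internally serves /products/ is not caught in a redirect loop
RewriteCond %{THE_REQUEST} ^[A-Z]+\ (.*/)?index\.php[\s?]
# 301 any .../index.php request to the bare folder URL
RewriteRule ^(.*/)?index\.php$ /$1 [R=301,L]

After this, /products/index.php and /index.php permanently redirect to /products/ and /, and only the slash versions can accumulate links.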
-
I know Google Analytics looks at the slashes to group your content into reports.
So, in that example, an index page at http://inlinear.com/products
would NOT be included in reports alongside the pages in the /products/ group,
e.g. http://inlinear.com/products/page.php
or http://inlinear.com/products/anotherpage.php,
because /products is not "within" /products/. You then have a report on /products/ that leaves out the index page, and that is normally your most important page!
Good luck!
-
Thank you, but in practice how does it work without a file extension?
As I understand it, it's fine if I use the following link to point to my homepage index:
http://inlinear.com/ <--- without anything...
Likewise, when I link to the products page:
http://inlinear.com/products/ <--- again without anything (index.php)
But what about a specific page, for example in the products folder:
http://inlinear.com/products/my-product-1.php <--- how can I live without the extension?
I googled and found this .htaccess code. It seems to take away the .php and add a "/"... is this the best practice?
Options +FollowSymLinks -MultiViews

# Turn mod_rewrite on
RewriteEngine On
RewriteBase /

# Adding a trailing slash
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !/$
RewriteRule . %{REQUEST_URI}/ [L,R=301]

# Internally rewrite the extensionless URL to the matching .php file
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*?)/?$ /$1.php [L]

Is this what you mean?
-
Best practice for all three cases is to never use the file extensions. You should never link to URLs that include the extension, and make sure your .htaccess file doesn't expose the extensions for any reason moving forward. Why?
1. Let's say you decide to redo your site and it moves from PHP to another language like ASP. You would have to redirect every extension-based URL on the site and would shoot yourself in the foot with SEO, traffic and everything else. By not using file extensions, you give yourself flexibility down the road and can maintain a consistent URL structure.
2. Depending on your .htaccess/server settings, pages may get indexed both with and without the file extension. You would then essentially be running into duplicate content issues, negatively affecting your site and diluting your individual page authority (see the sketch below).
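To guard against exactly that, a common companion to the internal rewrite quoted above is a 301 from the .php URL to the clean URL, so only one version can get indexed. A rough sketch, assuming Apache with mod_rewrite and that every public page maps to a .php file (place it alongside your other rules and test the combination, since rule order matters):

# If the browser or a crawler explicitly requested something.php ...
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^\s?]+)\.php[\s?]
# ... 301 it to the extensionless URL; the internal rewrite then quietly serves the .php file again
RewriteRule ^ /%1 [R=301,L]

Using THE_REQUEST (the raw request line) rather than the rewritten URL is what keeps this from looping once the clean URL is mapped back to the .php file internally.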
As a side note, just be consistent with your internal linking. Whether or not you use relative links is something there can be some discussion around, but pick a route and stick with it, as long as you don't use the file extensions.