Internal file extension canonicalization
-
Ok, no doubt this is straightforward, but I'm finding it hard to find a simple answer: our website's internal pages have the .html extension, and trying to navigate to an internal URL without the .html extension results in a 404.
The question is: should a 301 redirect be used to point to the extension-less URL, to future-proof the site? And should internal links point to the extension-less URL for the same reason?
Hopefully that makes sense, and apologies for what I suspect has a straightforward answer.
-
As above
example/abc rewrites to example/abc.html
example/abc.html redirects to example/abc
and all internal links link to example/abc
-
Thank you for the replies.
I will try to clarify what I am getting at; apologies in advance for any naivety.
I understand homepage canonicalization; the confusion revolves around how this applies to internal pages.
Logically, I am struggling to see how internal pages are any different from the homepage in terms of the need to avoid multiple URLs, and thus an extension-less URL seemed appropriate. Not to mention the benefit of cleaner URLs: easier to link to, easier to remember, etc.
i.e.
example/abc
example/abc.html
example/abc/index.html
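Presumably a rel=canonical tag in each page's head would also help consolidate variants like these? A minimal example, assuming the extension-less URL is the preferred one (domain hypothetical):

```html
<!-- placed in the <head> of the page, whichever URL it is reached by -->
<link rel="canonical" href="https://example.com/abc" />
```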
-
As Nick said, you don't need to do this, but if you are:
1. REWRITE the new URL to the old URL, as your web server needs to know the extension.
2. REDIRECT the old URL to the new one, in case you already have links to the old URLs; you don't want duplicate content.
3. Make sure that all internal links point to the new URL; you don't want unnecessary redirects, as they leak link juice. (A sketch of steps 1 and 2 follows below.)
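For concreteness, here is a minimal .htaccess sketch of steps 1 and 2. It assumes an Apache server with mod_rewrite enabled (the thread never says which server is in use), so treat it as a starting point rather than a drop-in:

```apache
RewriteEngine On

# Step 2: 301-redirect direct requests for /abc.html to /abc.
# THE_REQUEST is the raw request line (e.g. "GET /abc.html HTTP/1.1"),
# so this condition only matches what the browser actually asked for
# and will not loop with the internal rewrite below.
RewriteCond %{THE_REQUEST} \s/([^\s]+)\.html[\s?]
RewriteRule ^ /%1 [R=301,L]

# Step 1: internally rewrite /abc back to /abc.html so the server can
# find the file on disk, but only when that file actually exists.
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^(.*)$ $1.html [L]
```

Step 3 is then just a matter of updating your templates so every internal link uses the extension-less form; with the rules above, any stray .html links that remain will still resolve via a single 301.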
-
I'm about to make a whole lot of assumptions about your website to give this answer, so just be aware.
Your website is built statically, using HTML files; hence the .html file extension. If you're seeing websites without file extensions, most likely they are using content management systems (or have some serious /folder/index.html structure going on).
Having a file extension like .html, .aspx, or .php is not a bad thing. On websites like yours it is required (unless you use the subfolder approach above), because the browser is grabbing an actual file rather than something dynamically generated by a CMS. It has nothing to do with future-proofing.
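To illustrate that subfolder approach with a hypothetical layout: most servers serve index.html as a directory's default document, so each page lives in its own folder and the visible URL carries no extension:

```
site-root/
    index.html        served at example.com/
    abc/
        index.html    served at example.com/abc/
```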
As for 301'ing non-extension URLs to extensioned ones... well, I don't know why you'd need to do that for your type of site.