Homepage Index vs Home vs Default?
-
Should your home page be www.yoursite.com/index.htm or home.htm or default.htm on an apache server? Someone asked me this, and I have no idea. On our wordpress site, I have never even seen this come up, but according to my friend, every homepage HAS to be one of those three. So my question is which one is best for an apache server site AND does it actually have to be one of those three?
Thanks,
Ruben
-
Thanks so much! I really appreciate the insight.
Ruben
-
To give you a little more information: yes, the server does matter here. Most Apache web servers are configured to serve one of the following files as the default page for a directory:

default.htm, default.html, index.htm, index.html, index.php

If you're using WordPress you may see index.php, because WordPress is a PHP application backed by a database. Regardless of what your friend told you, any static links that Google is going to see should be properly 301 redirected. This means a WordPress site ideally should not expose a publicly visible index file of any kind in its URLs.
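As a rough sketch of both points above (assuming an Apache server with mod_rewrite enabled, rules placed in the site's .htaccess — adjust for your host):

```apache
# The default-document search order is controlled by DirectoryIndex.
# The values here mirror the list above; your host's order may differ.
DirectoryIndex index.html index.htm index.php default.html default.htm

# 301 any request for a visible index/default file back to the bare
# directory URL, e.g. /blog/index.html -> /blog/
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /(.*/)?(index|default)\.(php|html?)[?\ ]
RewriteRule ^ /%1 [R=301,L]
```

Note that WordPress also performs its own canonical redirects, so on a stock install you may never need the rewrite rule at all.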
To cut to the chase, I'll cite a response that Matt Cutts, the head of Google's webspam team, gave to Yoast on the question "Should I add .html to my permalink structure?" (referenced from http://yoast.com/wordpress-seo-url-permalink/):

"I emailed Matt and asked whether it makes sense to add .html for systems like WordPress. His response: 'In general I wouldn't. My WP has urls like http://www.mattcutts.com/blog/remove-result/ and that's pretty ideal.'"

So: case closed.
A good collection of resources is posted as links below.

Yoast (http://yoast.com/wordpress/seo/) is an excellent source of WordPress knowledge. I strongly recommend the Yoast WordPress SEO plugin; aside from being one of the very best WordPress plugins for enhancing your site, Yoast also covers permalink structure in depth:

http://yoast.com/change-wordpress-permalink-structure/
http://yoast.com/wp-content/permalink-helper.php

If you need to make changes to your link structure, a great resource for understanding redirects is:

http://24ways.org/2013/url-rewriting-for-the-fearful/

If you need to redirect an index URL, the Moz link below collects .htaccess snippets (Nginx uses its own configuration syntax rather than .htaccess):

http://moz.com/blog/htaccess-file-snippets-for-seos
http://codex.wordpress.org/Linking_Posts_Pages_and_Categories
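Since Nginx came up: it does not read .htaccess files at all, so the same index redirect has to live in the server block. A hedged sketch of the equivalent rule:

```nginx
# Inside the server { } block:
# 301 requests for an index/default file back to the bare directory URL,
# e.g. /blog/index.php -> /blog/
location ~ ^(?<dir>.*/)(index|default)\.(php|html?)$ {
    return 301 $dir;
}
```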
-
Merry Christmas,
Ruben
Your friend is speaking about a 100% static HTML site. The answer is that your URLs should not end in www.yoursite.com/index.htm, home.htm, or default.htm. Using WordPress, your home page should simply be www.yoursite.com or www.yoursite.com/.

Some people, when they bring their site over from a static HTML site to WordPress, find that their permalinks change:

http://codex.wordpress.org/Using_Permalinks

If you do this, 301 redirect any links that the move to WordPress would change to the new permalink structure. A clean structure such as /%postname%/ serves most sites best; you may use one of the other structures if you feel like it, but I recommend against it.
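As a hedged .htaccess sketch of those 301s (the page names here are hypothetical — substitute your actual old and new URLs):

```apache
# One-to-one 301s from old static pages to new WordPress permalinks
Redirect 301 /about.htm /about/
Redirect 301 /services.htm /services/
```

For a large site, a blanket mod_rewrite rule mapping *.htm to the trailing-slash form can replace dozens of one-off lines, but only if every page follows the same naming pattern.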
Sincerely,
Thomas