Do I need to 301 redirect www.domain.com/index.html to www.domain.com/?
-
So, interestingly enough, the Moz crawler picked up my index.html file (the homepage) and, of course, reported duplicate content. However, Google doesn't seem to have indexed the www.domain.com/index.html version of my homepage, only the www.domain.com version. That said, I do have links pointing specifically to www.domain.com/index.html, and I want to make sure those count towards my overall domain strength.
Is it necessary to 301 redirect in the scenario described above?
-
Hi,
I tested the code mentioned above:
RewriteEngine On
RewriteBase /
RewriteRule ^(.*)index\.(html|php)$ http://%{HTTP_HOST}/$1 [R=301,L]

It works well for index.php, but not for XX-index.php. The site where I tested this code is bilingual, so there is also a GR-index.php file, and the code above redirects it to the root domain as well.
Another problem is that the code also redirects index.php inside any directory. For example, http://domain.com/directory/index.php is redirected to http://domain.com/directory/
How can I avoid this and keep only a "basic" redirection of http://domain.com/index.php to http://domain.com?
Yannis
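(For reference, a minimal sketch of how the rule could be narrowed so it only touches the root index file, assuming Apache mod_rewrite and a single .htaccess in the document root. The THE_REQUEST condition also guards against the redirect loop mentioned in another reply; treat this as a starting point rather than a drop-in fix.)

RewriteEngine On
RewriteBase /
# Fire only when the visitor's request line explicitly names the root index file,
# so GR-index.php, /directory/index.php, and internal DirectoryIndex lookups are left alone
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.(html|php)[\s?]
RewriteRule ^index\.(html|php)$ http://%{HTTP_HOST}/ [R=301,L]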
-
I would recommend doing that. The homepage may exist under different filenames depending on the CMS you are using:
index.htm
index.html
index.php
So, as @donford mentioned, you can fix that with .htaccess.
-
Yep,
This code inside your .htaccess file should fix that.
RewriteEngine On
RewriteBase /
RewriteRule ^(.*)index\.(html|php)$ http://%{HTTP_HOST}/$1 [R=301,L]

Hope it helps,
Don
-
Yes, you will want to redirect it. Be careful, though: this often isn't done correctly and creates a redirect loop. There are multiple ways to do it, including (but not limited to) DirectoryIndex or a RewriteRule.
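(A sketch of the DirectoryIndex side of that, assuming a typical Apache setup rather than anything from the original answer: the directive only decides which file is served for the bare directory URL, so it is normally paired with a redirect rule like the ones above rather than used on its own.)

# Serve index.html (or index.php) when a bare directory URL such as / is requested
DirectoryIndex index.html index.php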