How best to deal with www.home.com and www.home.com/index.html
-
Firstly, this is for an .asp site, and all my usual ways of fixing this (e.g. via .htaccess) don't seem to work.
I'm working on a site which has www.home.com and www.home.com/index.html - both URLs resolve to the same page/content.
If I simply drop a rel canonical into the page, will this solve my dupe content woes?
The canonical tag would then appear on both www.home.com and www.home.com/index.html.
If the above is OK, which version should I be going with?
- or -
Thanks in advance folks,
James @ Creatomatic
It certainly does help, many thanks Paul - hugely appreciated.
-
In this situation, using a canonical to point to the primary is a workaround, but the correct way to handle it is with a 301 redirect. Canonicals are for when both versions of a page need to remain reachable, but all the ranking influence should be consolidated to a single URL.
In this case, there is no functional reason to keep both URLs in the index and reachable at two different addresses, because they are the exact same page. The correct solution is therefore to 301 redirect the /index.html URL to the primary URL. (This is also the cleanest to maintain, passes the maximum amount of authority, and is best for usability.)
ASP sites are hosted on Microsoft IIS servers. IIS does not use or recognize .htaccess files. Instead, you will need to use the URL Rewrite Module. It should be preinstalled on most IIS servers, or you can request that your host/server admin add it. (If the server is older than IIS 7, you'll need a third-party ISAPI rewrite module instead of Microsoft's own module.)
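For reference, the 301 from /index.html to the root is a short rule in the site's web.config once the URL Rewrite Module is installed. A minimal sketch, assuming IIS 7+ and a web.config in the site root (the rule name is arbitrary and the pattern is illustrative):

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301 redirect /index.html to the root URL -->
        <rule name="Redirect index to root" stopProcessing="true">
          <match url="^index\.html$" />
          <action type="Redirect" url="/" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```

`redirectType="Permanent"` is what makes this a 301 rather than the default 302.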
Here's a TechRepublic article on using the Rewrite Module to perform the same sorts of functions as .htaccess does on Apache servers: http://ow.ly/fXSAB. In many ways, its basics are easier than .htaccess.
Note that you should also be redirecting the non-www version of the site to the fully qualified www domain name, if you haven't already.
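That non-www to www redirect can also be handled by a URL Rewrite rule in web.config. A hedged sketch using the example domain from the question (swap in the real hostname):

```xml
<!-- 301 redirect home.com/* to www.home.com/* -->
<rule name="Redirect non-www to www" stopProcessing="true">
  <match url="(.*)" />
  <conditions>
    <add input="{HTTP_HOST}" pattern="^home\.com$" />
  </conditions>
  <action type="Redirect" url="http://www.home.com/{R:1}" redirectType="Permanent" />
</rule>
```

This goes inside the same `<rules>` element as any other rewrite rules; `{R:1}` carries the matched path through to the target so deep URLs redirect to their www equivalents.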
Hope this helps?
Paul
-
That's correct - they are the same page.
To better explain, this is all done old-school via FTP, so any edits or changes I make to the file/page "index.html" apply to the following URLs:
Is there any harm in telling search engines that the Canonical version of a page IS the same page?
(Actually, there were LOADS more but I've got fixes in place for most of these)
-
Adam, unfortunately the method you link to won't work, because the two URLs in question here are actually the same page. If it were handled that way, you'd create an infinite redirect loop.
Paul
-
Hi James,
First, run a crawl on your site. Is the /index.html getting picked up in the crawl? If so then it is being linked to internally. Check the navigation bar(s) to see if the link to 'Home' is linking to /index.html. Once you have found all the internal links linking to /index.html, you will then need to change these to point to the home page without the filepath (e.g. http://www.example.com/).
The second step would be to implement a canonical tag on both pages that points to the home page without the filepath. So in your example that would be as follows:
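Using the example domain from the question as a placeholder, the tag in the `<head>` of both URLs would be:

```html
<link rel="canonical" href="http://www.home.com/" />
```

Both www.home.com and www.home.com/index.html would then declare the bare root as the canonical version.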
That is one way of solving duplicate content issues without using 301 redirects via .htaccess. However, I believe there is also a way to do this in ASP, though you would have to search around for it. I did a quick search and found this page that might be of help.
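For what it's worth, a server-side 301 can be written directly in a classic ASP page. This is a hypothetical sketch, assuming the home page is actually served by an .asp file (a static .html file will not execute this code), with the question's example domain as the target:

```asp
<%
' Classic ASP: issue a 301 redirect to the canonical root URL
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://www.home.com/"
Response.End
%>
```

`Response.Redirect` alone would send a 302, which is why the status line is set explicitly here.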
Hope that helps,
Adam.