Duplicate Content Resolution Suggestion?
-
The SEOmoz tools are reporting duplicate content for:
What would be the best way to resolve this "error"?
-
Does having the line:
DirectoryIndex index.html
have any use in addition to the lines you posted?
Thanks.
-
Stephen, I agree with the KISS method. Using an htaccess RewriteCond is not the simplest solution for someone who does not know htaccess syntax. In an effort to fully answer this, here is the typical code we are referring to:
Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^yoursite\.com$
RewriteRule (.*) http://www.yoursite.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/
RewriteRule ^index\.html$ http://www.yoursite.com/ [R=301,L]

The first 2 lines are typical commands to follow symbolic links and make sure that the rewrite engine is in the on state. (Note that the literal dots in the patterns are escaped so they match an actual "." rather than any character.)
The first RewriteCond looks at the host, and if it is not the www. version the RewriteRule will 301 redirect the visitor to the http://www. version of your site.
The second RewriteCond looks at whether the request is for the index.html file; if it is, the RewriteRule will 301 redirect the visitor to the version without the index.html, just http://www.yoursite.com/.
-
I'm a KISS guy: duplicate content pages should just be handled with a rewrite. Then they don't appear in any of your stats, attract links, or spread your like/tweet numbers over multiple pages, and if you are using XML files to keep tabs on your indexing, etc., it gives you a better idea of what's going on on your site.
Also, you have to take into account Facebook likes, page tweets, +1s, etc. - does rel=canonical work on the social graph data?
rel=canonical is for things like sort-order variants; if it's a pure duplicate, then 301.
"Don't link to page X on your site" isn't really a good solution in my eyes; too much room for error.
-
Completely agree.
I think I may have been slightly confused by thinking the default for www.mydomain.com/ was not index.html
-
Yes, by the original posting your impression is correct and they are the same page, but you can't simply 301 an index.html page to the domain when the index.html is the page that shows by default (a plain redirect would just loop back onto itself).
You could use an htaccess RewriteCond, but it could be a little overkill for this situation, where adding a canonical will solve it.
-
I was under the impression that www.mydomain.com/ and www.mydomain.com/index.html were both being indexed but are the same page
-
If the index.html is their home page that shows up when just doing the domain (http://www.mydomain.com/), then what would you 301 to? Are you assuming that it is a site that is using an index.php, index.htm, index.asp, etc.?
PlasticCards stated that there is duplicate content; therefore the index.html page actually exists and should use a canonical.
-
I think in that particular situation I would use a 301, as there really isn't a separate use for the /index.html page.
-
Or 301 redirect in your .htaccess, and then you don't have to worry about link issues, etc.
-
Use a rel="canonical" and use the non-index.html URL for the href;
also, don't link to the index.html from anywhere.
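For example, a minimal sketch of that tag (assuming www.yoursite.com stands in for the real domain), placed in the head section of index.html:

<link rel="canonical" href="http://www.yoursite.com/" />

With that in place, both http://www.yoursite.com/ and http://www.yoursite.com/index.html point the engines at the same preferred URL.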
Related Questions
-
International SEO And Duplicate Content Within The Same Language
Hello,
Currently, we have a .com English website serving an international clientele. As is the case, we do not currently target any countries in Google Search Console. However, the UK is an important market for us and we are seeing very low traffic (almost entirely US). We would like to increase visibility in the UK, but currently for English speakers only.
My question is this - would geo-targeting a subfolder have a positive impact on visibility/rankings, or would it create a duplicate content issue if both pieces of content are in English? My plan was:
1. Create a geo-targeted subfolder (website.com/uk/) that copies our website (we currently cannot create new unique content)
2. Go into GSC and geo-target the folder to the UK
3. Add the following to the /uk/ page to try to negate duplicate issues.
Additionally, I can add a rel=canonical tag if suggested; I just worry that, as an already international site, this will create competition between pages.
However, as we are currently only targeting a location and not the language at this very specific point, would adding a ccTLD be advised instead? The threat of duplicate content worries me less here, as this is a topic Matt Cutts has addressed and said is not an issue. I prefer the subfolder method to ccTLDs because it allows for more scalability, as in the future I would like to target other countries and languages.
Ultimately right now, the goal is to increase UK traffic. Outside of UK backlinks, would any of the above URL geo-targeting help drive traffic? Thanks
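The usual way to annotate a geo-targeted subfolder like /uk/ is a set of hreflang link elements. A hedged sketch only, assuming website.com is the real domain and English-only US/UK variants:

<link rel="alternate" hreflang="en-gb" href="https://website.com/uk/" />
<link rel="alternate" hreflang="en-us" href="https://website.com/" />
<link rel="alternate" hreflang="x-default" href="https://website.com/" />

hreflang tells Google the pages are intentional language/region variants rather than accidental duplicates; each variant should carry the full set of annotations, including one pointing at itself.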
-
Is this going to be seen by google as duplicate content
Hi All, Thanks in advance for any help that you can offer in regards to this. I have been conducting a bit of analysis of our server access file to see what googlebot is doing, where it is going, etc. Now firstly, I am not an SEO but have an interest. What I am seeing a lot of is that we have URLs with a parameter that sets the currency displayed on the products so that we can conduct AdWords campaigns in other countries; these show as follows: feedurl=AUD, feedurl=USD, feedurl=EUR, etc. What I can see is that googlebot is hitting a URL such as /some_product, then /someproduct?feedurl=USD, then /someproduct?feedurl=EUR, and then /someproduct?feedurl=AUD, all after each other. Now this is the same product page and just has the price shown slightly differently on each. Would this count as a duplicate content issue? Should I disavow feedurl? Any assistance that you can offer would be greatly appreciated. Thanks, Tim
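The usual fix for parameter variants like these (a sketch, not from the thread; it assumes the clean URL is the one you want indexed, with www.example.com standing in for the real domain) is a canonical served identically on every feedurl variant:

<link rel="canonical" href="http://www.example.com/some_product" />

That consolidates /some_product?feedurl=USD, ?feedurl=EUR, and ?feedurl=AUD onto /some_product. Note that "disavow" applies to backlinks, not URL parameters.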
-
Duplicate Page Content and Titles from Weebly Blog
Anyone familiar with Weebly that can offer some suggestions? I ran crawl diagnostics on my site and have some high-priority issues that appear to stem from Weebly blog posts. There are several of them, and it appears that each post is being counted as "page content" on the main blog feed and then again when it is tagged to a category. I hope this makes sense; I am new to SEO and this is really confusing. Thanks!
-
How to fix duplicate content errors with Go Daddy Site
I have a friend that uses a free GoDaddy template for his business website. I ran his site through Moz Crawl Diagnostics, and wow - 395 errors, mostly duplicate content and duplicate page titles. I dug further and found the site was doing this:
URL: www.businessname.com/page1.php
and the duplicate: businessname.com/page1.php
Essentially, the duplicate is missing the www. And it does this two hundred times. How do I explain to him what is happening?
-
Duplicate Content on SEO Pages
I'm trying to create a bunch of content pages, and I want to know if the shortcut I took is going to penalize me for duplicate content. Some background: we are an airport ground transportation search engine (www.mozio.com), and we constructed several airport transportation pages with the providers in a particular area listed. However, the problem is that sometimes in a certain region several of the same providers serve the same places. For instance, NYAS serves both JFK and LGA, and obviously SuperShuttle serves ~200 airports. So this means every airport's page has the SuperShuttle box. All the provider info is stored in a database with tags for the airports they serve, and then we dynamically create the page. A good example follows:
http://www.mozio.com/lga_airport_transportation/
http://www.mozio.com/jfk_airport_transportation/
http://www.mozio.com/ewr_airport_transportation/
All 3 of those pages have a lot in common. Now, I'm not sure, but they started out working decently, and as I added more and more pages the efficacy of them went down on the whole. Does what I've done qualify as "duplicate content", and would I be better off getting rid of some of the pages or somehow consolidating the info into a master page? Thanks!
-
Duplicate Content based on www.www
In trying to knock down the most common errors on our site, we've noticed we do have an issue with duplicate content; however, most of the duplicate content errors are due to our site being indexed with www.www and not just www. I am perplexed as to how this is happening. Searching through IIS, I see nothing that would be causing this, and we have no hostname records set up that are www.www. Does anyone know of any other things that may cause this and how we can go about remedying it?
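If www.www requests really are reaching the server, one way to collapse them is a redirect rule. A sketch only, with two assumptions: it uses the IIS URL Rewrite module (a separate install, not core IIS), and it goes inside <system.webServer> in web.config:

<rewrite>
  <rules>
    <rule name="Collapse www.www" stopProcessing="true">
      <match url="(.*)" />
      <conditions>
        <add input="{HTTP_HOST}" pattern="^www\.www\.(.+)$" />
      </conditions>
      <action type="Redirect" url="http://www.{C:1}/{R:1}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>

It is also worth checking DNS for a wildcard record, since a wildcard will resolve www.www even when no explicit hostname is set up.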
-
Duplicate content error from url generated
We are getting a duplicate content error, with "online form/" being returned numerous times. Upon inspecting the code, we are calling an input form via jQuery, which is initially called by something like this: Opens Form. Why would this be causing it to amend the URL and to be crawled?
-
Forget Duplicate Content, What to do With Very Similar Content?
All, I operate a Wordpress blog site that focuses on one specific area of the law. Our contributors are attorneys from across the country who write about our niche topic. I've done away with syndicated posts, but we still have numerous articles addressing many of the same issues/topics. In some cases 15 posts might address the same issue. The content isn't duplicate but it is very similar, outlining the same rules of law etc. I've had an SEO I trust tell me I should 301 some of the similar posts to one authoritative post on the subject. Is this a good idea? Would I be better served implementing canonical tags pointing to the "best of breed" on each subject? Or would I be better off being grateful that I receive original content on my niche topic and not doing anything? Would really appreciate some feedback. John