Duplicate content
-
The crawler shows the following links as duplicate.
How can I solve this issue?
-
Try this:
RewriteEngine on
RewriteCond %{THE_REQUEST} ^.*\/index\.html?\ HTTP/
RewriteRule ^(.*)index\.html?$ "/$1" [R=301,L]
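For reference, here is the same technique with each directive on its own line and commented; a sketch assuming the rules live in the site root's .htaccess with mod_rewrite enabled:

```apache
RewriteEngine On
# THE_REQUEST is the raw request line, e.g. "GET /about/index.html HTTP/1.1",
# so this condition only matches when the client literally asked for index.html.
RewriteCond %{THE_REQUEST} ^.*\/index\.html?\ HTTP/
# Capture everything before index.html and 301-redirect to it,
# e.g. /about/index.html -> /about/
RewriteRule ^(.*)index\.html?$ /$1 [R=301,L]
```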
-
This works. The non-www to www redirection is now set. Thanks.
Now how do I solve this?
Will your above code solve this one also?
-
Try adding this to your .htaccess:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^princetown\.in
RewriteRule (.*) http://www.princetown.in/$1 [R=301,L]
-
I added the following line to .htaccess:
RedirectPermanent /index.html http://www.mysite.com/
Now whenever I open the homepage I get the following message:
Moved Permanently
The document has moved here.
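That message usually indicates a redirect loop: Apache serves index.html internally for the homepage URL /, so the RedirectPermanent fires again on the redirect target. One common fix (a sketch, with mysite.com standing in for the real domain) is to match the raw request line instead, so the rule only triggers when the visitor explicitly asked for /index.html:

```apache
RewriteEngine On
# THE_REQUEST is the literal request line ("GET /index.html HTTP/1.1"),
# so this does NOT match when Apache merely serves index.html for "/".
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html?\ HTTP/
RewriteRule ^index\.html?$ http://www.mysite.com/ [R=301,L]
```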
-
Do I do it one by one?
Can I set all non-www URLs to www?
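You don't need one rule per URL; a single host-based rule can redirect every non-www URL to its www equivalent. A minimal sketch, assuming a single domain and mod_rewrite enabled:

```apache
RewriteEngine On
# Match any host that does not already start with "www." (case-insensitive).
RewriteCond %{HTTP_HOST} !^www\. [NC]
# Redirect to the same host with "www." prepended, keeping the full path.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
```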
-
Google essentially sees those 5 domains as individual sites and will list each separately in results.
Follow the guide bjgomer links you to: you need to set up your .htaccess file to resolve www and finish with either a trailing slash or index.html, the former usually being better. You also need to decide first whether you want your site to be http://mysite.com or http://www.mysite.com.
-
Set up a 301 redirect in your .htaccess file that designates the 'official' home page URL. This will eliminate the crawl errors.
http://www.bruceclay.com/blog/2007/03/how-to-properly-implement-a-301-redirect/
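Putting the two canonicalization steps together, here is a sketch of an .htaccess that designates one official homepage, assuming you choose http://www.mysite.com/ as canonical (swap in your own domain):

```apache
RewriteEngine On
# 1. Force www on every request (case-insensitive host match).
RewriteCond %{HTTP_HOST} ^mysite\.com [NC]
RewriteRule (.*) http://www.mysite.com/$1 [R=301,L]
# 2. Strip index.html so the homepage has a single canonical URL.
#    Matching THE_REQUEST avoids looping on Apache's internal index.html lookup.
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html?\ HTTP/
RewriteRule ^(.*)index\.html?$ http://www.mysite.com/$1 [R=301,L]
```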
Related Questions
-
Should I change PDF content?
Hi everybody, my website is ranking well for several keywords and long-tail keywords. However, all these visits go directly to some PDF guides on our products and on the industry sectors the company is based around. I feel the PDFs are bad simply because they don't offer easy interaction with the rest of the website. I am considering making each PDF into a webpage but am not 100% sure of the pros and cons of doing so. I will still need the PDFs accessible for users to download, but I don't want my new webpages to get tagged as duplicate content. Is it possible to:
1 - change the PDFs so they send any link authority to the new webpage
2 - make Google aware that I want the webpage, not the PDF, to be the "ranking" page
What is the likelihood of destroying my rank for these keywords on the PDF by making these changes and then not being able to rank the webpage for the same keywords? It would be pointless if I just lost all the traffic lol.
On-Page Optimization | | ATP
-
New Client Wants to Keep Duplicate Content Targeting Different Cities
We've got a new client who has about 300 pages on their website that are the same except the cities that are being targeted. Thus far the website has not been affected by penguin or panda updates, and the client wants to keep the pages because they are bringing in a lot of traffic for those cities. We are concerned about duplicate content penalties; do you think we should get rid of these pages or keep them?
On-Page Optimization | | waqid
-
Help: my WordPress blog generates too many on-page links and duplicate content
I have had a WordPress blog since November last year (so I'm pretty new to WordPress) and the effects on ranking for some keywords are really good, so I thought tag clouds were good. Crawl Diagnostics now tells me that I have too many on-page links; for example, my author page breaks the record: 256
http://inlinear.com/blog/author/inlinear/ I think that's because a link is generated for each word in the tag cloud... On this page (and many other pages) WordPress displays teasers of the beginning of each post ("read more...") producing duplicate content and even new canonical tags... The page titles are also too long because I installed "All in One SEO Pack", and now this plugin and WordPress itself mix titles together... But what can I do to avoid all this? Is there a plugin that can help? I think millions of blogs will have the same problems... My blog has very little content yet. Thanks for your answers :)
On-Page Optimization | | inlinear
-
Duplicate Content - Delete it or NoIndex?
Last month I realized that one of my freelancers had been feeding my website with copied/spun content and, sadly, there's lots of it. And of course it got my website hit hard by the last Panda update. Now that I've identified the content, what's the best thing to do? Should I delete it permanently and get 404 errors, or should I set the pages' robots meta tag to "noindex"?
On-Page Optimization | | sbrault74
-
Static content vs. dynamic changing content: which is best?
We have collected a lot of reviews and we want to use them on our category pages. We are going to update the top 6 reviews per category every 4 days. There will be another page to see all of the reviews. Is there any advantage to keeping the reviews static for 1 or 2 weeks vs. having unique new ones pulled from the database every time the page is refreshed? We know there is an advantage if we keep them on the page forever for long tail; however, we have created a new page with all of the reviews they can go to.
On-Page Optimization | | DoRM
-
How do I avoid duplicate content and page title errors when using a single CMS for a website
I am currently hosting a client site on a CMS with both a Canadian and a USA version of the website. We have the .com as the primary domain, and the .ca is redirected from the registrar to the Canadian home page. The problem I am having is that my campaign produces errors for duplicate page content and duplicate page titles. Is there a way to set up the two versions on the CMS so that these errors are not produced? My concern is getting penalized by search engines. Appreciate any help. Mark Palmer
On-Page Optimization | | kpreneur
-
I have one page on my site... but still get duplicate name and content errors.
I have only the index.html page. My domain has a permanent 301 to the root. Why am I getting duplicate problems? I only have one page, the index.html.
On-Page Optimization | | one4u2see
-
Filtered Navigation, Duplicate content issue on an Ecommerce Website
I have navigation that allows for multiple levels of filtering. What is the best way to prevent the search engine from seeing this duplicate content? Is it a big deal nowadays? I've read many articles and I'm not entirely clear on the solution. For example, you have a page that lists 12 products out of 100: companyname.com/productcategory/page1.htm. And then you filter these products: companyname.com/productcategory/filters/page1.htm. The filtered page may or may not contain items from the original page, but does contain items that are in the unfiltered navigation pages. How do you help the search engine determine where it should crawl and index the page that contains these products? I can't use rel=canonical, because the exact set of products on the filtered page may not be on any other unfiltered page. What about robots.txt to block all the filtered pages? Will that also stop PageRank from flowing? What about the meta noindex tag on the filtered pages? I have also considered removing filters entirely, but I'm not sure if sacrificing usability is worth it in order to remove duplicate content. I've read a bunch of blogs and articles, and seen the whiteboard special on faceted navigation, but I'm still not clear on how to deal with this issue.
On-Page Optimization | | 13375auc3