Multiple domains with the same content?
-
I have multiple websites with the same content, such as
http://www.example.org and so on. My primary URL is http://www.infoniagara.com, and I also placed a 301 on the .org.
Is that enough to keep my example.org site from being indexed by Google and other search engines?
The example.org site also has lots of links to my old HTML pages (now removed). Should I change those links too, or will the 301 redirection solve all such issues (page not found/crawl errors) for my old web pages?
I would welcome good SEO practices for maintaining multiple domains.
thanks and regards
-
You want your redirect rules on the server, not the client side. In Apache you can do this with mod_rewrite and the .htaccess file, like so.
To add the www:
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1 [R=301,L]
To remove the www:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^(.*)$ http://%1/$1 [R=301,L]
In IIS there is a rewrite module too. I've not used it myself, but this should help: http://www.petermoss.com/post/How-to-redirect-non-www-domain-to-www-domain-requests-in-IIS-7.aspx
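For anyone on IIS, here is a rough, untested sketch of the equivalent non-www to www rule as a web.config fragment. It assumes the URL Rewrite module for IIS 7+ is installed, and the host name is only a placeholder:

```xml
<!-- Hypothetical example: replace example.com with your own domain.
     Requires the IIS URL Rewrite module. -->
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- Redirect non-www requests to the www host with a 301 -->
        <rule name="Redirect to www" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^example\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.example.com/{R:1}"
                  redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```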
-
Anyway, I am lucky to be with a group of average teammates.
-
I would recommend either asking a new Q&A question about how an IIS redirect works, or checking Google. I lack experience working in that environment. I was spoiled by working with a very talented team who had performed all those changes, so I never needed to learn any aspects of IIS.
-
Hi,
Thanks. As you said, I was checking server-side redirection. My site's server is IIS 7. I tried to make a server-side redirection but couldn't. I found a JavaScript redirection and created a redirect. See the page: http://www.infoniagara.com/d-bed-roses.html
I think this too is not a correct redirection, is it?
thanks
-
Glad to be of help. You are always free to reach out here at the SEOmoz Q&A. If you feel a need to reach me specifically my contact information is in my user profile.
-
Ohh, thanks so much Ryan. Let me learn server-side redirection and the other aspects related to it.
Thanks once again for your consideration and time. I hope I can approach you in the future too.
best regards
-
The JavaScript code you shared is not a proper redirect.
A proper redirect happens on the server. Instead of loading the original target page, the server instantly serves the new page along with a 301 header response code, which tells search engines the content has moved to a new URL.
If you use JavaScript in the manner you shared, the original page loads with a 200 "all OK" header code, and then three seconds later the JavaScript triggers and loads the new page, also with a 200 header code. All the backlink value will still be applied to the original page, not the redirect target.
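You can check which kind of redirect a page uses by requesting the old URL and reading the raw status code rather than the rendered page. A minimal sketch in Python, using a throwaway local server and made-up page names to stand in for the real site:

```python
import http.client
import http.server
import threading

# A tiny local demo (all paths here are made up for illustration):
# the server answers the old URL with a 301 plus a Location header,
# instead of serving content with a 200.
class DemoHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page.html":
            self.send_response(301)               # "moved permanently"
            self.send_header("Location", "/new-page.html")
            self.end_headers()
        else:
            self.send_response(200)               # normal page load
            self.end_headers()
            self.wfile.write(b"<html>new page</html>")

    def log_message(self, *args):                 # silence request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), DemoHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# http.client does not follow redirects, so we see the raw header code a
# search engine crawler would record for the old URL.
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("GET", "/old-page.html")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))    # -> 301 /new-page.html
server.shutdown()
```

Running the same kind of raw request against a JavaScript "redirect" page would show a 200 instead, which is exactly the problem described above.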
The exact method of performing a redirect varies based on your server setup. If you have a LAMP server with cPanel, there is a Redirect tool which you can use.
-
hi,
That's really helpful, and now I understand it.
I know that you are one of the true masters of SEO, and I think you can clarify one more doubt. It is also regarding redirection. I want to know whether a script I use for redirecting old pages to new pages is right or not.
The script works properly, but I want to know what type of redirection this is and whether it is a proper redirection for passing the backlink juice. (I add this script to the body, just after the header.)
thanks in advance
-
A 301 redirect is the proper solution and superior to the canonical. It is fine to have the canonical too, but add the redirect.
When you ask "should I do it for each page", understand that a single redirect rule can forward all non-www traffic on your site to its www equivalent. If you are unsure how to perform the redirect, simply ask your host. Most sites are on managed hosting, and it is a very common and easy request.
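For reference, a canonical hint is just a single tag in the page's head section. Using the site's homepage from this thread as the example URL:

```html
<link rel="canonical" href="http://www.infoniagara.com/" />
```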
-
hi dear Ryan,
thanks so much for your time and valuable suggestions. As you said, I am aware of this problem, and therefore I added a canonical URL to the homepage. Should I make a 301 from the non-www to the www URL? Should I do it for each page?
thanks & regards
-
In short, you should not use duplicate content across various domains. Doing so will likely negatively affect rankings for either site. Try a search that would naturally return results for a duplicated web page. You will likely find one page ranks well, while the other ranks significantly lower due to the duplication.
I checked your .org site, and its pages are properly 301 redirected to the .com site. This change will cause any valid pages listed for the .org site to disappear from Google's index. It may take a month from the date the 301 was implemented for Google to crawl and update the entire site.
One point I would add: I suggest you perform a Google site:http://www.infoniagara.org search. Notice you still have a lot of search results for the .org site. Those pages are properly redirected to the .com site, but they then return 404 errors. If the pages are really gone and there is no equivalent, that is fine, and these results should disappear from Google's index over time. If there are similar pages on your site, you should 301 redirect these old URLs to them.
Another issue: your .com site appears in both the www and non-www forms. If you take a URL and remove the "www", the page loads normally at the non-www URL. This needs to be fixed, as it is dividing your backlink juice. Pick one version of your URL, www or non-www, and 301 the other version to it.
-
Related Questions
-
Duplicate content management across a subdirectory-based multisite where subsites are projects of the main site and naturally adopt some ideas and goals from it
Hi, I have the following problem and would like to know the best solution for it: I have a site, codex21.gal, that is actually part of a subdirectory-based multisite (galike.net). It has a domain mapping setup, but it is hosted in a folder of the galike.net multisite (galike.net/codex21). My main site (galike.net) works as a frame brand for a series of projects aimed at promoting the cultural and natural heritage of a region in NW Spain through creative projects focused on the entertainment, tourism and educational areas. The projects themselves put the general views of the brand into practice; it acts more like a company brand. CodeX21 is one of those projects; it has its own logo, etc., and is actually like a child brand, yet more focused on a particular theme. I don't want to hide that it is part of the GALIKE brand (in fact, I am planning to add the Galike logo to it, and a link to the main site in the menu). I will be making other projects, each with their own brand, hosted in subsites (subfolders) of the galike.net multisite. Not all of them will have their own TLD mapped; some could simply be www.galike.net/projectname. The codex21.gal subsite might become galike.net/codex21 if that would be better for SEO. Now, the problem is that my subsite codex21.gal re-states some principles, concepts and goals that have been defined (in other words) on the main site. Thus, there are some ideas (such as my particular vision of the possibilities of sustainable exploitation of that heritage, and concepts I have developed myself such as "narrative tourism" and "the geographical map as a non-linear story") that need to be present here and there on the subsite, since they are also the philosophy of the project.
BUT it seems that Google can penalise overlapping content in subdirectory-based multisites, since they can look like a collection of doorways to the same product (*). I have considered substituting those overlapping ideas with links to the main site, though it seems unnatural from the user's point of view to be taken off the page to read a piece of information that is actually part of the project description (every other child project of Galike might have the same problem). I have also considered taking the codex21 subsite out of the network and hosting it as a single site on another server, but the duplicated-content problem might persist, and anyway, I should link it to my brand Galike somewhere, because that's kind of the "production house" behind it. So which would be the best (white hat) strategy, from an SEO point of view, to handle this brand-project philosophy overlap? (*) “All the same IP address — that’s really not a problem for us. It’s really common for sites to be on the same IP address. That’s kind of the way the internet works. A lot of CDNs (content delivery networks) use the same IP address as well for different sites, and that’s also perfectly fine. I think the bigger issue that he might be running into is that all these sites are very similar. So, from our point of view, our algorithms might look at that and say “this is kind of a collection of doorway sites” — in that essentially they’re being funnelled toward the same product. The content on the sites is probably very similar. Then, from our point of view, what might happen is we will say we’ll pick one of these pages and index that and show that in the search results. That might be one variation that we could look at. In practice that wouldn’t be so problematic because one of these sites would be showing up in the search results.
On the other hand, our algorithm might also be looking at this and saying this is clearly someone trying to overdo things with a collection of doorway sites and we’ll demote all of them. So what I recommend doing here is really trying to take a step back and focus on fewer sites and making those really strong, and really good and unique. So that they have unique content, unique products that they’re selling. So then you don’t have this collection of a lot of different sites that are essentially doing the same thing.” (John Mueller, Senior Webmaster Trend Analyst at Google. https://www.youtube.com/watch?time_continue=1&v=kQIyk-2-wRg&feature=emb_logo)
White Hat / Black Hat SEO | PabloCulebras
-
Competitor has same site with multiple languages
Hey Moz, I am working with a dating review website and we have noticed one of our competitors is basically making duplicates of their site with .com, .de, .co.uk, etc. My first thought is that this is basically a way to game the system, but I could be wrong. They are tapping into Google's geo results by including major cities in each state, i.e. "dating in texas", "dating in atlanta", however the content itself doesn't really change. I can't figure out exactly why they are ranking so much higher. For example, using some other SEO tools, they have a traffic estimate of $500,000 monthly, whereas we are sitting around $2,000. So either the traffic estimates are grossly misrepresenting traffic volume, OR they really are crushing it. TL;DR: Is geo-locating/translating sites a valid way to create backlinks? It seems a lot like a PBN.
White Hat / Black Hat SEO | HashtagHustler
-
Duplicated content
I have someone writing new pages for my site. How do I know the pages she is writing are not duplicated from another website? Is there any website or software to do this? What is the best way to check? Thank you
White Hat / Black Hat SEO | SinaKashani
-
Can I Point Multiple Exact Match Domains to a Primary Domain? (Avoiding Duplicate Content)
For example, let's say I have these 3 domains: product1.com, product2.com, product.com. The first two domains will have very similar text content, with different products. The product.com domain will have similar content, with all of the products in one place. Transactions would be handled through the primary domain (product.com). The purpose of this would be to capitalize on the exact-match domain opportunities. I found this seemingly old article: http://www.thesitewizard.com/domain/point-multiple-domains-one-website.shtml The article states that you can avoid duplicate content issues and have all links attributed to the primary domain. What do you guys think about this? Is it possible? Is there a better way of approaching this while still taking advantage of the EMD?
White Hat / Black Hat SEO | ClearVisionDesign
-
How does Google decide what content is "similar" or "duplicate"?
Hello all, I have a massive duplicate content issue at the moment with a load of old employer detail pages on my site. We have 18,000 pages that look like this: http://www.eteach.com/Employer.aspx?EmpNo=26626 http://www.eteach.com/Employer.aspx?EmpNo=36986 and Google is classing all of these pages as similar content which may result in a bunch of these pages being de-indexed. Now although they all look rubbish, some of them are ranking on search engines, and looking at the traffic on a couple of these, it's clear that people who find these pages are wanting to find out more information on the school (because everyone seems to click on the local information tab on the page). So I don't want to just get rid of all these pages, I want to add content to them. But my question is... If I were to make up say 5 templates of generic content with different fields being replaced with the schools name, location, headteachers name so that they vary with other pages, will this be enough for Google to realise that they are not similar pages and will no longer class them as duplicate pages? e.g. [School name] is a busy and dynamic school led by [headteachers name] who achieve excellence every year from ofsted. Located in [location], [school name] offers a wide range of experiences both in the classroom and through extra-curricular activities, we encourage all of our pupils to “Aim Higher". We value all our teachers and support staff and work hard to keep [school name]'s reputation to the highest standards. Something like that... Anyone know if Google would slap me if I did that across 18,000 pages (with 4 other templates to choose from)?
White Hat / Black Hat SEO | Eteach_Marketing
-
Will aggregating external content hurt my domain's SERP performance?
Hi, We operate a website that helps parents find babysitters. As a small add-on, we currently run a small blog on the topics of childcare and parenting. We are now thinking of introducing a new category to our blog called "best articles to read today". The idea is that we "re-blog" selected articles from other blogs that we believe are relevant for our audience. We have obtained permission from a number of bloggers to fully feature their articles on our blog. Our main aim in doing so is to become a destination site for parents. This obviously creates issues with regard to duplicated content. The question I have is: will including this duplicated content on our domain harm our domain's general SERP performance? And if so, how can this effect be avoided? It isn't important for us that these "featured" articles rank in SERPs, so we could potentially "noindex" them or make the "rel canonical" point to the original author. Any thoughts, anyone? Thx! Daan
White Hat / Black Hat SEO | daan.loening
-
Shadow Pages for Flash Content
Hello. I am curious to better understand what I've been told are "shadow pages" for Flash experiences. So for example, go here: http://instoresnow.walmart.com/Kraft.aspx#/home View the page as Googlebot and you'll see an HTML page. It is completely different than the Flash page. 1. Is this ok? 2. If I make my shadow page mirror the Flash page, can I put links in it that lead the user to the same places that the Flash experience does? 3. Can I put "Pinterest" pin-able images in my shadow page? 4. Can I create a shadow page for a video that has the transcript in it? Is this the same as closed captioning? Thanks so much in advance, -GoogleCrush
White Hat / Black Hat SEO | mozcrush
Creating multiple domains with key phrases and linking back and forth to them
There are several competitors of mine who have built multiple sites with keywords in their domain names, such as localaustinplumber.com, houstonplumbers.com, Dallasplumbers.com, localdallasplumbingservices.com... you get the picture. (These are just made-up examples to illustrate what they are doing.) They put unique content on each page and use aliased WHOIS records, using a different credit card to set up each domain to hide from Google that they are the same entity, and then link back and forth between the domains with appropriate keywords in the anchor text. They are outranking me on a lot of key search phrases due to the fact that they have the keywords in the domain name. They have no outside links other than the links from the domains that they own. Is this a good idea? Is it black hat? Are they going to get slapped if someone reports them as a link farm? It's frustrating for me, staying white hat and getting legitimate links, and then these competitors come in and outrank me after only a few months with this scheme. Is this a common practice to rank highly for certain key phrases? Thanks in advance for your opinions! Ron10
White Hat / Black Hat SEO | Ron10