Best practice for multiple domain links
-
A site I'm working on has about 12 language domains - .es, .it, .de, etc.
On each page of every domain, the header has links to every homepage. At the moment these are all set to nofollow as an initial step to stop potential link profile issues spreading between the domains.
Moving forward, I'm not totally sure how to handle these links. On one hand I can see that nofollow isn't necessary, but on the other, making them followed links just filters away and weakens link juice. What is the best way to handle this scenario?
-
Hi Michael,
First, I need to clarify something. If you have .es, .it, .de, those are not language domains, they are ccTLDs, targeted at different countries. .es is for Spain-based sites, not Spanish-language sites. If your translations are on ccTLDs, you are sending the wrong signals to Google and Bing. A ccTLD is always geo-targeted to that country, and for some of these domains the language you mean to target is actually much larger than that country. For instance, only half of the world's French speakers live in France.
If you are just meaning to translate your content, I'd suggest moving it to your main domain and putting the translations in subfolders, e.g. www.domain.com/es and www.domain.com/it (for Italian, not Italy).
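Purely as an illustration (placeholder domain names, not a drop-in server config), that move boils down to a one-to-one mapping so every old ccTLD URL can 301 to its subfolder equivalent on the main domain:

```python
# Rough sketch only: map each ccTLD host to a language subfolder on the main
# domain so old URLs can be 301-redirected one-to-one. Domains are placeholders.
CCTLD_TO_SUBFOLDER = {
    "www.example.es": "es",
    "www.example.it": "it",
    "www.example.de": "de",
}

MAIN_DOMAIN = "https://www.example.com"

def redirect_target(host, path):
    """Return the subfolder URL a request should 301 to, or None for unknown hosts."""
    lang = CCTLD_TO_SUBFOLDER.get(host)
    if lang is None:
        return None
    return f"{MAIN_DOMAIN}/{lang}{path}"

# e.g. redirect_target("www.example.it", "/contact") -> "https://www.example.com/it/contact"
```

However you actually implement it (server rules, CMS plugin, etc.), the point is that every old URL keeps a single, permanent destination.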
Once that is all settled, linking to the others in the footer is not hurting your link equity, but it isn't necessary either. For translated content, you should be using hreflang and the meta content-language tag (for Bing) to tell the search engines which pages are translations of each other. Then you simply need to give users a way to change the language, either at the top of the page (preferred) or in the footer.
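To make that concrete, here is a rough sketch (placeholder URLs, Python used only to generate the markup) of the hreflang and content-language tags each translated page would carry, with every version listing itself and all of its alternates:

```python
# Sketch only: build the <head> markup for one language version of a page.
# Every version lists itself and all alternates, plus an x-default fallback.
TRANSLATIONS = {
    "en": "https://www.example.com/en/widgets",
    "es": "https://www.example.com/es/widgets",
    "it": "https://www.example.com/it/widgets",
    "de": "https://www.example.com/de/widgets",
}

def hreflang_tags(page_lang):
    tags = [f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
            for lang, url in TRANSLATIONS.items()]
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{TRANSLATIONS["en"]}" />')
    # The content-language hint mentioned above, mainly read by Bing:
    tags.append(f'<meta http-equiv="content-language" content="{page_lang}" />')
    return "\n".join(tags)

print(hreflang_tags("es"))
```

The exact URLs don't matter; what matters is that the set of alternates is identical on every version.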
If you mean to geo-target (developing different sites to target different countries), then my answers change completely, let me know if that's what you are trying to do here rather than just translate. If you're not sure what to do, check out this tool I made to help you pick the best international strategy. http://www.katemorris.com/issg
-
"Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both
example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers."
My scenario has completely independent domains served from a central CMS/database, so all that differs is the language content:
- example.co.uk
- example.de
- example.it
- etc...
I'm just not so sure I should be interlinking every homepage on every page of each domain with a followed link, so I'm hesitant to revert the nofollows.
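Just to check my own understanding: if I do keep the ccTLDs, I guess each homepage would carry hreflang annotations with language-region codes tying all of the versions together, something like this sketch (placeholder domains):

```python
# Sketch only: with separate ccTLDs, hreflang uses language-region codes so each
# homepage declares itself and every regional alternate. Domains are placeholders.
ALTERNATES = {
    "en-GB": "https://www.example.co.uk/",
    "de-DE": "https://www.example.de/",
    "it-IT": "https://www.example.it/",
    "es-ES": "https://www.example.es/",
}

def homepage_hreflang():
    tags = [f'<link rel="alternate" hreflang="{code}" href="{url}" />'
            for code, url in ALTERNATES.items()]
    tags.append('<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />')
    return "\n".join(tags)

print(homepage_hreflang())
```

Would that make the follow vs nofollow question on the header links largely irrelevant?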
-
Hi Michael, this article from Google might be useful: https://support.google.com/webmasters/answer/182192?hl=en
-
Thanks David. No, there is no group homepage as such - each domain differs only in language, so each homepage is essentially the same and is important to rank in the relevant local Google.
The only 'central point' as such is a .com, which is just a page linking out to each domain.
-
If you have a main homepage that acts as the group homepage, you could rel=canonical the other domains' homepages to that one main homepage so Google knows which is the main page and which are not.
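For illustration only (placeholder domain), the tag on each secondary homepage would look like the sketch below - though, per the Google guidance quoted earlier in this thread, a cross-domain canonical really only fits when the pages are genuine duplicates aimed at the same users, not distinct translations:

```python
# Sketch only: each secondary homepage points a cross-domain canonical at the
# main group homepage. The main homepage URL here is a placeholder.
MAIN_HOMEPAGE = "https://www.example.com/"

def canonical_tag():
    return f'<link rel="canonical" href="{MAIN_HOMEPAGE}" />'

# This single tag would sit in the <head> of example.de/, example.it/, and so on.
print(canonical_tag())
```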