TLDs vs ccTLDs?
-
I was trying to get this question answered in another thread, but someone marked it as "answered" and no more responses came.
So the question is about best practices for TLDs versus ccTLDs. I have a .com TLD with DA 39 that redirects to the localized ccTLDs .co.id and .com.sg, which have DA 17. All link building has been done for the .com TLD. In terms of content, there is some overlap, as the same content shows up on both ccTLDs.
What are best practices here? It doesn't look like my ccTLDs are getting any juice from the .com. Should I just combine my ccTLDs into my TLD as subdomains? Will I see any benefits?
Thanks
V
-
Thanks, Jane, that's a much better answer/example!
-
Hi again,
Sorry it has taken a few days to get back to you. I replied in the other thread about ccTLDs versus using one site. Some additional info: in general, you will have an easier time using the subfolder structure recommended in the other question (again, as long as there are no factors that make it important to have country-specific domains). The Singapore and Indonesian sections of the website will naturally inherit authority because they sit on the strong .com; merely being linked to from the .com isn't enough to give them such a large boost.
Apple uses this strategy for internationalisation: http://www.apple.com/uk/ for the UK, http://www.apple.com/nz/ for New Zealand, http://www.apple.com/sg/ for Singapore and so forth.
On the other hand, BlackBerry uses subdomains: http://uk.blackberry.com/ and http://sg.blackberry.com/.
Amazon obviously uses ccTLDs.
All of these domains are hellishly strong in their own right; traditionally, the thinking has been that it's best to use one site, as Apple does, unless you are already a mammoth. However, you can make the other options work with good link development. In your case, I think one domain is something to seriously consider.
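If you do fold the ccTLDs into subfolders on the .com, hreflang annotations are the standard way to tell search engines which section targets which country. A minimal sketch, assuming hypothetical example.com paths /sg/ and /id/ (your real URLs and language-region codes will differ):

```html
<!-- Placed in the <head> of every page in the set; URLs here are illustrative -->
<link rel="alternate" hreflang="en-SG" href="https://www.example.com/sg/" />
<link rel="alternate" hreflang="id-ID" href="https://www.example.com/id/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

Each localised page should carry the full, reciprocal set of annotations (including a self-reference), and if you do migrate, 301-redirect each old ccTLD URL to its matching subfolder URL so existing links keep passing value.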
Related Questions
-
Google Cookies - Organic vs PPC visitors
I am not a developer; I am researching this for our team, so please be gentle. I am also not quite sure how to ask this question. We want to serve up custom pages for visitors from Google organic search. We aren't doing anything underhanded: the pages will have very small differences that will not affect our rankings and won't land us in Google jail. When a Google visitor hits one of our pages, what specific piece of data are we looking for to determine (a) that it's a Google visitor and (b) that he or she came from organic results? I need to tell our developers to look for something that triggers the custom page. It's the same data that Google Analytics uses to identify the appropriate visitor type. Please pardon my naivete.
Intermediate & Advanced SEO | AMHC
-
Cookieless subdomains Vs SEO
We have one .com that has all our unique content, and then 25 other ccTLD sites that are translated versions of the .com for each country we operate in. They are not linked together, but we have hreflang'd it all together. We now want to serve all static content of our global website (26 local country sites: .com, .co.uk, .se, etc.) from one cookieless subdomain. The benefit is a speed improvement. The question, from an SEO perspective, is whether all static content can come from static.domain.com, or whether we should set one up per ccTLD so it comes from static.domain.xx (where xx is localised to the domain in question).
Intermediate & Advanced SEO | aires-fb77
-
Unique domains vs. single domain for UGC sites?
Working on a client project: a UGC community that has a DTC model as well as a white-label model. Is it categorically better to have them all under the same domain? Trying to figure out which is better: XXX,XXX pages on one site, vs. a smaller XXX,XXX pages on one site plus XX,XXX pages on 10-20 other sites all pointing to the primary site. The thinking on the second was that those domains would likely achieve high DA as well as the primary, and would pass their value to the primary. Thoughts? Any other considerations we should be thinking about?
Intermediate & Advanced SEO | intentionally
-
Canonical tag + HREFLANG vs NOINDEX: Redundant?
Hi, We launched our new site back in Sept 2013, and to control indexation, traffic, etc., we only allowed the search engines to index single-dimension pages, such as just category, brand or collection, but never both, like category + brand, brand + collection or collection + category. We are now opening indexing to double-faceted pages like category + brand, and the new tag structure would be: For any other facet we're including a "noindex, follow" meta tag. 1. My question is: if we're including a "noindex, follow" tag on select pages, do we need to include a canonical or hreflang tag after all? Should we include it either way for when we want to remove the "noindex"? 2. Is the x-default redundant? Thanks for any input. Cheers, WMCA
Intermediate & Advanced SEO | WMCA
-
Microsites: Subdomain vs own domains
I am working on a travel site about a specific region, which includes information about lots of different topics, such as weddings, surfing, etc. I was wondering whether it's a good idea to register domains for each topic, since it would enable me to build backlinks. I would basically keep the design more or less the same and implement a nofollow navigation bar on each microsite, e.g. weddingsbarcelona.com and surfingbarcelona.com. Or should I rather go with one domain and subfolders: barcelona.com/weddings and barcelona.com/surfing? I guess the second option is how I would usually do it, but I just wanted to see what the pros/cons of both options are. Many thanks!
Intermediate & Advanced SEO | kinimod
-
Total Indexed 1.5M vs 83k submitted by sitemap. What?
We recently took a good look at one of our content site's sitemaps and tried to cut out a lot of the crap that had gotten in there, such as .php, .xml and .htm versions of each page. We also cut out images to put in a separate image sitemap. The sitemap generated 83,000+ URLs for Google to crawl (partially generated with the Yoast WordPress plugin). In Webmaster Tools, the index status section shows that this site has a total index of 1.5 million. With our sitemap coming back with 83k and Google indexing 1.5 million pages, is this a sign of a CMS gone rogue? Is it an indication that we could be pumping out error pages, empty templates, or junk pages that we're cramming into Google's bot? I would love to hear what you guys think. Is this normal? Is this something to be concerned about? Should our total index more closely match our sitemap page count?
Intermediate & Advanced SEO | seoninjaz
-
Broken sitemaps vs no sitemaps at all?
The site I am working on is enormous. We have 71 sitemap files, all linked to from a sitemap index file. The sitemaps are not up to par with "best practices" yet, and realistically it may be another month or so until we get them cleaned up. I'm wondering if, for the time being, we should just remove the sitemaps from Webmaster Tools altogether. They are currently "broken", and I know that sitemaps are not mandatory. Perhaps they're doing more harm than good at this point? According to Webmaster Tools, there are 8,398,082 "warnings" associated with the sitemap, many of which seem to be related to URLs being linked to that are blocked by robots.txt. I was thinking that I could remove them and then keep a close eye on the crawl errors/index status to see if anything changes. Is there any reason why I shouldn't remove these from Webmaster Tools until we get the sitemaps up to par with best practices?
Intermediate & Advanced SEO | edmundsseo
-
Controlling PageRank vs flat site architecture
Hey all. Here's the scenario. I have a pretty trusted site with a relatively high PR. The navigation menu has around 300 links, because it is a CSS menu that drills down into subcategories. Now, would restricting the number of links in this menu be beneficial? I am not worried about subcategory pages not being crawled or indexed, but I am concerned that subcategory pages will not receive as much PageRank if they are not linked to directly from the home page, thereby lowering their ranking potential. Even new pages receive a PR of 5 if linked to from the home page. But I'm also thinking that toning down the menu size would be beneficial by funneling more PageRank to category pages and increasing the likelihood of ranking for some core head/middle terms. I have seen sites that externalize the menu in JavaScript files and disallow it in robots.txt to prevent too much PageRank from linking out, but SEO isn't really one-solution-fits-all in my experience. I may try a test. Externalizing the menu may also increase relevance, because each page wouldn't carry a bunch of content that isn't relevant to its specific keywords. Anyone with experience in this arena? I would love to hear your input. Thanks
Intermediate & Advanced SEO | JeremyNelson58