TLDs vs ccTLDs?
-
I was trying to get this question answered in another thread, but someone marked it as "answered" and no more responses came.
So the question is about best practices for TLDs vs. ccTLDs. I have a .com TLD with DA 39 that redirects to the localized ccTLDs .co.id and .com.sg, which have DA 17. All link building has been done for the .com TLD. In terms of content, there is some overlap, as the same content shows up on both ccTLDs.
What is best practice here? It doesn't look like my ccTLDs are getting any juice from the .com. Should I just fold my ccTLDs into my TLD as subdomains? Will I see any benefits?
Thanks
V
-
Thanks, Jane, that's a much better answer/example!
-
Hi again,
Sorry it has taken a few days to get back to you. I replied in the other thread about ccTLDs versus using one site. Some additional info: in general, you will have an easier time using the subfolder structure recommended in the other question (again, as long as there are no factors that make it important to have country-specific domains). The Singapore and Indonesian sections of the website would naturally inherit authority because they sit on the strong .com; merely being linked to from the .com, as your ccTLDs are now, isn't enough to give them a boost of that size.
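As a purely illustrative sketch (placeholder domain, and assuming both regional sections are in English): with everything on one .com, hreflang annotations in each page's <head> are one standard way to tell search engines which subfolder targets which country, which also takes the sting out of the content overlap you mention:

<link rel="alternate" hreflang="en-sg" href="http://www.example.com/sg/" />
<link rel="alternate" hreflang="en-id" href="http://www.example.com/id/" />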
Apple uses this strategy for internationalisation: http://www.apple.com/uk/ for the UK, http://www.apple.com/nz/ for New Zealand, http://www.apple.com/sg/ for Singapore and so forth.
On the other hand, BlackBerry uses subdomains: http://uk.blackberry.com/ and http://sg.blackberry.com/.
Amazon obviously uses ccTLDs.
All of these domains are hellishly strong in their own right; traditionally, the thinking has been that it's best to use one site, as Apple does, unless you are already a mammoth. However, you can make the other options work with good link development. In your case, I think one domain is something to seriously consider.
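If you do consolidate, the important part is redirecting each ccTLD URL to its equivalent page in the new subfolder, page for page, rather than sending everything to the .com homepage. A minimal Apache sketch, assuming mod_rewrite and placeholder domains:

RewriteEngine On
# send every page on the Indonesian ccTLD to the matching /id/ path on the .com
RewriteCond %{HTTP_HOST} ^(www\.)?example\.co\.id$ [NC]
RewriteRule ^(.*)$ http://www.example.com/id/$1 [R=301,L]
# same idea for the Singapore ccTLD
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com\.sg$ [NC]
RewriteRule ^(.*)$ http://www.example.com/sg/$1 [R=301,L]

Page-for-page 301s preserve much more of the link equity you have built than a blanket homepage redirect would.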
Related Questions
-
Single Folder vs Root
I'm working on a multi-state attorney website and I'm going back and forth on URLs. I thought I'd see what the community thinks: lawsite.com/los-angeles/car-accident-lawyer vs. lawsite.com/los-angeles-car-accident-lawyer. I should note this site will have over a dozen city locations, each with different practice areas.
Intermediate & Advanced SEO | EdShull
-
Sub domain vs. sub folder
I know this has probably been asked and answered many times, but things change a lot, so I would like to know how things stand with current search engine algorithms. The scenario is as follows: we're building an ecommerce site and also want to incorporate a Q&A section for support, FAQs, and such. Should we subdomain this, like community.test.com, or rather go with test.com/community? I would really like to know why or why not, and maybe see some real-life examples. Thank you all
Intermediate & Advanced SEO | s-s
-
Total Indexed 1.5M vs 83k submitted by sitemap. What?
We recently took a good look at one of our content site's sitemaps and tried to cut out a lot of crap that had gotten in there, such as .php, .xml, and .htm versions of each page. We also moved images out into a separate image sitemap. The sitemap produced 83,000+ URLs for Google to crawl (generated partly with the Yoast WordPress plugin). But in Webmaster Tools, the index status section shows that this site has a total index of 1.5 million. With our sitemap coming back with 83k URLs and Google indexing 1.5 million pages, is this a sign of a CMS gone rogue? Is it an indication that we could be pumping out error pages, empty templates, or junk pages that we're cramming into Google's bot? I would love to hear what you guys think. Is this normal? Is it something to be concerned about? Should our total index more closely match our sitemap page count?
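For anyone reading along, a sitemap index is the usual way to keep page and image sitemaps separate, as described above. A sketch only; the domain and file names are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-images.xml</loc>
  </sitemap>
</sitemapindex>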
Intermediate & Advanced SEO | seoninjaz
-
Relative paths vs absolute paths for links - is there a difference?
Is it better to use relative links or absolute links? Is there a difference for the search engine algorithms? Thanks.
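To illustrate the two forms being compared (URLs are placeholders):

<!-- relative path: browsers and crawlers resolve it against the current host -->
<a href="/widgets/blue-widget/">some link</a>

<!-- absolute path: the host is spelled out in full -->
<a href="http://www.example.com/widgets/blue-widget/">some link</a>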
Intermediate & Advanced SEO | cdolek
-
Mobile SEO vs. normal SEO?
Hi everyone, I wanted to ask your opinion on mobile SEO. Do we already have two different indices, one for mobile and one for desktop? Apart from a few mobile listings, I don't see a difference yet. If so, do I need to do special mobile SEO for my site, or is it enough to have, e.g., a responsive web design that detects the device and shows a different page? Are there any other mobile SEO measures that should be considered? I know of mobile sitemaps and directories, but is there anything else? Best regards
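For reference, the two setups mentioned are usually distinguished like this (illustrative snippets only). A responsive design serves the same HTML to every device at one URL and adapts with CSS, declared via a viewport tag:

<meta name="viewport" content="width=device-width, initial-scale=1">

A site that detects the device and serves different HTML at the same URL (dynamic serving) should signal that in the HTTP response so caches and crawlers know the content varies:

Vary: User-Agent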
Intermediate & Advanced SEO | CrazySEO
-
How to redirect www vs. non-www in IIS
I have been wanting to set our site up to redirect non-www to www for the SEO benefits so often described here on SEOmoz. I see a lot on Apache, but not so much for IIS. Are there any developers here who can point me to a how-to tutorial for people with little IIS experience?
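Not a tutorial, but for illustration: with the URL Rewrite module installed, IIS 7+ can do this in web.config along these lines (example.com is a placeholder):

<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301 any request on the bare domain to the www host, keeping the path -->
        <rule name="Redirect non-www to www" stopProcessing="true">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^example\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.example.com/{R:1}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>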
Intermediate & Advanced SEO | KJ-Rodgers
-
A global brand with localised microsites - distinct TLDs or directories by territory?
Hello, Looking to create an export site for a global brand and considering the benefits of distinct domains/TLDs vs. directories by territory, i.e. brand.fr vs. brand.com/fr for our French content, and brand.ca/fr vs. brand.com/ca/fr for our French Canadian content. Apple segregate their content by directory, but we're not quite Apple, to be fair... The directory route would be technically cleaner, but I don't wish to discount the SEO benefit of unique TLDs. Any thoughts / considerations / similar experiences? Thanks, Jan
Intermediate & Advanced SEO | Urbanfox
-
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us: http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo and http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions. Specifically, I'm concerned that a) we're blocking the flow of link juice, and b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus
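For context, the kind of restriction described above typically looks something like this (parameter names are placeholders; the * wildcard is a Google-supported extension to the robots.txt standard):

User-agent: *
# keep crawlers out of parameter-based variants of result pages
Disallow: /*?sort=
Disallow: /*?page=

Worth noting that URLs blocked this way can still accumulate links whose value they cannot pass on, which is exactly concern a) above; rel="canonical" on the variant pages is the usual alternative, since it consolidates signals rather than blocking the crawl.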